[Binary archive contents omitted: tar archive `var/home/core/zuul-output/` (owner `core`) containing `logs/kubelet.log.gz`, a gzip-compressed kubelet log from a Zuul CI run. The compressed payload is not recoverable as text.]
tCkkc| +Zpl6fUz#v8B[@/ua%]%p|M7WO!;EWd1Di JCN $vix6” 78#:h9hI48 BǗҴ=fּŁv^^ f!ZG"l89Gzi[h{v0zn})ac5yqC8ZΥᗄM mmmކ*U9km3Mb+eVB<0׽BṫݽqFo54 | Jc|*8ƀ|C{N'i+ȁo90lA{o5-呉M&XgJ㯑n%2DZ۷n(BXlQ^fvzׇ3kBp6(҆Q0kųS[گAa֛3ǔ !>`p%7"f+esw|[dY|jQI=Yr|IlzLئG܎ 3P-GSeߴ` s1ٚJ'ЬJ,%r咈L п=Sd̏i'[Hu4FjfYR@rJU=j~0 ԑ5mGS,)hg=OYq7 a؎j8J*eKet[#9N;64CWYU@\u6$r\zeak%T t̓&ۂ˕b˺q (SuS =SȺjHSLcFNQu˺喬aIh;~|. v^2< :tM/4m#]7ϊNkB뼤R`Ԃ p$6]I7W6dvmqtum_yi-eY#ןXmBΖJ_x QL/iƣ;^*ZAΨO'U'0 jڋv?8Φ/71jJ&)j3'|}=ZO҅8"zorr/Xv_MC4<1H|e=2TwamŊP(Athrt1 zpx]hT =BVygq|Nx׶ m )avahRUkEfw16W&2;" ]D2KFIC pTUdke%<=`C}4`txLK03*cjmׁe*0ಮcD FB^}H]v6LyE? Ǧv"'%"1 7MSP٥Pʲd;n>NPFl6گVrw!WӭӭSj,[4Lc^[1Z}`)1n鮒e]"0p3uyCGF{TqkދuL>a1Ӳnk jk ڭ^ -~CYs" nA|")Enw*hȺmk2V)0GroWarYVFիI VrVkYZR:kMRϢ'me/JR!v#R7N*xdx\KR} D.H4-f TYѨ;@s|U@\ÔQՠ \VuYaۆi^ * XGi{] eL H$ \8)Epr݁j<\ &c CNJd5!sXu^.ʭ߁'fڬ;.=mb!h39R1bWa iYl˱Fs \Y|v1RH`:)U׃m5Y"%¢0kM[" zF14Q N\Tr{ٙ`+Y5g-n)WijILEY=b?lƶ*^w^)ӿxXɇ>LO\YU/'7[viڰa&hf1J%O'{\A2!OG r8[k) ޟ$Ė!%Dr?{o'wKʘ=g% p=(׬AӝW4l)#@Lȉg7v iҿZ};wpW0؃.I~ [7h.E͎:șoJqї_A*o:tyaލ*/_5"?L~!-͕H_+K>K;h ѻ\r!̍*>3<gs^iqԖQeO"R) D*mD|mpNpY*۟Ӈ_}ٻ?>{Zw]_D߳I7?㑫3@2mgɫهw?^||*b[Å*eշWW=IM7*oW*w1zǓ͠z<sX%'!OcUKA~5]VHGlmRZX;:׻^( AL6+34M@r"kz _;,Yv9a&a{ٟC3J t E0. >I1N" I|Ae.@E效@Y[A˃k$UKg;n\g-V݂|˔Hv,Z ABrBjDT`'-~)9q u[8MV5 guأDl57~u~M{dխ*99_: g_\+df3sc16[p$a&`J$fd%k?jyZaESvݪ13=]?VQdt=~S&[6ɘxAٝdiZݷsm%ŹU=@nQ5'5Dm!ER!:]x^dH<49B`"N0 UZZ T}'}=mfC^sgŹ[\lseu]_;[s/;Dtk}s`2\)"Xd^eap9"ܷk_qזܵY?&zʛݎb. 
b5nitס07;;zC9멫J'C~u̵Gyϑ&g\y-|(GDw8?yz!A~TLP'ՁLS4Jf9YH-3SOɷ  ikz@LYITB j ۖ rgm|M#f?ZVvcai?T}7d:K^&+eQqbR+oV N2XٗTB&pU/6??mOY<ܖAdp_Ht j!F(,W7ɝh{=h/f|.z_5Qo8mv<?N'ESc%xXXoG%~ :8F2A4(*L^F:D 9,JűAp-[LZN| 49HWs7d@`-6w`Kާa// qccͺ:g⎙bnxZ^ޘWNJo"V[j۰go&}=-mȸU]U'Iʭ;Io(V[ƐDRyW7wf!7LVIEdͽns޴^Qb{f$U!U Tyypk1oc ˤ[ e@5.nvڅJ u" tҡ]W5s{My]u E;ᘷ%I G_G>c Kr;Fѯ>=Ӊ9ݘR"Z^1U ¾1J 3-InD;R4z.?Excwru\Sw !;߬@#H24 <3E<\lNhGo{sΛ#DpƓ3*h93 L hfBpXe'hN&1!F6 cQ3%V!祴QM)R K2I#5'9grH{1âTf[@FU=o"+p4/T^XuH~vpX2EPhvv =KӾFnoiWJH'Æ侗KѬSz_'#uۛ{r%,+vrI̜솨cԂ$Q֜)˔w4El(X[0 "kDdT-&F(zg R,'[#gˀ뀶U)):)uIJ `_p{c$D@yGeAR)K)R!4cR y_S I#3j 4p'>H"0F'VU"%hskQNbOiEn#EҥXhy*|ܲb݃urCcAcmVl)p@T9bf5\WJNlVy;/cWKJm{G &0968@:zaZC5LӦ-yHJӕkh$ Bd#T33#}Lqz٠LSYs3YQt`N}/{2vff%\ 2KTZ}WDztxoeł*UԟWۙ ρIg^|*oo>>{&_^}x?wd9i"AIxJACKDƫm3j b\|q[Kdw `m oUJC*8hmTAqM)T[Ku"Jκ 1f %WtlQcg_gMqP3b@6AxQ3FȌϿ )Ǯ4aӂ+.LHCZJ(YށJc1:ʰ3s˄8s?FKӽ|A`nn]S/ZImosm ~lG,Ѳ7ƒc:sjOO/?f`D`}ڷqX;NWaAZl7U7WQcRՇ"4Fb¸ʚh*ge.u@N:]K֢1(& <5M, ,u,a"3 A`#v X딸Gm פc=i;ǁ\pe@``YK cU)HŴ'VTc|zU!lO@RdN[ԓSTFq"p(&Jc^9JyP3DQ@^Hg'o.9! :(y`f$OxװgRpA /hN^.CKV[;dj^kwL^v'9f8;컾_֗pgD#HRCWZjSyKS&#q1nod:*ʵY?'T:|Y)mGmUV4Pm@BA0F]0KsN|otDN|yhT׺GЌ9M?OgJVי|Q|'_fViWܰC:ֻ[6n$v[ijz̀?.& k5?۫MwT,#nͺ%v~QCݕ' mWJnx>y ]Yy;*&\7y|J\Y}/W0xWx2C=8*{ /W<_j^ϊa +"TjdIXx;ֳpVIr|s|W36ơGo?`%VNo'K|Z[0gkS'`Y-XA[֘9KO7 wxܸj >Hv_% >&xvb4/n@vmxfžgvo̕C ~h>Mm ͟X/^As ߙNFF7 A69 tn}@dԛnҞ ;S("4?Ϗi%u]%>]e9LRVq_1Ie iƔ.^jm=bOj2ȶX*mL߸OŸb܂7_bv FT~ZmUM<<$ÏvPX+ʂTH-k'AD,Y aoVH G- R DpqÐh+*DeDqxQO'Ս#ŵ٧ĉý*81 N\'qQ XlZ`,F+k-͕K+SR?i,*XJvbѨJ++'$b[.J@aA(pbn&OE6O9_o4P(BÒXHd29^PBL&Ri^Yʅ4)jz}a#}]wN%}$7<H:K&&E% kHf9,JűAN|[LH)P,'.I|V[^^t Jf# mtW% I7!m2k4_O+yq*q]T`-VY'yQBN\@*˔@w2;9zr&Sr}#\^(dmhŴь|(W!r2߬.gYx0bW< uL kfZ:؏#Nx'1=,Ϲ>|qF e)+ӊhN8`Y^ù^JlS0,Lun/\wtwY3|eng2N>˳/5Ìr5sR1. 
AeTL8E)o F ,"ުcN8o<.J峓7 !ym&^OQʍR؊`*YKa}5t<[COf*@2;ѢGpҪ%m2a'zLꤶ7 &ֈE+ Ffq^Ke$)X"m`iXT$ "$cYi PC["5*d&>֜ݐ/嬼yKc Ȫ;0i``}o+5|umF[0!ؒ-=w'=뫑C{E dNz/ Ⱥ%=ߑ'NNTmJS=; !fw“) 0\D(gUy }zh w!ﭬ¸22 $DtY PXj$j$*.j :S$ϯTƄJԲ@cD@H *[sz*Z9UWrbϪ^ւ-F)ǩ.RJIb`KɩFyF2 WCDoSc/KM0.0Ha4H!bX$@ 7`} .Zlv]= sT%\K׮#r cuά#p9~iNs檛Tu4{ٽ7\MX-g^uFא7 /SKLp{L”⋃3O/.5wťRR+.|qo/Nk.]޸p \tEU+0Me]T$-C*~[j˛@Dq0Z;ZBTi4"+KU`"K NI,68gn7k%o@^l89e[ĺ]tٽ F9mo PY_}IGXGl80l舥` ΄I? ^9NB MщYt9B&m{ do|۲tYoHzQWv%wI]}t%UoI]}Ժ%wI]}t%wG"#,K77 ISy^j#z7R>0e4mdݼ؃ "*AcTGSYQZ3CU;KbryPOOI睩Nrs>tӋ^/?hVSw.g;,J*TRqVs *jeƃfcyAia: UuT;hxWLWY qb0ծ7mLZK9EՇbe`$_uMǚPhz5y 67=oW暂bj!{;U6.w~=)ZvZL:G4ZIZTQ֥80)hu2#&Vfi8٫g}1, L>3?+q5 确`M>f}?/7 je{l7(6# 1){PU +Ѱg,|/ӲP0*ۖEEekc@*z`ԈMW񜀱_< 0^VnԾzпca8Pud7j[n}ȴ%jI1m["@#uX/R|#5Gbc %ZjmYsi>^0"h⠔ Fro*7/Tlb݃PKFZxu n}ͼU| K"=U Sf TTkYi0CL>sV+fۯ =OVZ F FJVN=D Z y!6/) y!6/慞PbBl^ @!6/ؼbBl^͋)ؼb9*ؼbBl^ yQiؼbBl^ y!6/T0`%)0\ +fa .L)0X Spa .L/n_KDJ[i-etUȰ1WUY\ xNZwIi_ιNUa_:A[bHu%+$'5x<3 +u9oxȁFt3ЊqͰڝsMZf~Ԗ8HsY12f) _V,Gg+:.c*5=guڄ e^ S֢OtznhLh`s|?J-D 9r鉉jMڳyZO ֿIw7]]S/ÙG1ȆTT3uyGݏțz_ Ԋ{6Z{mފxg@U*Ho.߯ ]9搜!}h AkG֑dbiT,k#;$zY7p =uDvBO!$ͱ Kf"}T$0:,U+.F@曆,mI+Ă,ZaVXX4HY'#gWX^dj`l_T?<<{N~&VJgdkG]骮!je,d Duu7=X]6|:0z5!TQ'UŹ&!X\P`Xl i=x.Y3^Xm_Ǻ* wo=88XԎ[Y|MJ;2!"&|Jz]wSd\(D0yT3T!cM5Rngxid{(?\i wXXͺO?Jh“ X*5H+S^5b9>&:(:E*GWd$1$T eߑ%mаR)CT**v2V]nLTs͌8;j):L2=4nΤc#EJV 5JcQ=ZMQ]YN`ݗ3))V@d=tliMvLM{vMӤ|]Qw:iR/u T 7>{'0[vW#T=^ɷT&NO~#EomNrp*z8Eoұ22MFSqR~ J'NUAt -V_-3;{7[$UOoKM*2+,%r~l5*fS.+g˻ްG t9xڒqq>q :~n[} " =¸6;\ Rޣ"xfi)0hS$̘cZ)MTR񜒊&~fࡩr,l4eHӧf+vm*K|6Z\gٲ5w_wzC̭\^=˴_o>w8kbɡo%&O0IfǖMz3i5Ҳ:kB٧T)uA[) FI#.O PAw_ZVےqX<ꡫOV#iʰF|f3&YI/OG/W{'4sRI1_O1D֓ѫWVI)~Rm M:H;="={p;͕M\jo|'{n$;y6fcX^/zrdlOom}=h֥,yr,n;姱,Y ?y {~`Fw"|:-hss3SڔCn<ͨa(%cF^Vk1qb/Ï_uǛ/\i vuDt2碯͓zAʻBQxiѹl½l[R>ܖL8gyJ!:ijۄ~35mlԧ^Sf EM+^l- =L j=#=% J⪠|cx"< *@>cFG)fx\+«ՙ"_wJ>(ٙ;ͻ+-.㹬`g}drl"kʱ&H(R!ޏ^NDXTj;`4+bdFmTK2j&U(w{S~yl̯./ҏç;1#Yp<ʋoܿni_O|. 
]\7$׊uWYBh<^$^,='66kWo p_RJ!_{XOA"߯!nSn?y:>8iNIP΁e&u2:T%ih\.2EmkjY\-ZB1BW;邡z)45T^*[] )y[@X]zS8Ps ;nYv4Ye%b"@6tK!תMILRsuE5X'XOv^p_؂}Q8j˳v_5ARl9ÆyA3rZ|uY XΖfYNvfBcGAc=l+Db.^LސԖ$SD*j-SچRr&'zBuQR;?Z;ƾy{l-;~ۍ׽bv~d:PsBi-e;ru0y .D Mof9^Y8?ήfY=s/8o9-#VUYm}y?-\js2=XVg5翌|dP&euӌe>}.ظ8U[=/̗3Kv+Ցz5AP)\p佉"5.LS/_$rJ콵SY"ZLV8-Qgiy,[76 ZkjbvgzȂ"X[CJۘD ljSh ^"CJf,7G<&ϛ;ѐ`@ E<]QGzyT`| wyc {3S (#ZA".L:Db~[,kkdaU WC!F6ȕЃyof%TEΖ.l5e" A|<ϕ]k&4쨭 0c(`J{3O1g0RT\I,2"c Y! 0?Bͼr  _#V!d`~+̼ NԢ5g^@*,0c(`PyRH!{+cRKMFJb~+APEa"RbpVCRS8 =7z358op62Z!+Ո:7`P<)84jXhBY+10C4G"UxݪXK(v-cr1z0w>;1Lj=)ɶfˎFCM$A -z40%9fB5#(t`^4̗XgJdEb@4*#(`^|7,G۬5 HEㇰ =G$/i`Ge<108qM!|TeW-ڊ@N EƪP'O.;j]ө{\>^]\U )1#TUS FdHFN-w·JIܣ_ǭy[ 7%M lL&o5KCd9fSW;R9dINgR+W!iπ u ZLQb{~Ǔ v.LQ{{7eϣ<}{@ÁwRtIjy`y;ڳꡒkW-3*7RxUd 7p;Bc#{'j,յl,")%r+ ЃyEy-RD!@ڀ`~ [ez7XM M-Bs'L^8D?% ̏ЁyRBoѷs61*L.*!VJRqC̏Ѓye;.X #֪Jde(gyc(`A䳹z@ c v(l4ϟ8Ɣ`~̓ojJd$i,&i|lKʖaAdEj U05y Pc3?B.C` m] UCv!cub?B N`Ru!RBٕ@> =gݙ5|Vȷ+JJN!)`-*йUgcZuC@ =G֞MD# Pgl+vTwPW>(OJh\EàHF,In.85c(t`̣aQQ3>TDK,t+?BۑIwkUxasEp`HsBQQ1?LZ8/oȜ+_/BkZ4 =p:=̯-ob.&tBպeqTޚu8X4\M;!<;|?^POE.[ OǠg4 3ΑJqa w(1pGnyUUjdT ʏЃy@$zEvl{Nl,ف[V"&`{ʶUһBz7-`gt3[\nd巳~(]-eWmn~{(+~a~.]p\ aP@J.fs,iE Lf:h IWWpʕ~8]$ƈɔ{n^#K( 'R7eyՌ(Fe|kI&ZWF06e0oyq;a7RTdPCJc&@B1U_cP.b -Rk$M`r?k1׉ʺڔV(*=:0~5at%KJ*SV,TVWT>oɀ*&]oGW`R߇`s,-/~BO-Ԓ,g!c( vW_U]uUEk5n3h h F!\zЀ[tKzBE&hi06F_=ɮnVm ʁ̶t,'υX}.tԣ:VCG_B1yRWFW^>qCq8:_Ls?%'҅:/!Ey&ΫޏaR ?W{C 6N}ҟY{N&ӴJi>Ta#D$$Ť5z iFjiMv0:՘ Xh~}2!V?y[=xy>9~n (m 詽 b[fqx dys_saWقİ^Yk:]U ]]Moii5aS/&4 ,4wڛkX8:kZWՈ:yU88A5*Cᨊ"ݙ8$H\Q#UV<溢*C/|~r_ٝ؎tYJ m*M/+ERu޸p"J*fApcX$ӻ7^}~?wGW?:z%LB:9X%;Nh7W?ZajoV5ӼM:gޢ^"r6yEV/`#yݩTv[*[U* :7Z Ыk93o"uts^ caIU&=;6/Me9Gepg#5֥V&Gv|pxHk{Is;#2hKҚxJr $;46zVF^/(f 3;"Ǻ: Leli7F/c|Z~5'=|w1G6۞N-CD|JZ\u<"yڇX P1cw."0&)k#utM 9~驔XDJ׈SoSV%H)!G M h<okSBKrs(2 QLpF-lAҦ:{(EyD.E0Qai+ZM 9o ɍ̣\2Y YhILM 7^`EXiu!p5FS03B)!C *w/SHE\ Y14l64E 91كjg$TQ ^e.VIYu%lrB)ZbV/uojCRI~4/ZDXei& lޠTV;-GbF :d%{'q;Ikc{v'cs!pEP;?]^d 'ˆH{X}z#+K3vSTptZ-4V88e:ucԚH b*|Ƅ%ugyAVON>euS(:OU2LB3C.,BǗ,f A+Z/fJ!k (^/* ?W>$3]֑E 
M08HnR8u7oŇKUlE$A@BF_})R!x6gESթ Qvc̄[;~g̲aޥg\+xNBQlpIjZ?Uۨ~{}НBlH+fU M4]׽7Jzys%7\?FfO[f΃ni Pꆂ u]`lzMоԜ5jˋwSe4 4x+yt'v{"-;P]R~ e?w2Ô2.ҥHQI%E4p-sqOd+ 㰈"zz ]iYBot7$$QrLQ WǗ{Tڪ F7A.GN:S;Șĥ+I%#cR#tdTDR Ρ:KͩS uشq29r^%*Tj,a2rϦl+`ӥ84ᘳw%fΕ ,}T~+b:H1ŭJbep ˱RbNt<=_nǗi6]htڵM;>YVz+MZI?O>gϋ0IϠОC/BF\tx1hKjO3ܢ$T8G-^`ArR\RCZq7.xk x"%%)-6Q!J,( 9 _VU  e\Q6 ?J)G@Ɲ^DOQ.rQ1{jjL(2&ꌹ6dW̵ -Ï>Joo}:Z+Jvǻ$I+}$هzk9cpuaV j;iyuB]-۫=ZB쌺JRu򱫫$e#^]=uE vH]QB"QW .Ţ+*I=JRՓTW zStI]J.u+ ?vudfN+X$T]QWIZ H)epE] ʕ!RW 69q LҊGjOn6vI]%ꌺq팺J5u]]%)ݫ'$pړ^wRW !걫$eQW1!@`IC3*IbwlSSn AWI\3q;) _-`6(jq !tj+)9}d!^խ>%w+#Bt8HRN B |2T j0u]'YA_Oϧ]} :$])*шklD!Ͱ`^MA+nz9~v?w@ȏaɧ~%Ǐm*JE6+p<*Պ4J BQ \B~@ l:ġ%2KPiEΒsA-mڠ-yOĨ4:Al#AVuؾTSsW ^V!A6 ^NYkJ{R}Y*2jbQ--#m4sVԱ\Eaq.ş1xm꼥\QcY)4`Dūyϊ̵2 Jz>;ܠkMQHFly󥚪7Z^kY~i[׿niD 7tyÉeTÙ%N"AfΔR0/a>/7D>|hA=Ԛ(|'X~}Tu׽_)BTi ڔUѹ<l:(~&3q&i$/hجKt%8])?﯊1@R'@)}~|A=e=]?jlygunfh+?m3>6ETu9zfM}ռjqu7W̶#Q玚!RHMͮnMb#Ci.8׮7o{Ziی=S$MO"iBT2ߊ"5σ=Fd;@7 6E*doRbĪmQ{t9J\0{v6|E12lR9F8NN m:&y %zc[K^^g@(Xq^dJagЁQ#hRNQc*}e3 XDJR!+Gʠ%akw+P+#d3 ˸SaG9H{QP}D nq=t5zMK },몹IF};Tkկ-cNO~#SJMbwqsqΡ{ŖAq+XiTﰻ~,maO.a6=>G^N( yO45l~Ի.)65Z+Ji[K~bcUׅwZL M,ޝ/6_ʹp"Bwm$W+f6Cv3ɇE dK<uBɞDejR-e6fF#RMuN=5G;XwF>A7ohrC~ sA"!*CIs͔'VC/CISzeTAPg{_u}+\A:Q3O:F\yo{w"E2}֙}v.seu0K8²,A]eNc Jg|fp17~}Z;]0F o_& vv5^P/zu_n+Ի" W꺧N|9w+#O<젙;GXV/eɷO]Vۘ\ .V;hj9׶ogDrѭP8*Jc,VmV rofDj+5g;Lj+м!\A6z,\ALZk>WPٟ9xpe 34+ɮHoZ܋->WʩEq8~p\g\mVj;66[JOrdC`t3"ڵ+R<W脫ZX XJ Hfz踂J݄#ĕ \A\ZiWpEpEa>[ 7+}+R/R~J#ĕv) v++ɮH>bT:3LpEoWE3wI:vRi섫#ĕuθA5+3Hj>t\Jk&\!NpEiWۙwEjTNW+nYC`PvErM3Hg+ZǕhzsʼu~J~gtkpNV*4[L1eyC`M3"´+RTj6q%p6+o.s"V+R>L3 WG+saVWz!nرW; ; N;!N+ݔU3+oBO;4IVo6Y=f22ڰc $XSb\ Rk~DTZ9GXbkfjfI HQzw踂ҌWnd|U$7HߞdF~uo韖tsïN~r7gW/9W/.5!ݰݝ>!Bj(/$}JhEq mgjL^Hjv*purB! 
6Z H?Q7ӴĕZr _)pEr]3v"Ljpeޞ2a{Vrɕ{6瞓 W۩4Wvծ[;5ԺvJHey*++|+"1J7q%~QqVfpEr=u;:HT %މpcj H9x\J'&\!EK%ն\ZWPpu2giW̿˱pEr%kWV"ev]^c Hy#\i>i[4=br;E 6,wι?ހk+&;#Tt/g+"|;:s~xtqDS +uW|H\\(?*%H\]{-sACV8Ǝ3r't C]ISW_|g@ +$ZBvx<[t=?Ãvz8p u M {v\R͡O Jޛ6M 8Bxmn//b~Qxs+< HsN9O߀ͼk*~v_|L93vsH7>ofwxc*cdWfWL xvϗR=1>Dl}"|Zejo۫ϠYY^.]PM}SfȺfWu_қM{~rRxp_~{r7V|f؝X\'Fv%J;cUVTё;ҫXYW{][|絳7)7+pϖW3 >ͫcMT]}C(4` qޡY6ta7whs|c #.HQ{M?6a) (7e˻WzN9d~&m^s1bB(5P+<*;KENV ZGf\MjQ\3Zv9x(M7_k^-gr/.F%Rc+3fדMHQ8.r%Y딥Ȫ\;?&S%;K{D=MUEeCX͖39}[b%l'D3k8]2k =O#>*ϵ0We)]X1ʳ`jQHzSlt'%R aYC": :8j}*,B1"sdϸ`$lEOO)p' ՙ{9 fdY)9 $SR[,nY.Y5ޠGx8V E!>؝ xAz<TҹjJ7ŠM{*0pLȠ CκDno` ! :kkaS>w'=36pQyk #-M(RU6,4Tz86Jȕ QH)48SQdCVp,_(F iuf=1yH+ImEtkm02VR2)q9.t9e4yȴ:)Gw XR:_HB|"h*(߀KE7LDj,)ZdPGxPd Mi&$xZ$aQ[6d!DrAD}@7չ8bc$,%A[`qzCNcoID(4!6T Fm^Al1s2ɍCZ3:kC.:jsA\-b%STht&CJ~*F"2'dy^hl!ِ|L),Q2XS RVM@RT K.n`!ʍ;<6 G)/㣺rHdhlQJGܣ#1E *>lKK5M} WSsJF=r 4UH0 b7D(PNJ9q]( lB J0Ѹd~Y,9) =G퐕_zŅih t5LCF[IHS$%"Ep: At HYoψ%Q@ɡ%BFG#)k5RSPztEL,֡4 %naF .3]%xA肣8; w=n$G<Σ*ڃxe O5GlђƘbS͢!dVTf/itL6@9`Z;JeVAG$&p E 8sxR[9T`ͻճXkjpe>òi, 3 2"خ8`Vw{|& \ƋCK !p d$#-ܧMfl-X*T9daM9as+DYpVb .3v򖮺SnageaqSZx0 2 4?Rcpu l9u+j7)77pYS!92 (GTc5|1 g٧כ=ɷeozoGf7WW0ɯ㈙jyUW8ϾlZf}8 ݱoJ+0!p| _9uAhӰ,Ϸ)T6ԁU:!!uCHBR:!!uCHBR:!!uCHBR:!!uCHBR:!!uCHBR:!!uCHE,Rq {FټPۈGJMo~HJя:!!uCHBR:!!uCHBR:!!uCHBR:!!uCHBR:!!uCHBR:!!uCHEHfx<# u =N:JBL CHBR:!!uCHBR:!!uCHBR:!!uCHBR:!!uCHBR:!!uCHBR:Թ\(9!uxT<UͳAZc/Sr:/S 2BR:!!uCHBR:!!uCHBR:!!uCHBR:!!uCHBR:!!uCHBR:!!u.y;w ZqvW _襏p5t ,ZlEB]<TJ`p 8?(.b{S7~qQ@C0>·5jiPtCd X( cf8l64 E bwxO+tW`QPTTGߥ&" j\쪀 kvv]>R"7;˵71,"x1U&q]Jӟ5NORFJ'WFSeH[+>ߚ8VA{ c7lDucwUMJҡ7: Wʴ'm' myPn69%sζFMPVg/8뢚71pe`*l6&᱊WJGe@seXUxGa!ulӼwtUVمxzχ =]EAqh59z6jeydw (fnq;;W,C)9-.ry,o_w/(EZ5nz5W55ߖoG`pf>/T>n5c_}kwM{v ftpܬWv6J\Ɣ )zix *?|a/GI ɧ&++#g«`"Z\]{`k^?ٞ' wT|יXwjT~k]t5{ӘBt7yEg~$-OgpVGr `23oU<./ۡJn; jES7E9Nki'^5LQ;UuNsI; SLҿpOaߏu>7t=] pٟwnlaqKdi2WĖ6WQ.Nڸ9*vi6[&ZqxyL=I;!gSߋΫ7֫V|8__ j'P]yz>vA\Ͳ|noտ?~5֞>]U&Wm:*›AVrʰJ|V!OPHg\H$}g-7r 鏢һI/u6Fh۸Y$,,6\`7Ke(uY>]}h-S&L¡X ^O $0 FՎqp0'Z&+$TSPuS\1S'/]ݭ~I 
xGOS3}uݚ&:}I~h%{x)T`̤:xJ_U4-jVɠ{#t\̹~9=[*$uV&L~-~)-n|7]M.YT_# Y7s˫n544.\ެrwΊxSUoHվ7ԽԿ܄U~_PQ_.4yg3ZW8^_8ֿuZȺ֖UsfoОaGҋ=?p)k FY?fi5>YY)YV2şǟ;hH]aV(bԈ{>GN0^U|MQEۙ^A'ӯ?}oH??~ww7{sw0N (Y:$ل_v2@ _-i髩bۼZ5YIo^̛,r{\y/p#1Gdw Q0?n[A~lQh<`SpbnՉc3&AL9)Vu~3g"!'m )+f/~o`؊.ӂ+.LHCZJ,@ҘuTvFxLx#o5旅I^zLDW854edHk-ӒZO)$`ZRhc.C#^*EcZOoso3G/ jʷyZgSs;J3to'z^6̀Qy,v9V`3%sCٟ ו.A)x[oVpoL dᄓWărWcw}1fC6Ue#HƟ{3|(xi-?9<:}\?!2Ն؋=ꭶg uucղUPjrjN5Wjo.~7L.VKy/1J)Avt%yhxL[M qb0-ۅw[C;kFt 0Zm4"o" `a  d(1ue;w8zXW=kҒ]1ڵc3\i\3 &_D۲[֚y1HX,xjl׿:7Ggn?ͺm-},l݌puWS3u$N!s-ɽ) WT#2{~AN >Ibe9ˆWRcȊGL x4o8j˨ǀTq 42Bp0Cs-a\~ %j<{܂)BR~ټZ0pfs]E~MWxdHߐd>.LVF5mwסpIo 6GT)^,(TeSL>{pC4BUu,2LGjB|Zv]~<'gmBaozw۴y6+ݐ7ᦌuRٌ4kOn:۩^8#ݠمPt3W'?E j!S͟{]nr0ۀwPM1=HrKɑN3 u8t|n7I<ϼͳk&>V }\0̴ qo8ƎĻ ~߇up<;+<`~xn_ka7+²fA.,Xe-Y:`Ԋquɜ}?wAMNJ^vjBcBA`B& Y'50&I [}r/+yV-4ܛp"Ѽ\gRyA2ɝ5h< Kl<do޿yq!AnRί붨͡Ua94[-3v?ӱaFlU 'KsBPNQF[+-EQǸT fmr}źgh# ?R1ZX)P|4x||^nyRn<ŹbD p>C91s6sˍ]!+,u0_B)/l ! $&aŐ'Đ a؅/|C"Ɛ 8YjqG A ` )0M~74t3njL7QP"ڙvjvzћyJ'48 P@iCs":8'4G[m{a4myyܭֳj{AZTzMV!xs[lٕ.g*fwв<5z|[{RB`0)fE)s&tKXkЮ5-57 !O;dV=Ğl k}e!n>!I@Bz%5ZZṖR ciL_ ^wh JE ́Ĭ1X cna"qt$ydҽWiKmO9>TH1te |N!aAPDr`\&.*m16q~-/m4‰ `j ӔiLEVX$4QC5y%ٮ%?\ )[_#]Bn?Rb&L^h  \%kxpWJRU2X裁d>R!A;\^ \i)zspkrw+5mWzOOB'LS]=܃ VՃe? 
\=Jş\n걟tB7p >0ߎi$K雏tdɵ*twX+Bc?)"̥dd:VZ] aI.˶dhUpz<|xQj>p 6|&uMJ%:b"(k eA ,&M*){% E3g$70J*5quJi2f$\p u\%J:A\1 uFצpR Zy*wuw+ok W lVtWIF=NWB !r@dLre6CVqTn,qu:ZS2f2`W=|Z7I&=NWHF3Ԟlړ*T[EU=Egq"[ 䲖ji;`Mh\:1RXp &*myhM wWIe z\$ c%炫*d bJ4W =vUS.ƻJj:ʍ=NWZirWp\`nN>W;;:3N~rϠV(: &?Kcv'ۭS/7 7t w|-|@ި5x*^{-Y8/cWz𫰘AA@;.?;z+5?0dnR2(db9\SۖwUO+ЪO3 WI*-hۻ\~!{\ ]%2\%Z+P+ :J{\ U&ZKp\ԓO0Բ* 1LHN(\%䂫ɮ T2z\ 8!BC)̮.4rBnьl}VK0(&BؤVw%N kπ[˳`ZIDĕޔ`t6Jr`ڶ&SiĕXv`c=<\%2\%uWR!q92+,WI.e*J*E \=!h?%ד+[jSٱr)+Ъ`Uv0XOU(*%J*qu#%2UZBJrWI-]UR)zqDTf$Xlpr1b*mkz*iSĕ&Zf5v+ @BټLj_z]J*)qfpzv(]j M䒖p4爵ʮ b=zL9k @.E2\%XwWI%=NW Tui_&??8.MydtfǏ?n 5A5`B5H ^Hv1ePo?,Jiwfa!%2͡q9B1y8K6-b9$⨂Skz\lJɨ8\ZJ7beXJ(cQwpB"fadb%"FƣXN,r6 BFԑ!ƠR11:XwoX;M }@X$&*(p[Dh[E*VB2cVyƩpʕS˩7Nto-H]|9z?-kbuY|x>nЗ1ΰ j꠰N; Rߤrܾ+0VYJ]Xu_rIYUWwyƐ ~W2]d'wrϯo}iͣ_o*u_X}p烛r/ r24n-~smǻ]2;[y sM?Jicn3*AfaW|I'5CMke t |%D hCd9e+Rql\rsR=O)*9ʒpn!)gw!_Tߍpr4Z?_Ena0te.-0/̧(Kw\?[xr3ެelhlmde6_5Rsˏ$/me[[|u|piW!.Kz`dGd7iqnJ6X̖n1kR*oK\)[H'UkzM Jޛe9ܭӽT3Aq Bʁ?=lj<8AjR7V[3a 5^ǣ@Uܚ[VZwϐ7ugU!t fo3-gRCLm ><}xD70Rh5r382 b[0ݭlkg(]rb~b1_c"Z2zw,h2a n:bܢ% 3BL|9|0̨hT L\FZQmB>QmFkb^|~kˀĆjk _v dMC^]/ֆTbf+ڶþXnBÒXHdR+^Fʙb!R.ᵇ `^ԌN:j^z˶,=0]mp/+ \\l¯~[ q5e(%B9! \#孕"(acH[ Eעf"ηgZG?@d|:Da\j^o3};>2QݜKs 9b#(Ǭ11A@B5BGSoՐ(S\(fll0M$0 ~wxr@# ~V(z@-#FT @zcXmQf*kF`B5v~ OjVkĢ#AƤ=ѕJj`!N!ڀ9ܞJ}X//PB )"V!:ǬH#T-B7Bb-y޷LXJn8"a~_yy|MnGň8l-D{`r:"Uy=(j$> ! $H"C1b؛!A:qzf <{0q14 N 7`} " bZQ?rX߿ŷ 7*I <|, Qb@ EΉЌkFmXN%/ow9djW{Kжk߼_vv,"{*GIJM3&5r@jםj! C)bY:gsi3ւi GZsm9 EZpX|טVG\K"HHF˴R }iL/FOdC;NV@%梇DsZ~&/PF׏Myrx{F($ ŽK3ߘ`gWv.]4Mvbj?'xǜ`6Nx酡@ ' MFK.UH.V)D#E6ܘ6pw  Fp0ZLjZBTi4"+KUsJ NI,ke-˩t8'2}ts޵h w:ٮxkNH>0D`S ΄ʼn`W!hgb#L*oO|)0>.MLivוΕ jqZU3;nt}+=B<h6â~5`inR)d ,EzwsކՐo4\;A:AO]=>,6.hP"i,妦l,GN، 7 Ml-?hAbFX-[x[x? nZ9  hqfL:ذ䁅0AA˄6 ryp}?o9yAy<Ɍ۝2iCV"R) D*kmHe @Àq:ڸ X)ѦHIR߷zIqHJJ4F"3WUWL##O ZKMv.q܉sCԑV8iq*1r%G&QJk)iD6Ro[r:. 
>OVEɞ@|1xR NjZl(LwXXdrQR:΁Cp`ya Bĝ{s[?s-k64E25d(o1_ \O\U#d ߛL;PA`5uGG ]^FJaR+G#$˭qR6dɁ}F( 0!!#%@iBȵND=lJ kcmp-kNc=#o'S42%L \ #"뵀;,9_9GAc ю:N8m9lL1ܐz2 煈*jK\+osjù=iy Y n˺ %h%P}KWߨK]^9J¼t4[.=1)`o1]u?u*?J%a&\-jXsEhie9ObR 'Fm@R(ޑ!PP>V,%'hZ BkN/\EXjŽl4q ƟH΄&y:8s=.F>1ʉZ]<͋2/;8rj4&|4IV.I,~bR%p<=zMҏZ8tgȝҵ_O+POg[UFӺzCkxڟlO/ʦOE?`2̜,ф$3Az777rNY9c3[$Asܜ~5p;"4=n ւyDp7:X3e|yF7pg̎1;|<\Ҭ;mGf))n COܢŃ=8vQiPw:Rۆ=yuiۆK<]5Z]u2t_ôr$Ls͐o`ZTƆq_;'ML /)yWE5`0 q};/\}T$R,BH3&(p…MZ_]Kw$#mtbU !!^6E$f VQ(-] B'ײقA΀:9ӡ6d+XÊjdL *e1'Y̬& yE4!x/dqpE7}2vYg4.}Dj cjPP1yՓ4דTѯ >-PU*'sK G7G)5ys7O1\.\H], /86\p}nt34JW5L `du(ppf7.8GKw.\B:-L!-':e.ۖpE-wuܻ)WDY%n:X^*ʇR;7Udb<5t!+/F&qgvdʀkK㬂I}zNٓC}RyM7ZrD sM͙27B\l2s#8!;9:9=Gm$%Q"Rs㣶81D !㐴\`*"uI& GBj?4:;K¶u} /ܟL2`R`a±PV1(C$<'OO,bRڕ/#%I4Y]dJIAe NGW#)>ND;UtNRI@ G`0h wQH >_(Z;A- DIԘ  p N{)2RNl$A-̴.ghm ni9o Tb%Q;18dhC4iV E&o Ey8Dza9LK;lidEK#2IC(la !qq5VwP鿦F}'#^'d6S~}X ْ,gO2Q~1:1z-HY:0;)"NO. ih|[pL!z^!4_NxRIum jjZ(-ljա{'t`b9-̩jͳKӇ ۋYuݼp{0 =7WANʵfm@$/ҙ38}wfVZE'Wt ina!`*&49,T;MM7O֓lqk6uR{GR\~~]z4lD;Ưp6,sz(졅s3YQ<*|qvMhIIw7R^:|dB<&Y]}\,FLU^N7(og7:&o?}:H??=ޟ &ߝ|zɿLCjd;ם?uꮩbtڜu[ySnǥ~}!B;›|U$mhq6hUAqҭܤV'cJ"xhuD߬_{/OJ fDB:!¸S9   #1c 0 Fs!&D0ht+ɘΥΚTp,ܘtோ/l6dϾA6vf&xeN9 39ڴ^L*2ݭ{mV( u֪5ePt1aį~ Z^R^yRJPu>9-T4 V> d(1uC;F{8x4ݷPx5ePFLZ~ Wr?竔a~pj33ʕvUVO5 1!i_Rx;(DI<q|Uxpe;Ml.~'{v:lEEU;e~/;KЛyM+Ez͹L|QpkԊ~XƓHUfrLrILm&_{|8ZstTŸH!'3|Rd_v_a̸ t1pY P rݟ, 4ߙ,d|g>4#FDXJsH\ys0\zPN %adz_;&ִ]h{5#S,JrP MS/"Tﴳu;W6vYP~ _69*Ac:'J\%fV1v8q0a?8aB"W/W-)R '"9ӞF S=ZUg~2q $M;)XU(b w 𿡈8"Qbt$cFky売EcNhb5aа0]r/B!6Qr ZLT&xY)+m$Esט@>$3h$/.\mՒ"v{mVIed&]EQ,9Cx4EZXeTYc@*h8eH`!?!JV4~>.?/fXKxXX0Bʊ>ӇÞ%.7v7/" ``Sg~zd(k2SZ<ȽCOM(#덚À^Y h]>j]Tw9t@ߡ?eвE-ݻ7z^5>y]Ck-l2_swꙻ<`K : }WZ{=<~ӴkΈ6ͫw~b ?a'| ݆8c솸tJRsu(10J 8`GRX"HvP{c7#:mR4Zj,B|WгgHӛax5vsA>)E?S'ф$3Az7Nߑs̩TdI¹EBq ɡN-֫U7Jr@VZktߵx4TqM+(7mҢfnSS!cvybzU=ECSZƝ⺠ m=-374{C;iE?y,#3~s[+Ůװ|anUN;?9V q~fQ.`P*%B9! \#孕"(acEvYQd*,/YIqm*ЭF GJ ȧW)_lEJ[0LaSțFܷw3~g2~g%ciijjY2b8<"w$P:R#,Z#0kJpLʷTR#K0DXSU* b1sj4H"0HYGHl\&#Բ 0L5(x .m~g+004C&[j>FwpKbOt[\JER2%z'! 
kpG8E"":egXGnW)yg1-L9 48H{ʈV]yg2^q/2P9E- Q(s: -= h,5QZ5ʛO ܳfm:2?,)XPi2Z Fft p)Q|نJp#+VuV>ڧpxt0"k[RS] ?'`%X-$]*1XHN1Rw!3|R85 =R8 RIl̄vS6-F!9w;p<\i7*I <|, QbĀJi!88co읇R=z* *ҶQy ;MCǴ,.R j_gj̾]I6)RikVVL-"aa(EQSgù ,n( +QkQJ0M4XdArI8 tȲXpo3P0KqfP:$y`aD{P6ojGi̴p}^6ˠΖ8=y%A`pNNǐ9^iw8j˨ǀTX*9eH`!xJ-5[o@tCNr 1RG"Zdũ$`n,ĕ  GD)a8HeF I!ǭXx1{([5}}Z-.Sp~SrYl['R2֓;Dtk}s2\)"Xd^e6q9"PQ#J϶Md-+wᰠ5BvO~ՠHX?S !$ VSxt$HڥeԱ&("4r`'U[g*4Q(aBB1*e=FVK҄’k>z`@21Rg3rhأݜ^z{G:f OijeJ(FEkKwXrXs2,ʡc-k15Mk"KK+0G Q8XGf&ᇙiy #ޕ`+H?Fg:oJ\#̧ w0 zs|9/U6W6ơ¥n/0&:6ogeåo}h8hc͗RKXZ#Xҷ lpEHn g\S} U_ D,(wt|mUO͡ [RpbP>VO)WAη/h^Tswи?l<3|/_T̔pv\/4&PlP3^ }]-(zӱ,i1_ Sx4[.=1ػ<h.IRdTV :y9wѲGi!.nY[,oeK;oKiK +}~dd6l,V(@ޑ!PH>,䍋K}#n C"tv cr7. &r ~Ra4qH&EZ8 =>9!IZ]*,X;!rj4F@4nT&WW +%t1}x2[\z; 8In{k}CF|FCr_7ph?w /{v6}+`nzNg0kXM77 EQIfo4!2nw䜲s*%g8H(n1X^A>sٞ-:90w!*uRn2y]weɿ{ܡby:ܲ{ mѫ+V^'"]WNZJ IKjn<$>LmK:]7]ez>q7aѣ9xtO.q7 IU(/0vIgb GB|Qe}n9ޓ60 j۽f <:ja>ȵ tOVV敘*X5-#狚²'N^XA`'=z.gYC-f~]yp<ʹqaox-gkʐq?{E;-,.-.䊳uj˖Ex⚌QjNR0lL$ *"KT~ g)irW1PgIezpX5"2* fk JM3HIQ)3dt}LhQ}S/ !@ H8@@y2 bB1<`)\(fAYLfF/ ?IU24x G)] oG+ aIU!@Xd$6q &hIM-I] ϫn"i6h[VQ) mX #/KGK9}ZFpgLvVl)p@T bNzgYM"$n҄ lg:o=zu6N _vu??jjxG1yzƚC5u'ZL+ќI*s#ʅF?g d#۩gӗz,3wDmH͍H, <!㐴\`*"uj#3gpsި7c+7:,:%Ap,9F3bQ H.x N0 X ł/#%I4Y8+$EF*ƕĈN.S|&ȸ|iMM񥳴vPA l17f8 nT(!%QTKdZʠ,fPA.q9O 9Ͽ{IU7$BF*A'щtT_! |yj@0TVT$A^X/d3|Q[FQ\ 3'{.ͨ*Xlx(Dgy*s\x厔πtXU/mlIڒ_ IH9(Y .l!™7:å! 7 ^!0\af̔ ظ#6]']|ܸh6Z]L6wK&T*+Č)$ͅ`)<8Fk096T_rS/~ƒIw}K3 ٘ggFp]Zn0ol uW$ٗfثTo+{w\WxmޞVΩ-R-tdZ>Im=VD$@~ ? B6I7T%f:ƞK=u;޹a 8 ﶒc[i'e;)Lv0i]J&[ZXʵmN)=FG;nL@I@b,yV쏲/Vds<9D6veTʻ9~4>i`&g43 _>dUI*|h׃WM/?=|av&zѿePOu(z)1J)Y:/.>Ra <\Ks&˭&w[81N+F+ߞBKډZ;WwmIFcHKhXM, LuLa"3 A`#uJJl{:mYBI K5ekפ{v!wUIFt>;r5鐞!TCe8RIlwr&&1d΀9$Yd|w K }+Hδ'Tc|z1 ,V[d(Q\a5)ʨIt E1%FGXȕ <bM8OYR ߀DՂPڟΫW?y5zvuoL%5|vrE\ *d늸/Rhi?E ʸÄj*%(܃2In 9iƔ.vx eR0JY! 
2D "RJ7 91fd5/<3hjON*ڽ֚=lV~U=>:O ut)) >)3~CՅ)BΥ)Vٰ;XJȃv:gѨ\+s'$b[.vf߽0HLgD>8#S(bWBSG(=Bj9Zbx!D;,DF* JH9S h iFٚ8 ָjпo}q/f_9e+MУLbx8˯O+}ItozVƸFD#,:b95ȥR9&udH]i9e\LM(J2.xi?rNY9cLY=TbBƵYkSf;=fOokM#g;=ųp&Oe:_ ¼6[iѡ蠔n>;\z47 1ΰ JLH['yQBN\@tJ?0z w3-Ψgf̵%^&䄅L]KŗOMYU1"պ; 0dXu[]VZjTKN}՘>/Ȝ2dG.B{F< LޤO2/iē\4IJ40ZGpS HpIJWWZEUX\%q5i5z*I;vo=2>3\m&.fep3i%}Lڶ/p;zBQB@`FU \%i%yp+B1蛞Kd񇟾=q!NܖeևwRjdNb9ޤ'18-抚l& 0qNaI%f."MT)[ Qc] أzI )N}>WNgW\i)1yL_1G:߫56G8un38Ǭd4qֆ-`i_V¡3\|ʥUEmxU3ޫc<0mN(gHgU4JrLS/4PSќj@޵EkWG:ԸaqNtz;tfێ-'BT\cޖ5w$K ׿JӌIafqfga *Hk ITPox3N0h6d6{Q$^ug铮 _r:_ .؟Čޮǫmo>!FZH\+w-Iv~dgp\bF?medI+J3,_5I=-JMْd*1 G&QJk)iDEn ,/z=3ܺI~ųM*䭷 y[T{J&b˹CDGIq8V+E Lw=.b} 䮕i2^6$ߨnYv)ܤ_4O~ O)f{_He \:ǣ#AwkqԱ&Hq,"ACKEڐkւNt30GgT  (MGVKP-:}#WHUX# exgc6Mpʑ[kbdZdy%K1P(Ac :N8퐜vvG5T)&Q;2Mb϶%"\*o&)\xmF'>'P:)=;p%\<6xH,9DF'Y*O%W=Oz-S*:0 |z;Yjwh6(vpo:[qsb̺I!gqU Tڽwr;6~?{L+ELTǾ2l;,i*-m։6IꙛהeQg@yV(NǼ-NZ|\E&x=pXWT]U(MH $K9Ɇɜ6Kx$ KhJa]n'O{2=upM};.Qx.kz^^?VfD'6&gZsv[hv*?5>GGrι:N|N=(:w=iiontRNβ/YFӅGK>)sCoy2M{OϨ6": {,p36hVpDąF`[MS.0Lb$B^mbH+V!祴QM)R KI#ӔF"%( =wޔ)Ƈ,XÞE\Hs}|24~6RBZ\_#,9ΦiJu26٨davw3j*ltKjIuf$WNv$fJRXvC1jA (uAkΔYepU aoŭIezpX5"2* jk JM3HIQ)[m@۪|^R7~03)LC*ҁ};/\ 6H*e)X*fLj!w P`%> D4i}zl&o#/ ?щU24x GSZQxb丨t)Z^ \˒frgə![e:VT#ÝfQ)(C9bf5\WJNlv2'/c:Wݵ:Ox576G5`jHwB SaN>>kI)~arM͙27B\l* N1LݓLqz<'SL%Q"Rs㣶81Dups8$- "u j#3gpvO:h/4n_֙d) „cA5iR8DRNvcpdP),5]8bXTOcA[ )42'T!0$FtrKNȗȹ|,y~hri/PDQ-^j))B]'j!j^w"EGOa˪ ?xqD4C"=3CElb(Gۀ$&bPFb FMa4ˤ3F>Yzң9R/';~xR5KD騐i@ʥt0!y̰:n0`C)C2`9ii>:̕wLrc .h QQʹCOG :0U^I{$Z O}@gI@ G`0hO;$; !*DAq1% (3C\ni/7~3\Fʉ$h%[܁e}Z+ yҀ9o \b%-g߫L!LF['Lp-X5y3~P{}?ueAͤs-9Q:SUN{}_I"iZ>la 'uٟcm|Sjطrnte/{N o"oٚ,dbl(> Q#ߛ,x(ӌ ./ ]~*,ǡg&/YwgM62ԪdJy~DS1T ۫ 9ӫiTW7+ءboo;խyλƷU}pI6_{]^UsKbnװH&+3On? 
ް2˦(R#1yaH0L6X> 0 f0bJA&[Bd#hl]5l>jr-a8vkF0f ٰJSyssYQv`?L~;hvH]"Kw )l/mSi|gdR?wdyDd&5~{hH[Cxb |z6_al$]R(ﷅ7Uͳ~ٲU]k"J[L0V>1c($gg~3 fܔE̙H1 w BʊKsjYS\8 o|Js;-"($Q -T0 4&z ;#<Lx#tǶ6t6VN W85,edHk-ӒZO)VZœ_)GsÅwl­y^SDLS6Ootn3-:G:K:Y򔼺 1J)Ah MQZZ3a\nM432Ɖ w0!09m 8l\X [}S@fHKhP!&ҀX],CĖbf8HlԱ>6ǃMUVM&5ԥmܓV9 JǏku*]@fo<|I/媥vVwmFbBǰ_$"$_?@*W^iOr#ͩHC`'ch[_2'& ENQFM*Ni(")prE#Q"lځT ĝEwUx(b^yjȦe_8"Ƕ]˯' `taKHzQ4 I*p0 YGJ*yUJM޼ 3i=׃KJG i 8<\^?QKaQɖal]R&(Yr]m8垷Bw ^W75 % i Zi6{P!W7]őm} O5if½~IDV {g?UʧߔuVu~^Tg>nVd:l<?O3JEq]w_v^R^f8I{)B?h]KY+r{ j(gD#HR9CZj{KS'&z}z$9ؖPbLw*/t SB=5]EZXeTYc@*h@q imD }SlB5wSgUgMP470enOr5{t8+nI;>Z[o;w&evtj}J{{b>҃:qp)KT5f+FYRUZUtȬ׳g0ԄyZvyxwbGK-p'zwg̓l/NqGwt e8]p[b'Kshnvf\*~nveZB3ѬX2&.끝p"!uE@tJhsykeaB5qA$DY4cP;eSƏCX ޓJ&xߨd i.scDhq2l>?{׶FdET+a13Xl/ʲdCR݋,^D,d%)J*6Rbs2#Nxrvő|̔aUOl!2LhNs4 -n}5eO^C>&b'G6pm_}4 .F7F& VHT"`aZNuݬfP6A<7 ֤vz5vx]_x6#WucTY$tfOVvXϭGi5]Q$ I~OrI۱o&n J&O;@hmrLyb:,e=vˢ%) Pf,Q>ph $=OF͢ Yu_NuB` |L"+A:Lh.QVV9Kmv֝\SB3}[־8y-˾Fk.-C?\JKog4DF~}b5E7 L3O3&!H5XQ+!( @rW5PF,[P.AĐC)\K;*ԧ՞hm\՝øfKo%s>= a_k6e'm4:07̔dUCذ׵gma[Ͷk {Ahh#[r&^+,8䠴!5e5>ntؤEHlXpv|$;-zGjk9GyӶ;NҪG&W,dZ{>4rjjGs0 `e5_[>vCn69}טthKly(~ܭ%gmA\Ds9`<,_%jd6z^2PFs^e.aFܽ繫ܫrbXxPtkA U?r^Dbp2}EoOōt NI崎xdO=xݟKҫuA'bɺcL`;@ƍ2XS) BDR]'~P56%QFP Ay3giY+Vtv]F/ZF37uN\~sWѭot'^~ίc"WNs(.-s}B_Sվh5"XRKzFg퓶(FIBӊ1& X WqU(ygZUD=7X4t4V`3df^lyC+kφ Y]bfLИb; $8\^eFY2L" j[}UNOjSC~}N(/!Y $"QJh p(AFj!/!K0Ɛr^׼tITMud$R4I8.LL'rMtSߞ B̰r4;zQ7 "'G"XLZ%PH K,gN`I#w *< K=`o읥D.On%3N2ڶr@خ ;ogwIёzg*fe7ADݤV0p8)YFUT %◁kp)8i戤]J&N[MSS2dGqFOAv!n}cB܋D B1CRzzט s OQ0PӬA-iH Q$3]pcD+ckJ޻Y!qv&Mp'޳5 ZuF S1R,1Q }1Ja,i0b#MX 1.39 WQÆ?˻ UP8.}1޳Q(r:;J|sړᘅY:&r 5.&_?f2|H,$%A`;ٓ6^ 7rQ/K}n9x-hǐU>6?eADϪъFa#5wtrjJ A49X(fuC5K0CiMtغ>ύd7r ̥QG c̉,#dR%%9ct湹x̃;`u qhnVd8dz՘h:ԍUh}}6O_كB5 O\\zd^3N Bzɽ]JmT3q4:_zcgP7运غQPZ_Z7/Os4;pu[h{f{jrÕ?soʛ+{C1{3 [.r.pWM5o~hsJ ?Zku1[k 8!nxKW-mQ Na<{z KN~ >aYw:75vY6ÿiVA|\[8:7:P?'J_XYlT.C^? {XȧM%=NjڟUD\z/jubҞMG6~_pX瑩LwAm <0_.h4^|M]-E 6KQqDL{~O۷ k AF0N<kv(չīd_~neq4<$jzj5c@=K])f;#BΙ!JJN$ɡ6F"St 魛A'o-V;! 
-?QLz@\8ͩT-"tڟ5vTH#"ώHq3.۬qEf-b2 $U֣rIYc2fP6Ar~;\eBuJ7{Ϝ&1bkjCaIqKI3Re rG r 0+Q)+ȁU.ʸe %Rt0eok Vs\QuZ{f\ԶM֝øfw3͒9 $c .;lс}G螤W ažm5ۮ-SâE柲.wmͼ‚CJ+R`QXFaMZĆاSڧ&!ikRN^CrѨ`>e5kqZl=2b!ڣSSV;uU!oC/AzKG֮uH&;1mɐ-;eЏddu3_O`RCAW:XL7JpՠV&ܠ -7`Դs"]<|NgX^PkK`ACdz}P*7{װ(piTcܜEŕԐQl'Fc)c$dLGFKfhΫ̥`6̨0h`kg`W! MY0dmOwOrg|JmaL8Й@MAspec֪Y81)J5.0pKz. {O7hp]s=Se_{”=$=߫_>~V^O UVвT "Z*_jΜߊ:ϫ*2![DL˔l pAwE]/ 'u\\_Zw͝_ً-FmbSe~>>דKXYϤt# {%qzS)b#,bhq8U \`Aq\HFeI75p}R鑣뻩`+斊fMFdxy){LOmW"R) D*SԖl6|_ysI>u:#w '?qZb"BР I21!c"aQ2xËbD̵~< p$a F Iᕭ9;>;McY-㴮mϠ=Vs'- ,ytzl0_U-j8[fL+<zzR醓$$ y>dL~j9w()@J!jeRDȼ0!Mp9" υ9tҵt[6H{--ꛕӥkRM@\޿{i s<:\xRLXT "XXEakA*$QaBB1*GVKP%:}#WHUX# ,ݰu231H&xB8HS-S@1H-^ XʼÒ( rXEN:Iۧ=i ۃPw[^MKQ;Z뜻JiڰoN}1'->O!h'S'Jyu)\VBaG78EaHɄ&eZ8K=;NR/8Baq?<=Mx)ٍK 0xz拁_3*n;g}=il{es)0qh:+٘>:MESѰs|<@YNn:9LLM(J2.xq#%S86.9ùEBqIow* H-_NaO?-(1I[w˦ \gpY\&\\V5 f''+u'-ruȂ  ŵasyyajһ^9nkԭܴg7paFfZms`\ ܓ; VL:Sjϛi#$پɇۗ0vI00g{nB|,nqohQ)6? &[\f=MѸwو5)&5g|fZ)}zo901w[42O`o6T4[ GzJ ujDvұffkh*X.Pt;L  @`m!x%9QS/- " kĬP z40qGnn .6T4.ۄ= K;xh<3+Ir9ޯwyFWkeeFdؘiIJT,E !l<8crs߱-&h)4`7כ)cԚYqO8 j#CNî4ooCqXĬdW:aH0)FXRF5,J,n)$Z2l!hb$ڼQF|K{)3c>SX(炡?ri ۻY_9$QK4/\)%x>ߦcfe?}ͤbvI'1sVLQ (H9S> g)brW1P6Q@&`G,zȨ,%ZL*P*A u0 8*Exf֜O+$YQRt(P[?/ 4=! >F: I,KҌI-*_SPHDf:6|iN}D`N!D`?K0>h OiE[aZygt&Z9HD{u8ӱ؊-#!尢4JABIp3I䂤ٍ"TDwz@:.mjM0'05ѩkJG܏^[t´1aۡZC@K١4QۍL;S$徆*5qdJj* 3yg3u$YQ+"57>j# H4Pg:Cr%M:缻Q5ߕo"we/@LP`a±ʹF)")18x2(` _G KRti5 8 "I#suƕĈN.S|-<dkm񥳴vPhAri ĭ# %$jRKL:ԼX# !ϻ(;4cBz$-sL/ơ8MF[OpU%Xy3qPe8X *h՛img_s[rjCO {ڈLR.'[iï0~KlҟBI|'#_uy2Mc򼅃ᴈ j =\ x$/ tk$LyE s'UIaիU84>HpwtP=5ϫ@8AS1딾۩T9{iT5a|rwbOM.^\>|~|n5-j7# Yx_̷ {gR}= k$.U&wgf0~aUu5QZkb|}M_ڪ!0Zd"`X`DU:{_zM#hsNW5:ILFR\~.~]z4bv_bX[PC+=XQ=(tߛ@ML{w2&~WcIGŪU|CW] ۅ@HۏN޿ۓ}N0Q'ӇN4$&Hv&z(yܨ~ڹjHYT]V9{ b-R/p#q D%m}ݥt k"J{*&Uc,JW=4bƁlj;8o]tG>ܻ9c2qkyrOܭ;w8GOS݁7{ɄBQe1\L"eeV:JxT#swY6}_l(N_|= b L *(9|wum{gBdRV'R᱌&&IФ8+_6qK ,f쌱''a'>m!)YN0}I%RMJlafu랪*wJ}M(4(]znC}!C9{@p#pø>;Z<LfZYΥ=r{P|[_w×k+Л:|a Fņz~5}URqqzdPȜvbpZuIfUXc! 
wQ/d wڸyϻ=vl\Xuڤ.xǜ f2xRB]0 u a2 GI3A–^B{:pQJ?nZ݂J mҭl/_sy̙J&*Uy횈['6BR6'@a[X1cO|Q@ ЛpQY~qJ ˜O \\1hXڍ~wG`0R?}ٰpB+(3JPm,]p9>NS.AMZJoel/7?Nyށ>MncT=M+$ ÉQYø-Jl}<^ ?q?~tZx4y>rގx> G0:\TJp7K,=]F&Mt-:V:q ::fܡ7ˤt^EFga[}QesF-QdM&qFY+ 1"'.yM޹"[.x}ݽÙ>3kܜ2q8߻àKǠe޹nl1Ҧí7Nv.EOXképjaT;zpCtqdczt4']k?֥(}<0[1QbNpKhT/n4]W>#ʈ_ޙ޻s'iԻrc N{7D3z;;@ٲ$o$2HG=W3^o<؎ hf8vـ'1 CMC߈!*cڕ Q=|#><4yĚ2Bi3X՛>eZe:ܚg&/y&rJvzeZ=Z3Z&fYkbHAM":!vXpDvh3K YK\^k^(_yF9h6wTSkճf> tR8=dM6$T+HZc9cJVu0uA3!TsQBSPj`:ubmFEgaG8標_W7nrk%#X 瞁={>3![kcsФABfID}5nD\R\S&9;C,झ!e912 +P#i.b H|y">O'˜TtEi\ dvV}rԡHM"=5憏$R)K/>β`8вx. #(X9D^J,K#P#O94U9vpBDNQRv"dt_J3uPiHĘ= |cmLBVN $Ru_kCPA4 G-W)FUNF3DO]{DȌIx%N-d+K0'QEIˊE"4LlSJa{jp1EOs7QnypPxFPi|dffI] 1,pEa4/nϳzprlA=O JNX0 ޔO"\۝o{6!~3 If{ X@qgI<9z#&΃n~~ M(56y=`׬Dn`{<oGpp}qm 3IL[qSr7_fǛJ⑹u }:D泙F7L=3iLROo{Lzqxrecۅ3a3z=|xQd-O'unyK_m 4.wM~6 ? ,(kZ&Fk e:<~ئ3a174/ԍ |m$]& e1mfа0&8s31{$TKxa nTW\OQ-ܔᨶ}y.]M:9D4Xv:Mx9d~b9y}ρm4GӫF!kuf%ەx|>Wwg%[kki.-vRv{oџ_N 7eNsG/=v RRf۵iAh=% %PhJucuPV{^FO..ӤTiNFu`B5Vdߝ+t<2F[R>>ehX}ĊyOE~(c(Rh:nZ}/opol#ru)F;!6ղ+!6lCl+|b`-DmrHFOOӰT<\p9f@k.cbmLr~oBJk,-;|#{4ZbiA&(7QIi/$Mw쎗IB1mgs]KcKkLi ؄/D/ie\)\ˁ5+l+()ݻR;JI%F"\ C L܇ק%"F5_$Fk45st '$cRf=7H[Ey)-zwԛ2nQm]~T(b4DY+Bݡ+,cCt5 ]!\Bv 0tv>OIΧܫ}j9Z=]-n<]=] D g+dW *t(枮v+u}Wζ՝-%| Qr]+X1l9wOp }q7]B#m#`TG^fBeksu1!5 ýIk#VA!'*S% az(B(K&2 (,ut:rMG *՛*`;BY+r0jnY|ׁǬL>cFk4ai)%F >_&S[b 9Ťsl#4MzRIZmRCHq_9#HP@g-(RG RKM:Z.VnTnj^(!,},tt}uGXE[- 1  !1sirm<Ec:Wƪ&!yi (P*I&7FxE' -dϑA1woAhhojXp0L/7UyIJd{*ۡxw=֕j(4b`!l߹"hA.PP]9˂@Ia7eYa5t1Q 0'{4-ffmyG8]kHEq*${lê??;BU]ű~~=:]#fxq?>3!> 9geX Q A#&46@feDSTQtΨ0HiԢTT^+#ǫeZ6+ |} Z E1E+(ZܥI |;'ӛkntoVn,Cc-"q$✩ȘXHӉU]u` ,S)M5\_FӼӕw6v(hj8C,k陃yw7#֐w,ыk3RqNn הKZRCܓa^nvGʑ6TьPK 'b灡6:2L2 ѫP+P$7/}z\s2"9gWb_dpC Nr V9CĦ]ش߉J r>VM7O|r (Dd{yQ,R(0`0_yPgAކIaT yi'wDy- dӂ.]R 6=%N&O8;)]j/3If*,&C**iJs=U d ՚wg= YߙUT1Y=ب Ȇ*|*L%d' ry$c@"Uk[ő*^MӗkL K5C+ 6R3HLE"2!yDcPGU톟2V%J-QS-/62  W_Wr–eє 8kݼkdR/p;s&V 5wA~U/ ɪGeId:xzM{>FcڣTaDo#N˾)ԍG:HКXn~ct'6ߙ?KTw\7~zF S&[Q (\Vi̾Os,- Y F`4JGvwV'mgK n.M~9GG ]M`Ӡʦl6yX_0՜!mmCe:`5$@Vޕù-wa^gy.4p@"ejtL47 
6w$=+C5FNM<%jf˝/Z CʨV .WpX8q0M#ILgo(LNk_ KQ{ZR>kTl@$$DITH4µ)^֌K\CvP|%YiUb!aQ2 F 햊Z8!:!ʎ٫@ZHp=,ii0%c!G m'Z$v6l9"-T++MrJ ʏwL&(M<!|*J!]_ YR> (', e$4B!H?Դc])5v,Іk(}mb@T` M :t {BU@Y'z5L6\LJ\87eLE܏AJ&ؕ !)#uHNTc՚|M0,ӓ\sپlӉaT NYڷD"І|D"IT9"I%T97IYnߦHCYB3F>$pKv<_2&*˜+0qJQl#Cx< x;5Go8o柡d ;iA%#r%Zd>DΞ0 $Oa$*=m5뼜uE\6m+d<΃ }_q'6Z*?FtGAv"MIя " 0DͿ;'nT@w+2+zRYC)Ƶ0aGb5?i=} xGC+)*-ɨռ^6[3 %8> m6# `P\JZ 0Xe1:yv2 d2MK;.'w[<X?'ÇQؙG뫢Wc<HҏWKl{=,57LDaIdDQ"VlDw] S@1ɱn k>̟[* 6U6G01'HDX,ΒcwJL.C7gz&m8&'p:4'93l'&REu 8aU1(;þ@iLu1c)HƈF2c SFџ=3 Hkbc$:ouE(v#0x8a9yⶂS,4x"9<Ǯ|^ ;ibf>c(jg^y g!a4t{Kv_<̜qXǟzy1e֡IllqH3 bawYk~}|[mrڢ'\Il\~N#c.2`T0qSUt!Շg/L>퇩~ Cqia f/DIdjhtb@Oql>ˀL6-VI~Ӷy4%v0;whEFƄEڱ(V@,x%^F}WʹB2[r.[jP Lظ=8ˈa..&6j*Kȑ^rQG*^7x;Y~)&`ljyTqp6M 2w~.=1Qڳ̻0ۙSonٙv?}|wfqGZIuA}fL+#F%+lQ)#-h Q]gpu&<\v%p1,{N Wr;|?+I-{}x!Nj4V̓g|:3~OTbw>$웿ہ9F?f#˥<0;t2og{TPz}ئ3ͥ4YpyOgך%eBmQ3sى6}OjrT)U Y3/'kvQm Ռ\lIv/Ztw?o=Ӵ lr@2ѭ$SϢ́x&Vjs՜Ii3/|"1n5Mw.әn3>S]If-Ժ.Oj=Vh{sqZx)5tnwFP`s&4c'7k|!;(y&c]=u#HRpvzVZ!M74|L s(9JwKWAhW9XV_Fw߲0"%UC"w -tPl ԇϾ, ~ДTѺJTnHgF@CVp7 d &sk)C~;Ȝܸr _5GȔM8ȴ>QY5}m*i nx(e7>~Mv;O8%;:Tɚ쵳St>Yi= ugu~gT6h7b%ay<  >VaFNn m0ݦ^M#Ʒ/K&n7}xs߶S#q+,"IqΣ,FEQlPb K$d0^oĵ6h]o8~W=AqY vQ8Ǔ{)IĊ)=f1;'#}?0jܾi\5o?㗟ǟβݬgz>|=?o#Oڛb&I6IT]Rr@- g"MF$ISaH30,$i.3S)k~ٹUj 0>,LPbjWNAH Gs԰2LZc/4D'7lkxd}yHxhn+,BBWw>Ⱥa@~<# s 0+wʬjX0BE?M STUɶ2na`,*t>^5$&p&(Tg[(t읿maK dP0֥7}qBaKpGKtC@W60%haˮl2]2잠kYD}yLBgcaJҿ$= s KЁ8 T(?Sc"z ,Q"yTVt92D!TJ&1=t0T_{b; M&&'*Jcru`?jՓ(R.% עRo<~Ŷ@8SE{`NpBvFA8=k`1TsCso mU#n4m HU7sZCփ;BR [Aj;E/& ,>><ɪy3kc7ݽaHk;lǪ5!@0IR(I2 2IԢ9XeVDe3d~,C/vܡlksb>wHse|v 8ćl3~@GK^^)|O! 7=*{ܜjh<4ӓ)^ϱ_)5%,~o7M?q܍>I'pvp\uF>X ̟!x`QPqLӂ9! PjY}'%IsN 4Ok6}Ĕ@P)@Ҋ+ٔ=. 
^V9wW?4 oc|fZ&%U( LwfLIpIo؝H#hΊ3H0*dყtNˤ1;N O;?f!IYK5!3Cluukcj1e)t})%ƯS EYYZ#U <P'vZ9_#nhf Pq͘ҢDCEɉ2lnz~+ CaP FʂSi5-yԫ [O+~l3_Y쾧%ĵ 7Ti !Uz]t37 kRMȊZZݨ*{f]Y6"#VWߵo3h"0JaE i>KoZ~j;uȣx$j;țmx bO: #! eͧlAs0;nʋA&$(yAj!D.7NzEu5`E>̹[ GY# 2SՉMi]N*h4^"KDz Dz\t$ b0b%`>RʘV#("KiM )"((^RYc1vdmihnfG(+8HMpI'ov8ɍaJ.G$]4v&kZl-z7D۶eFJ`l_?Iڎh{dpQ+﷏+G-z}6͍cLat20D0IY< x8MqQJ\VGm# U7Vي7n8 $j!pOj߫K4KD69ڢI)0Y$hY0Z3#K]Il idr`RBJ䙃`h ǘʳ!P0,9 "ޱwH :YZήu䀦rﬓ~ͨPo>U/ԗ/^tv䧴H}ŷ*#{3^ h/֫ܫWO`I$^!uS G'ue,9իn}a!4+'߂nԤuP˛gÄG\%>q-V>PY4_6|M͎*ڟll9 PzXrz w7i2}/)nw~zv]3 ގ4dzg֠rt޾ݝ'(-Rj,l4x`%]jݮ<;j`k*[ݰzS:7˳w) +-^ú{^RfdNʝSr8/iExPj M+w1Tz3j˼,NjtUDI`k8orWWgǟzB$ؚD|H(GwL~2Ԡ=kd9 v{>m{jP̱GG"8tDX+B̃pZM_r{~*P۽Q?; 6s l H/CR̓Ys .GL`{,[A^5ox6y!f?$erCIxꛣS}!7+XZ/ 6Oe}unlMNJ$=I1TCY~w6Z]ѫV1o䫝zH,74h3𘭇Cx.("yM[գUF=hjۣ5GA/cTu}`|/73B'i2lG&p6zO9w1\$TO;V&q^̕bwL JMQadJrs]]4uo&D?OүP0Ca:g0F Q BU8 ,x"r:9Nw~>hkLd׊NVuN/HW_(p/+O[Lw>} KUrU*bV]gVwmoƧɝ_~UoVdᰣ~œ+!B)vYD74;iύra #TC3Ӕ., 2T耡 *e,rz :&b eM υ =,ir4.k J-hWFGdYzi3v%T9Mv-RA`=ltvg:% wDg+=ctt!X,oJgXհՖvd4V4mJ60XS#Jgvভt CF ay89"sVtB2v͘3d.kew .cil\23gt!)<&1KʗFQE{4W+:\Tb8M;)f=-s)w6J4لүںҕXLt""52ītV&_[ɮnO'|D E*E:95Wg^kCma$t1AJ͐xMzT(h{?Б֘(3v:䑭KlE09G@vzq/˴^hېбmWMGjfm.GEU!KXo}Иh Bh]0~n~,Zz" %gau%N @tR'Q{Ay jmɑ^(D?x\qRqL`ctsY4,|ωT 1O:,6 (9B,;FhQl:bZ@`2VI-78cj]|#LTVՁ,w3QGrs&Zfb4`TNjs%C܁Upa1ۧ+zb0Z%k](va!H7 tɑK,ˁ.c>! A>ش@shΈwd>TW 9x+- d"9,ymG 6P0a+ furP&8&))xPk#"'c U d!7ŽHǔ]DZ* Aj1f2emN hC2)2yMg;.  k^0ELV1u1LVKoG4+FJ=y1kb$`NpceI*ȻIs4=h+ W6&5<LL*ba?E*@h+P NLd袆2* :vM`$,@a*e93Hk9) FȧX TW"'2x˴hX'A6"ZHH#H63D$-na{J\EYVnjnxؑ!e Մ2L!F,gqG)f[\L奫m|ţ0NY gy8 Ps1E,jhoOoJղ6S?kAfJ?-4_P!4:nn:r2z<(c*LAc ~Mnx}..P#\\08{{VѰ:g ގ?w4x~l<}q'L׿d&+f.FY~qtn9jȲOl՛{W/8UJI<;9QbTyBfŸ{[JM=ur{w}J/ ^IGۙMcCqĴ ;<٫ΪV }v֒PKg~! 
љ]Cvg]|Z9}jf-Ms>ZV*I]bkwm͍)ymK6$MKRS$Hز%Y4gӐ-l$h.v 62ui0B_ט`cefɆU}o|AI eb$yMv9KDWYVּ; \;ƁCrZV]xεZkEA ()˂*n1Rڑ7 zF<"f-w:3 <*LG6Si:},,2sM+?L?Ɨ448²JXU%jTK@2kI[2~@84{|m`GUT$u8Կaa>8cvØ͡_ǽ^B=uF }6q&LCO% <Ǭ`F(xSu<3SB)KJN۹g%Wc,r#x=7\* :65x>u5/syCyL ~@mQK{8)ΥiRs_<8N[3gP+GF8iq(}Sh["7%$ mTE%q X5?u6GMo(P o/>[c[PIuh!uY|]/-zG9'||χOgդ'n97O Oʛ^e?>|O|G&ՙ=ʼUv7 "ѣܣ%ƕD-0FOV:O+[wwN,x/Krqq5ryആGA itit6uZpV0^%zw3UU 惢jC:u*K+"hÙ!Pf Ø18랣Ҽ1,gR ~d~`jzoWMա` fhG,İ s5wMcG|g/O`-Tt2YO.}su1R#zjE)$kVfQؓ')he8oXY@$5Nm*ucC˶ِ5}/|3F@&k >5k1΁Ysجh }f-/l׬e: sKb T2g1k ĭU*+#ATUon/..0h}BF; I1z#y#AXIS*A1S˙HJP id(R=,r7c#ZGy.KE6NAZ$ldFuZ)ݡGf(4t_d2Xަ]׃IJ&f ;&,'B:)(ԜcOK쎙Bٝ7%*P *{LH\riM+$SW^T#L'6+=3Cɋ]Ԥg>xf `s ' #Nxr6.i1o.a3d[E5$m;G$gq8Yd7MɛShv}bgy7ߢȰa?eX|cx7 ڗ朂+-^ϰ`+.I˨YFEr }C/X2\r܅R_i܀Љ9ߔƯ0 R38dYR٥\&G|t7@t s:tyY+"j;4PtN0Cmq͝ coa8cX9+sūFi|ii㿒7^1&nu.,KOj ~|o߁7^x7^atw`:&fgc1KT5rz<5tVB:\nήSlrk8:U,a=f',Sm95LV<-Qq %q4c4U*0:_:_ R߽Lɠ /c0`CDv޹IInr],'nkq`W0hV15J21&XЄ a Mz~ MrK5BA\S̝?8gzw}~,|wyvb$+squ=x2Iź80ڂ$륛^(.Lx4Fu*e*g6C0AyI|wh(\ >&`Nd,~> xN31]ھMP8p":^B.حok_,plj$Xca@&10,'KΑݕE.=)$ao˛9:>M=_s߾-Olb>AtIuXi/T]|/EQ_prPue||[Nn2 xiM׆VguPRwOm#_qC-+us벁M% NmRyu0 K=% О:Ē%Z3cŃB K<2iua>831V垦doz'aتPu86QQg`T` 6[/ ԃR++iǕkw;Sơ6PMrcgkB49cee9MtQH W? p-36 (\*yFa!2KE%H1Er;x,Ne1j2& t>QWKD9eO"Td0 MT 4#)?jZIRޖ2xye\]\]\]\]s$`-C^S`"BgD[Ma!f /0b$B <DZ. aA1Iń^0Z$𭴭"_pj"ߕc8pL!)U&,","\kNE6ފHhy*`p[M$\*CB[&%*#s 8sTDFgyЂ9iFQ賖$ #Bi"dĺK -"fZ+JjI=j"HPDj:ƵTa{wbw甔>cV'&`Y_I$8VbFǕsn=Hҵ15SeK2%9ɼ/6jI[6&;@mTPBΐƸ~X\Z]ߜ9e>M{ExEH. 1vH.:$!-FjM+X&^ T`$ ػ"Z9Ă,PHmkG8~*L^Heஎu=F$GRr < 11(~ Lǥ (`8~ņ+6^A;[(ĸAu ,\ASd GQcZm,G h 1Ryd"fEaP%xy-1o3 uEM''mɅzt7汨\UOh':EBWjt[!EXO^cA"HpʙF 0 2!g@CGwz25oۛw_Y&( L+ˎ鏏32 c`O; ) W̖j,}| 1NW׃ߠY[a?s'pirnonw8<{Bɇ߆ {_mnmonڞCv;{[;7luwz{{N? wyv}}pt]aF٧7sЍ.w{ƗӺ2u_om{o;" L]]wb+ni/ok&˺g}{tcL~2}5!d97K,^.~}0~B;|ߌ;O \qsptGwM0w9GlSPü٠IdVG?/v.Sp(0<~tI7&On_Ks`41hL'ݗ?EnIWGMϦz7V蛋#L{? g`rg(g8=\׳a:EGߟ|>4.,?f{kzqb:ۋ]~ȏx ?OWe{\tv|dI{<;av95ׇzxtayW{[o{?٧<\:8;yujpr0L>t7xb4W>9ӟX/Ǜ?ppv_jrϏ^68Z%Y7x TCu>ǠS<=>:i@n`NVH-N/Sf8>;DB?',L/I'9.t4;ɃU+6׌eeee9F+du [ &7Z+D6d` ! 
ݏc'<8q 95)M`s7̥f#)٘ s7_ZkgtHT'D}JG1F@tʩ@zK<اю`dDBTd n< nu`>`]bʍzXR`9K,@kʥ/к29.2U$鵷rQ͠53CܦeQP(pIcT*-N;ցXnOL nuh@A3eԻ{`-BmXtDd: O)%rRIs:=r'm(."w]WZ r'.(cǩc^Y !SN2%Mx^r UZT:l ײ/dķݟ Xi9cgXim0G *d S)L2 Ì\ 7s%.B^&4BsCK^" gn~/ V2thj]&7L1o!vٔY=?> `x'h L0%fl@Sн2:dЭ{Ҝ.fs*p5 pR\vNOQ*AK!Ğ/niIHY}e3B[>U/ oI`FVVŃ2DpKH6#A%8T F7 1JSqTklBRϸdDOEi"M7  g,EiR1[4> Ԓ|zݦ Yb?k[ETa >81~C6"tԘf-t*8m`8),y#,Rva }#Kw[眐N1+11F*"@ B(R53L8}83{ l`8e5 E2NuȘ.3@@;n&]J|o7UJAjݪ'B-&TE;W"-!+Tc/㒯ZWX¯&xM˛KR1Jk?]r+t#]OwTDtY}&- [ȅ`*43I Vme,jVJt֤{W2J CB".]ǔ5`B"u,wVXuԸЌ+My"wE8/k\0%(%TqQXZ_V-g*E^*x>UxB N_utRxi]frĬrNKU9S Ht7Ԥ/Y~_NxزK˾\`NB%iLRGW}T>\e݉.5+YruŨ"hY2cekϲ@UjR.t-F (Nj𜧢O;Hh@^qb" OB.H)K;/u"a Z^Wb,[XL FG\EwEd+)/YB-Bs$10!9I WTj3*3+wM$UdϣؿŘĘKME7olo9]T>:u;z:m+_m/ /<9I_IzsnRbje,'K"@Jc$m( vI#e, 8]&Ļ_]?~z]W=?kr(Įtm瓇4ݿ5$0ͮC*w x^^M,l~;;w &Wɼ?OׯzXV"y>yf;J\ђMC$^"X CZ yvɇqaʼ65_t/CB>@8i^ YPS%H>%הjS"&/-@iBdRVmZR7$RB}H"A0"zvh"zӛ>W ޚ]BoˏW+Ei#4%6ݔ 29M\"  UMғbƈH.e,)H!n~(H}vR(]\E}2-JiR`HC%*)G\k- ;EsV?˯e-I 9m;KIg Gwl(zfmaVfTˆBH-&rc WE&rF"jInj|vD S ʂܔ%D J@1?H:!0I S {H0UєFqBvHa(:R\SBu!ڐD%K^{#3ji{9 q>fiv)е0ӫIdՆh 4rED.BU&JDr&΂ )#Y?U*'+\ (aM-7zv錷_{o(sΧVSP)իj=U/eAm$:DY ڀ=;}xdwLɡS2Js_C Ґ]!0hFY{J8,>0cЍI{vz.D C.'&)Ad J s1`f)IBqҡba ϸ}wVI=Qe]ANau]S,WhN.lٛO ͑l:\ޏHNMK쫑;Q+x I[ǭǼW otR- t#Ɨ1oF$IF=CRi\}H85?MtA6I0NQ)BcJe@ 5Yڋq0Y$ $Y0ݜIm$*e$ɃjdJ9} 19zۓ;-1&8[QZC(M-QVtۉIGN]鵷?.F7$oNY*!($%:45qԭ ׺HMDHM A7ޑ+l ڣqh6(|p3&I4I LG֞2]'zaҠI' ( Mr)hntpIZR0 (w;!a'fL*2)vN6$#b/L* .GI(}h8J B]cR0eu}M:'fEbC2Er97'~xL&-~Nh?=ռljrAtC$49Rf j6x~oRlh7_1K1/xAE[d.s8q;Ir…ʹO宥j!&arY>L.,ǕRvꡮ kˀPJyƃv5 6\ Qvw?0̮no9p}J3;F84-4i7W\jkLczV ޷|[?__|v)B]m ןbagM'($# Ѧ 7WWgyy>7fO@4 KoCBx <_R'{ž)N:t'_|!镘a+YM!AEAC]SO.UbpHW!{#l1's: &BGcP\L`w2+1IYy[d^M~jo5MP03]yMYzyA[.mSʈ*} (QʈTsi|α5ʦmE:ao|m)4m]6YܠDR.h=1hlx̟Kq|f6mTƈo:V1@hw7|gķwK[3AX3G8s:^o7€^ĻSwN9ޝ.ǻ')g6H"障@(>hVUKIUMzzW[ ׵{=6ݦ@ϱ-f}|J-Vz5=- g =%3[ |oI ZX [2=dj `W4= ؊`Fe6j$ц7@+A?Tro)FZd^Eg)ʹMNyw;xkةG^d%YZK}o8ɶ}62~->"FۑC,%Gw)ml$΢8^J_~ׁ預hA6YgO@*EYK@iT",[_9<.>wu ŗOrYfU). 
ڠ̡Aq 'vh M ".E%4$)d:¤ڠB].ܦ_޺"*WQQIЄ؇pݞL(^%Wb2A(!+qb B۴>lބ~il<.ByjgMιbUL!t}0MLIA\0kQMq Ek8LC14AT)@:#R*@2HҢOclM9Ow+&8-CjD׆rpK2 y=lT{*dT1` Jn"ߤBg,̟ ̃ԫQ,w'HAgJ4J$j1b4bBC5PVHX5\)d!2-W0j%X`߇2J GvDy4q^#gqN1;1[sl(L6ggo/>QS#-%?^FJ$b_F!-!PP&dV-PH9QցR҄gz9_!}YƉ;/,ҳiz俟!Es!)MSUMݺŚ!A!HWTZ֥? ӆVph$`S?OpdžKCT ᠯHR^cyu6 4kbVPtH &Gb."@AŢ2pyYl:ڂLFa$1V|¶XF$@`X&AlbR"^%O\ӄZn"2dX ! i,44R89Eg:# t | ۹uTĊ05iLBd&51SN6]*!|'q֑%F ?DmMBȫp#oMs ·NK%qp[!" 希FreLTP5k0h1SVЀ# ,1" xcOjY5Su; 7u?|22AyX.1)b bgBku9e cGoYN)|.,ׅnwg[ɹԲJ'd pe&TIQ{cSc!nLHN:YV$ D-"9(],F*dDDH/@evwMI,!tP)YR:(U 41fZbVJ Wƪ&hX؄S#! ZUCdIQКTl ]o-eۥT9ݪ|wVUlƱtCpJev( 3npFQ[I-*n% T" d Ud2 *qƶ`M!q5dRR9#J= vhɰ\qW-`1T(IzUJ n(.HxIQY _]S5@o;RGȖ T|e^mA)tN%GJ8 WM@P;d2SC *#!j$h<.-GA 5GXqǞˤ«2 {3 É7VaJ҂1J`%N,z*JgNBmZ͎J(wͲYt0B(-w؁$_!wc!(fr[X m%~|[$~TT= ~`Ȗ$A]ڪ`Mx3`KV82=%ÿ{wW~}8?N;RFCyݰ*A ^=?sESv,,4|)<=b:${?DOk))12F{;h=/Gii$}>xu?Onon(Oo;n0 dxt|P>u ߐL?4ɷwu@ kx)ÚMwU#%cM xB$,rΤPL\{߱:UKbtpbə.XS,7^.<=.vص}h0HpQWh0N*яqB O4B9 cR^*8b]BuJIZc2uWiDW.njjـ<:S1- y"f2a mw* t8`X`FKl>-N-Gyvw*N==NFBɷS6XgrXaߑ._k@  6 w㯏BZPr4qqn~@ƒ{0[tq=0z<|\?_|x Tda,ߕ-%˂6W8RvCx8,3!jgjE5 a~5+1~Oimڻ90Y#YT!eYAY$rJ7^~/$CZ6贈8S@D*`QX ʅD{[<3 3QF4+mNk]M t[fiv}YEDGqʾ4umN:+7c{`92Zw j4+#PCb"}57]Yk@ƂNd0D: ăotqn:}w5C4}CIg;JR)-F؊ ,)-\oC8-2R<6H~{l;hz)`&]s ')~lf'VY8Co^mr(Ӄ{X\?B ۾On>gR-Hn( :d~k}rȲ[wp+M}~N8襳{4jk;l1ءx%1veYs\G ǎ\Tŕ^;g,xkj^ )|ݴF8t8  eJš#1fmsY +s"愡Mc~L3.Q*W:V._/j gUL'*,'g!E nm,jU m? ݋R8zS/ 4w/vc^{r uF_*ҩg.Ѕ_;<·hĥh~.;JΙ͓gͥ)I47\vlda'4Ti4ME0.uF+641,)5hfk iNum2]!hLE;!& nw/d-d5_Fح3lRyoe 'TP~+ t8kDvh0\4x4Ϟ9ڪ&%%3|଻vo*5&v;JDkIHNeSZL)l1Qbb@/Z ZJ%S.GN5 ?] &B$ŁSp$ JQx(dAyCe9;^{ZzIY/.`=qG^[sEv&(oH5!Km!rO@~[ZG$oHp-'{ )2L5V SAlۨ_QFEQd"LJN mL0*l~4iKJ#,K܏g ѩH9kT:؆ b1 (dIg`n@^!z< fs2$>{X!b { i5gs! 
(5 p~.5]o(HIn7758UB {EjjGr3 ' el(i2 h8-hD*nDD$Զl55x \ [įAÝ&U60w^L[Ի(| :tGGӿ< $_Hs ds8fuv|}opbۗM :T#]Jڹ/_T(16buw٧ lTD’@0s(\ĊM|JNYkZcL~qǯ4A׏/3ŋ#%ϸTtټ`kTcm|P;Vq8 f#O&W_YpS4TjQӚisKe''`V(-a2$iEl=R;TS^&cs e%o B_*C=CܖZ-LC~ObS~\)^9#"z"%[}`Fr}qyFh[ {+6翗%_ ?݋{kR]S7Ptpp oD>7|zsA{9>){9K~pgLx &{8_<f0kY;}Q4~ ޼¶>+kyՐêAq}8VZҖd .MT kњ<"ŨKbMט6P)aTcSTvTOT!;u:c:@ǚՍ@4g[Yo*hzX& \\EJ3886adʝ $8!(KK!g\*waTUcfJb['g ^o::JZ81t T{O{)/ȇ0g KξU c/5(K/(eQlZ Θ]vQ>R 9iE*yƥDSf78rk5s+zfה_Y붱:qUk8ns-\ *:iNOֆG%;X?pZ6/j_片H~>=ǽ<;\X@{~l_`# E+,*3_gQJ=DKΫNq3M(?7pjƅԲ>ZO ?iM+ڹZƲ.'ZI/W[9TTLYdHP|I7gcW4V!>$ 3ruzz`i]⟍$;g,ua 8_1xr9?Ū諸Yy@YoR#SlU'%lY/^l.}Q,XV7̉'\C,QKY<'3vW(^Pa͒bvtfMrbwC%PUӵר]XTO賘lw3.)S5F5Q|3upNdkEt}cekP9𔐶]R)irkÝiQqHs@hIq`"]x0}곜i}oW?:Lz_Jnz z7Ի{Ol2 Mkq gJ\5{V;@_zb* S$wAT='P.l/| 1*hwhA7:ph_v].vmur?vB8 ^F /Ep-0gz8_!'/7~A]$/gAmle%J)9"T )Rlr. G_UWWUWW6rS3abn(SB[ aH ҲV.bWHsFذ6F[ }BXOr`2x~< q )[煨0ǐԁzNJg!Y+c `*rJ+',w |6^6LH`o G& ^6.KQy)h G(nRx|>w"}rl6)IQ{ӈ3ouޅAGd"=K=ƝWI;HKodbOyGhM{}\^dxSH.|AG>q"7{r6qA eGru~tLxg`ָЬu(ɱ\Reנʵ|Dӧ,GV&ǝq!ҧLEh eN#) @/^2ö^F-u|,@3VXTQe J\nRz/$}+F4B.mODo}29 7 lq, k9 z\e5O{!udAA%>}];+Z@9mRut{30{Wx5n j!U톊~$(>dEWf_6Mdej贷-@l\dZj4ɆΤTŸ2~F@ӛs?YΧMMz͓*0y%ͻL MQܯ_\xS9Ⱥ6X%hC2=-r!(@hWY /bjn T]+] ]e (m 5Nþ`2I6gj O1ԉǁ͔&xyuƴh p!t,YII9OM d + W9`PCVMլ'~S(gh֯u(ߣ60[sXfe&ݮq* 4C3_Ȓ9%J'o]Äf_a@KsP׈û,}5֡!T6CVC˒:7mҚ1NE)aa$O ǷЪC4mV1n{l@eo/P_ FRYKb! 
r9# /ތ?~\#%ގ[71L^EKLx|a1A A[.!#tāIIDp4\gqr~#+q ڮFWD}M8kV,"m4FUS Eݗ0rZgM09ŞavM\P2e5[D4L2~/͐JܵnEIpxsrVC7IZנݝ\Hqn/n'9oETWX gq{!~myVgl<=˳T7j\s6C:4I Rvh2x^pQ"HE}WIr޽&iJ&>eymŌCG4=a1 M,`BUv[ ʚ Rk¾CFp(oE`x@Ϩ37zAMB_}8wk봌,Wet@ d_^'Œ{~փkur6/Zx6T[G&!þC_ x8d.!;.ޝ߾kSf2H8^>|}G ۅv;g1Sjoףܴ)'7|  e+=\OH#{ }tI+dm Փ:"`7@[=Ce=̎8"dEI.?3!K ɥ|.BhڋB&P$ΤZMu|zyzȤMŀ59h5Z6D⿖I4R>`2%o Qk#y< H=KM4k+U$DK+rej~ .84T0WχoȟoonP=r?>uNo' ɟޢ 8 _1O2y=򡉬Eڿ|:h1"LP%a$, DZC8& ᥲiiB)Mt郆@BΖՒ0&9LPPs 4>9Q)"9a VtU1FpQzkR3WJrT 1[?ELd8Tb$j@ĸ໋`r9gQb,KTrE 9$B:,G ld<#7P_7+yĭʃ ܩڲ ^#}|gT7IǓOx7oz㬓w7!(I$EU&ç{+T77uaY8hE"mE- Jz\Vo4'v#8( D_@ꋥF @f 06ZD`,ZcS#%7.Puz.4hxZdk}E"E2d6U9f(lY;B#p#nO5Sڹ㸎R^qI,J&xbRx"E ޕ\pMW4"ca6UlG E+|UoIUAU?X,Ȟ лuW:ң4WH8pІZI+ 7CĠaIẓ&6iQ)A2ep!DXH\b&PoiBeoGzDpVF);,=Ym ֐Kϴ0X`E ) VT`9nf:+J#XQ`}c é)psւ8 PǭhAAʮQQ^S4h# 3ݑR}\@ì)(Ҋ[/Y9mZqA%=ht^)T,qy0HSJ`#b7 %pJOnO!CZ.k'?NAwidmj'T w/W-ФIeb:,FW΢G,z΢G,zT=h qа {h-C%_ܖMj~nx@SuYtR\\&]д &w~9 ?ǰ!5XM4Bu9Ѹ Xo@ <5*=^T ;~׹Y ߃~՗@DT~ƛ3l5[ ={28tq| T^u8J鐣"?}Ѽ{f=9{%rtz- gĤlj6$[HL7Nr0>be eNO\ vT I~Ww^rNO{Z~!5JCa)'{]˙M(82[i+xhjJrлb?tYɴDToV\'g3ݷ4Ma\`T h紣dTy&+nmC.p/>Y_S~+5WeTϠFR {a [QϹGo [ϕ ꬘Zpu q^GGW*B[R?+3~(.VoouB4: o GfOƐT`l_&1!(,fO!b! ¥Fct0Yt3JmxbV`bK k~|eŁ5c CО>>5YتP/9"{ {h;'Sw JU|zZ YbENjj:(|3`c?J2BS5'~?1դW_ŎX3J镛 thAs\xհk{7Ӕ&?=Y-~(6++4j\u Y*Y UQH q@-4=C}gLˁQLɬ"˴eɘEKHP"3NiaKRZP(`JB岭 `R]eAttW!oIAȄv<ղ8U9JȒ .PFJa@(UĸH2:ĒqQBMVFZsk"D iYfT%E@(˧N9I8#ւGK8)!HKF}*`ZE{ꀥ QTUX⑊Bx`(b(&)"EY")74pZ1OЌsb-E)UfP2R4z^ǃFYR f( x/ V| >Qy jBWIxDa^M=#aP+/eǶħweǶ]Jf`:Fp|8'rI5VqMPK\T1%~Jht%'P-B:~"\9r{uvXƽ 9Qh0;ԍ#$Rm%;]*S =RR =9zлr`aRC8d87|;y;_Mi;Pw'^k1G3w{=Jhzc=t|5Ҽzn€x?>-gZ-w )d/ǽ~;dɂ~`o/ 㼙Lۋg3^!ho.51Qjv2ɴ:u&0oi9\cxJZ@tPveXlK mesV@_Ο|Zۙk9w}Y~ɋj_\N% ;k5DE_/1g uIf;N\PO# Cs%|/7XxW8\4a+"|^MW :dC51xAX(2jn xZ'ng.e~IV@q˽hOZc){;ROh4EN:T`g !>l~?z-=xdLb-D00>xIDG(c]X~@e3N)xn\բNR_v" QBT&jY$B1bU:h-H:̘ȣNdDMUZcijkN2 3x2u=Xfdil.Z^ˬ_lsԗlz;SMA`jlQ htJʍPRhwz=:e_ǥ`T _Gd q髖?o{H nKYez>}`&b>Y'j5A,#*h&D:ʮ_?.s=MlldTT5w'ݨ 0!Q!SSGnX/̲zf3Ff F=W%d{ޣovɤ͜h}B4/Q? 
Жw MwI1tDɾp{Y|<ÑX^ 9*_Q 98f7h@=D{ Fk`dǴ_rFMf|W=)MGz8k Xd{Oѥ\|ZqZ+Z=lضO)<=oXsF3J HZG(@F;Z`AA+E{XueXC;un)UoxsC]j'ne?]Qfv^j )||݀z히jQ/? #{;eQ϶˭>Q,GX>S]WꏒO`SW"N6$`.r^o뗒nDrw߇0?rmdϾۋϟ/i?'He0jierȴ{..M;ž݇7),%Y]TNُW- \}nsޟk37*%77hcYuA3AE /nrulsɮ{.utrLgwK\@7=O! e"6tٚw⮽/gE_w6~m<0tov|pvBٳU!&Tڏo.MAQfڋ|Hea1ş- ^11V },A޽t=Rgˈw;wvLpDYW׽#d+:]Nzu8pg/b3#؂^/7@N ^N|u_N|N]q;|r↌}]NKM;`R{gaG&agxhѮ(`2O7>4)s;0fy12Nn-tYlUȮ8k%4JL&DS #\RzJ ֙Ipz;zߙLc.bZ}eCeZ}oOE&*k=P!d[$8"IT+KPV9h Z}.TU(kJ9\vQ$O3#)@\ U]g;o|6|y>rՅ+ PDĐ] 59FdU)`DF,*bFU]}jֳie8LA),ȴlv@-{v;N=oS[ީ˛͗99HLQTϷV6! hEc ȭmh3R6E6ؗǷj`=WZ]?Vi9 erR}/.w,$HUH`}T*F| PsJ5%_pqCr1Ÿ{cq\'pɚɸik:K srau\!9ʔ(KT(`Ɋ1S@%u)91fJѐdݣɇmdDNmZ]lfDq%RLbl\mUu :jMɱ(Cxil3l(nXӔ_((! pɈ9Q\v κcڗjĄW**JX R(") CH'+ELr0QWkX}Rbdgj%xѨC`>`#˘/_yԮmkҮQՈ!ɦ`稊ajeL~m@빷Ao~AKǝ-hE0 =oA 8c:{v|RPڷ9fمlPUjW JTL8N:?3CGNt7)tqܧpqQ HŲ}h+ڝ⤊ '%A+ I P([{Ԅg4d|ބ7nb VЌ aTUذID^{[n"A/hEPOf}cs={#햬O˄ɆݚW!ZZNWT'bE}ko7=W#\uٯ9P't.IFwՀ^m&D+nνj45YCA1 Ju&$1(ԍUk5հb>4Ty ;GQj)m}3Se?ZnK: pf6ά۵i4~,|61˹JmC Z;Np`=cg6=F7Zӑ^o˙`y5yQ2 =o F5k')'߳OWٻ6ndWXzSg(pUvmԩ֛KR`l%'~px3CҜ]'d h| %d=$B7֕[D%z5R%!cSkaswH\_ͱ'f\ > *pj1b Yn+wG *T}-a!wލEb3JR "#FXdrDbYLEcL *ۈ:CeHuQ0؞>7Pê8jF6wP4fA н5al݁Vܝo(Z5drth=Wn] oİȳZI Tg&XPy=v) vMUޣz}ݫEW7+YB>zzu{|)>[wʾ$h:5Jaav"ˆ5y/|߼);zO?hҲT<}~+U.+|pR7ڎ[װc:No+ϪH`{3^gx2?zDtJ&++he%C!G"]6§M˚{6ȑؙ1!ci|(EŸƟ˼Rm͊łd%WҰ|l3;]Zd(ɬI3Gh U Vn? eSڥqOpw 3"_uѱu75~\-~q4wѹ)lCűdN}hNxǯS#ܐR۔n`< ݱ'w#1C2ÿinإA)P8Iߧ~j/ۻ2 "\%V`r4fg hcp;ꏭ]g uYwu$r蠐O EϞ@-?>>O?GA?ta4%Q=&~g.]RhS='/[D߲-}Xcf#ʈoV}{.k vZB=v+p+"[}q) m seKؗ/<|g?&/Q٬7e<<{]j zn0SB5hhwí)Rᔶ}У T|hsf¹rM =#o/TaQu}P j_. 
Q؃@>Xb1m +?twkݨ jTw .[j?DZxg 98O5ӧvo_v<ݳMg 9➯fx}oףQ]d =>F?'8:A=Jd;|}LZ.Y&R`;gbIS(us'*kgtՓ)Wb5dvrFnh0z p7}jG?Sn0p,8YAu0C ֖BRqO";[ԃ65j;A- w/eMR),_K^=mVtVbXg)  @*!1q&D[F)-\.L#)N @8?fW >#̮mfR=wV_oQ)eT\*ծꇧ)/A>~z g r6|:Gv:W yZ~۶v oݚ+ъLyfjȥtV͡x ;ႺgUf'qU۾lviڝ??8ud>_~Ƿ|[$x>~Ě5[z VU{yի9%D$df sXgD ו*[ p!,Q)LQc6ϧ-*(1V7_` ڭp{.?\+6IFyh=xZs,sM C-C$RBKLse0TkkCPZAS?d(n%yj2 U~f] ĺa+fo4t* 2A?ſ >;MqI1`6!y\{W!Bk6;!ngjP]߱KpV[^TPU2&# ꜪiOӮ#W\ܭ Ǒ=SFʲ]Z=-Fc$M)8L)U +{Br͞H[j^4X-}=XLu f*QH0$T+ m<؇U)_goZmvYSd8/zqtƀWcšs\<Q3(XrN&m*+4Rl<9o0Wo reokpEUٲ7tz|r~kWe%4^{+rwt%7Whi'fH4\=NnPdgtO$L2Q4D+Ov+RLCH5{[zz*ϮI"Z'X] bmcS2\,2\DjʼnpmsT  e` %?L>h<~o׫A"i Jnp~w9ſW] $tD;'Z&t'-,gGͅ\(CUA? 0jVNәs,c1B&z QLbLtr]p=Pr|FW-`w`}¿1/n٠v"֣u}DbO%3O؃<ֈ;vaÓ3*ِ6t v @4+&ٯF4%Ѷ{jlʝ(Ėssnm¾h!u띐+Wq 5fB˃jU\ 34_.#>[ĝ6 T:*^jBi-e2;TIYJ,o'7:"!$.\#}o]!}w8;_|t!!K'qyrNU*1$p(JtpW#ŏ offG1ϯvOy3/4UL{wlǏS+0xU(@TU-R Axx?$! g5ؼ*S_:ӫuQ$g4y*_ Ԙ I],vTĽȍ3YˉQ}V\638V6wl!6}JܝaT/*q$\)6CXCƀ1>9NKimm5 kvjoJB$]MtJ o u w sV3A}d"jTH)"f=%w@*yC>U1`4t)CQoŘs\4 ]Şg!ƛ.,M;>/,GhF0;d[@s{4EsBDv?q&8L쨎|<޹Zieeisף21TTrL+dq.qXaL "O8MeF6ɩqI49];V y4tʮu?z)W;F{QW:n۽vC*c~ m1wkqRN}~ɈTMWRߏ"Mgnj6^kqyAkt~b[4߻3 `Z}4 :$L+!&˟p!X@8{OHT1+ZjIF;ze:ʡYo+2(ķhHNX~%.3G.MwM/dޝ;'{weٛS M Yf2%JP5dnQ:mǭ]+yf̾n:ѫYy5u.:;˼vpς'?ZӏOYcNKak,8\H~zeWJ#7ު!-W)xx0~55OXB\5) \f斥XI5Y7b6Gm Q/ =~xއ^I%cl#mk qwb_|F+v};-?B^ʿk!\ s&n(SY3ɥT@u1Oa ꊿ,RK}ՠN1ؓ9u9XY^:˶v O4:yMF 8 -n ܭn>4!AjiRMM)$DLSF:+Z[McE0) g֭A"W}NXYH,ʥ6Q"E\@8RF;樰|JdZ1]q4j_{esKd J^K4A|Jg{+-zQ[@'zSヿPJCRҦ+; w~G88Sm-V{/49A9Sk9ERrG#+F4SF`Vv!s`!\HN %pRAsRNP! 
6&9嗞&WCy}5~Fdwݓݹ{=Y}Ԗ%$OseTH3HU QRb\:x(}u L)pNϝ|s0@XLX4Y{a`VDIFM!G6+ȷ/S;m[tΥNtv/$eWR&;/@JB$Hr4ͤ89yPޤ*%\N~vpw7>Lg0\x~xZF TB'v1*x<܍4Jn&C.5 .`7=K:E.afͫ"圱WV5^}‚4 +WB `>8G8jd9v8Rm U#rfy굿dɯyyݯG٣DFs9OH0$G>?He\ 3}Kn 33, wet>g/c#1B22;,yo ФCyc"1dag?94x?jJ7 KE݃ZҔN~T/PsoS/-RI.%7Qg7׷Hkv8c|+bavkf KAΕ0`[V_Ū¬TNm8[nӳvj@(m(w/eao$vT4^vуS:%Vhހn֡5{ BpGzpp7Hp2"m`X(2+$1&H|<(bZn`-\&64!iN"  <)@"x:R9Xl޽;1Or-H7aSS 56؛78j}#p[(YEؤH22$2TQ-tP95Ų^ѱ|}S+AM_g.\0ëAsEB% %n_abukOI=!|z,Qo-xۭ&c'H{x6*|BX!?l.17l#fyd `b#+.gu<#v/'/,}ŴyciAbPb:!b &_u,WAGvj2/}؜-vytb,>"vGwiy! ulF(B8ht(.[ .GQԩVP| ,ncR2}n:~"A iNˡ&*5uOv13vKJR Ĥ ԏؽ̋ңw4<n dXVAFcGԎ~8bGRž}%ߠ@#4"sѓod L!#uw'ӐĤ4i4>k3EUa~Auu A" 30d;wc5~6^<1!v}bԶ <s_+h* _U5e+s_F\ x^R]17PMI%RKWGKơť fkmGP!+oّ7;(X3H5KlIxϴsO9(^ )aˏ>G]xG%Q]h.5)9Pj|U6b7P0 ,j#1@bQClJ#B5R21ޫ6n@#`$4 2ȐsT!23sI  8(wτ A,u`25L)vsjEp(V(-6Fq 1҂8<]ͰH(eƕ $KgZ 3i/ϔFX`q6qi&7(Ӵ+~ۆNgGSׂ%y9]v+;ycˀ 1z(f 1PkDE)"RtZ4jCrB6KSkFO#us:/y akF H `ݓt+fܽ T\Uɚ2Ɓqׂ~+wQq?޾Y~Qj?}q]uGHybZ}^]Oa2S΍Z,_M߹[̵<7oVΙ‚쓅KNYbY-ӏ??H$$}[ `A – SdIc"s4DdIhFT,1Nz*}1v0q;d3J܎QT&c7;8&ހY6;/n>u1O?"p|A0EzpeFG|M4J*9J}.mHÈlXiIݖ=} q~lLesa}5,hJ =tK@5Z :xhۂ679an r[H&Q-c1%4yWc3Op[VF-hءjx[PB(p*l4VaaJAǭBQGڂ-i[3zxp=;HFWH8:"$z Mo&փ1|l9W Է`4ւgz i<^0wh=2wI<j hzpOp7(D|Zテ/{*< 3hWp1!^!As{4x8>ggs!wH*d} eMd:_nI:_$GK^8AI?Cu\5 7IF-]p "|YRq!I5W\ qyދ.F8s,F4κfNn&֫GK%ģ6DGx)$uˣHpDHJ$5#Q^O2R+U"s+/@39E9uB 9)p;q#ݺzfHa=.t!n,5s}wTV7!88|{PE!*7uU` a/ܷnԋʯ&}ôqN?]hPjWKuZu_> A5,~;Wj>UNƒKEđe5+-$&reXL?̼38ZXiF\y k޷eu)f%l0H;Gܣ^PHYbH% L1qXF$~_q>nh"K zG i9Z䄸) Sv@g%Q*ɻjdN,GvEW.4Pe($Dtu WBMk -^w0N!dɏ#_.W5߯i(SwYInu< s -խ#OI/4G A[ZSDcKfl,+1jykCly}gܸ,IUtc*=}HSi/T&MoZN։!7~*9Q&ڂM[Zo9UJ-$E Wࢅ<܍hTomN4WH߭ژĐtwJ{y u:]96r`0hqbk_p g{gyWP|j~cY/^XYW!D5aиw+C({ۭգC/ <*=o*+v;ЇUH=yu]C~.'O'V TT@hN2tDEMQjɑ |:ph /EB^G FꎇH(OB>edǯfRG;NM1@]7{amt3p< 1bB12W,:`$Wjz/(LPra Z-Cc܃[gw2X '-!Sۜs2< >笖A/}yeA]01牦vN3 :/ +v\#i}2dX"f2@Ñ̰{Ai%x3B$3j%LXe0fp4Q-]kPh`% vo\C&Ȑ}fuTT [tʴ0LÔIZ{LJ7>?mv!BIҌĩ<4]D(P,/b1 ǢgvDtJ8njJRssuiu.'H"xg}F<|? 
OYbDJBJ4ԤH&X'좴yEDLh O[MHp&@' A\;9v:`qꬋV;JiTVPܠfYg|M ?(~| @I0[ZBD*{go+v+ 1 'Ղ@SYC 2 Aiʣi畧4 Œ Lm(ҤWob|Ej dJ6%DGgpK*$ J,{ejC3zj mkDs.DX-  g2U x.*cxzaWS<w'*~#}R%$ &ƩFPtL5j.4as.#8ں).i6cXh8wlP3ʬERT@Y1iM4_c!L I" q!yнC'i!=!D44;Hq lc<=817U/r* L?w ?!]9J4jf\2f"/ˀ7Cv8o̪/w [mƔ(Oi}aSSDu&w &Pb)?)I)T1.RA;^ԍk%Bc) M s RHh 9 'FQI /##1 )G. uDZn͡Xz-n no7'8‚ٗO\n[s Q?=~=BDP |z#o/ixH'9K^߽zRVW˝wDf.Rr(JɵO]j9FoFWv6Zc(s%$_ i[}YJҀᨬFK~Wmmx]C Tc)FAڹRŴќ@ ~ o9 NLZ9 ƈć~iH0h bNcc֠ӀР Ƙ- -s#r7Ԡ& 8*qUl9W- ~cU<6Q^r42N.n _|2'zOSZ~F]09\$*"0J"^Vl% 5>֌sf!$ +q%FҼz=6?:'5Tד:n|nýJҲ6JwW}T{Sh.DzW!Rf{-7L}K?,`@MPT%妶 +w$WikC/WolL3b+.TU缕:NޠmZ''nKX#!=z'V?ـi5Askgټ2Ҭ{"乺y=GȂzֿi ɸ }:>:iGZO&;G-q5|vEulVeQ+4ti(9Eh.:thE!P|*9 g?1]#-P$8{uX g@iN"=+#y43C2?r;2Zۼhxvu?+gfߏ 9 9 9 93 n+|t$pIK@$i׉ܹle6}No]>JVͨBF??$z^?<\ Yb"z9]֋ˊ Ew< DiTpJR\P3V\FEcݍ/ͭ^/`SP! ލ#(廐4b]qkklzCW4Lj^w~1U^%z껯`e2eCm=닿9i߻;6H%JgBC3)gd`u:)ޟE:Ѝ13J>2QnvauyOizh ~nDH!w5L/;۬nCD q@hQDw"AWT.=0~@Ժ-Ӄp35o@Qo߻{ЛAP4A )%)̘Q.k겦.)?+hmXy[nIͶיǏG.tX[fJOVXscȯj"@]ޤ}k緕MQ̝gNow߾ܴVZ6NZFji1[v5<$Vֺ6”ڦ1MM}0 n+iM#LiVdUސ?G?WC>}r3[ۜ*,݆1B>\L4+o~:o G{ d1=\mz[.؄`>X#0iIt~n̘sLӏ88883c~;.e9m鵏8L8nzQKhe6"1Q;O]Ut/Tt =oPu5,٩DbS!ܴ@ `uo( pƶy*K-CA. @F*RT;-k߅_wi0"[*0U8Yye\Yye.5V;KV}-֟Dv"<D3^}*Q;θ9+Zg@8HY )d\ )L4y;"BٵRx3Bg2 چ!9u>Z* 8(H"N '3f-QF`pwH% l$R炃DݿGr9gLQfHA4z mTRG8nE ) @^Yu<$)8"!nH5)}[mWTR-X2ѦmIjǢxA\[2WVZW'-U] ] Eo LT1ɹ1ɹz@ĀYe GKJ&1G 2xfyyART@ۖ hrt^DIT1k!x>J Fr͔+0v): V|JyhkPn@JheJC*]+lӨDu;t-N4؇zTzAɠÛҗpN6jvD`gU[ +'lV0Sck}39"x^g׻ 鵳gؾ:e07(;8{g/dv6_DhR [!OSID?P-8.w xWkqvF@1P8|C^Xsjq×:&D`y;*"K A\)]Lar|%?kIXzVR1WquVQsM.TĽtzp{OQ+*STk2 E+F]|{‡OwAU݃3u+8Qgi '\ȓ󦮨1P*KRL+fheyJYNj+^i)e4x5(hu0Hݸ'x=~+v&iL>~"_ \5+&Bӌ<lV,}!AFW1]|r1O_A(,7CKWW8̀b`jkū+YyK;nEi;LC4a{^ܙZU}pW/n^r9SJփ613F(MQ"=eFŽwF I;Ҍ+S[AjΩXwq. 
\DRq:*+eNjL[VMt|:E⡂>^†ۮΜq)ӮmLc$6RyՕ,8U?B<sL 1wZz#V&xHc=ԁ@dVW@m DE)ݥ`zR`qZK,y!f(~/W~qȮr5Vܡ^J[:E^SzgMu( Jp,h)S*zBlA:;LBGTm# <>vrW h̀값P^^3w_Rť́y*f~YxulЮJ뵈6iTibg_ DkmsF ,{Oο{Ƹvٗ|;\#ȵ`#ޗsD׀k^!E>oggQC~(%I麽k%˫qx$,9= [m$+wG¬ƣ&!Oj}ό}փ*i_jVO;Ek]|{ /h9=c+'9}Ԕ%ʉT)y,kcQO_5Y T (DB$M|gGB.$ [{c\n81lyg@U)Wp ="8T zo'8 Xx#zIx7RqbjH6w̪zϑx$L Xi}űn=81wOWRs1E_ZdStzp {Ǡ5M P4hVI{4Χh(jL=eհ濶kjy0-*+W)S|Io%E4M~D%_..ocy)><_|u^Oi޲Rz ']WJ <@9” PiɻUA+ 8E* teZH#0E jhCd:z$) ]PiUɉ|$'ڠ^MQYn%'\&4a3DSx̎KQ.2+9#xrɶ$1I *Y4dթ3h{vH'k`s}*g{+9w[7R &P)>x FsJ~i9FHyk 9MJm3޼i2GgVZ4EZ7k㜤 ǫY2vkNjڛ(3GT^`/&Uٕ]\X<ѓ~29 =M$ϯZ$JIA݇ 1.VӼ{eڳʝ>/ek=n$mͲX.[\nFKjyHDNRjmAQ//qBmAǝBQK5#̚\p׋`BuyO ·Yd9!Hʢx1:m\"%;dQ.6/ϸiR~BԻeThy Ǹ~KKnO\O&AlY#Y kh0v[ojSU1˩]=x v>JrD)*E'D"d8*:!4G-bbT遅Xk_"c|8RlzɳgS&ʯN)IoƵ` d?ۿh>)} &I-uIjͬ}CfM zIQg8|Ouziž}YҙkK;Glp7L'jɵuRS*JEbpI-yI5T<`UȶA ('6lPY< C,^%Z#TgQg*r^U2A@ R&S>0R!HA9k EQZs^) \\ŇUj*k3῝|\X'8E0{>R76_\L_xQ^ۻ^A n@D19O]>r;~⽜Syw2o׼a[ϙ>dL6J vO//M,1^ȵ5ӈ1?ت]"ۢG,|d%w۷,dF_n┐uQ*`f/Ը/QY c̐ >|B=B.6O C6Ce-&gzH_Q]OOc'Xd7$y`W6%$-xjRZfW݋YؖO_թSzK$Tǚp WkU'ml| fsptU M3tP`6i}NB:z`Ul0ph04O.29˘PqyJ1My$4U ,"hTĔrq.ͳxȘhuv‰҇k&K&5sL7G+kA ippSCjAKA es @$1g:Xl?C(c*a7K)RvﱆЁܗگM]_^( F&Yd[d,R+"TQFj7y"Hx"ȄP~R"8Fky 1758T=*pnV 7{Ύ;4h+SGp|h/**nHTqj7ぇ*V ᔕ,Zꛥ5s U J4MfMknwf0@$Bv6Z*ɱyM5,fVT7_iw364\ex6u/ﯞK}goWydo$ϖK{[$<.Ym4[%y=fw_/@5qBo4})s1[m~^Vޘe2J cǷ/^e,n|Mx6%֚dLճ]JKTn/Vȗh5YOM;nM1pY:MQGuhJڭj7+dK5RkX§zԐRce[ڃKwq7Qռ|.kj i80Z GELO4Mr285_]]x 2Zօ&!"IIH=g|cz=v=أGNſk5,A-s %Rhk7&U^=7Ŵ8|Ge@IaΦp H(amMf=F_n|}x瘫!esA, e\;O?"c'\H[c:Rړ2UCb@x}R$1!2ˑZeG vyAYXFNzG=RD-5<g!q"'4luE']NȽ){ %ч՚,U_oNy{=i ѡLQ3k`.Z5lU; ;Uw"ܓ N%K dFփdVixyȍ<+NN;,/%~5[ps=(B>SJimFHqtRW_rѳNN]?GOgt2e5ȕx3 xOWp3+fk sUmt '%2e\SF)k=.g?Y0zH,h*{r?`Tn-.珷'UvXAMo9XM2TnTBswz{2΂a-v@96 I;n'uUF.|SM#A^V$:42Ծ]YT7g\zL0ruBdG~\0tvE%4/\ +Ĩ `J=%w2#j' nIdua<N4hd(wp`?k$t,f9#a&#!*ai"wtXqyAzHc$CQ|P(PfDʱ8`=)@3 eqE ,bY* Jin,Sbm6Mac临.6Fmn]@<3d}TR "N p*oEݴ w"wh$7О$Jٵkq>LaɈ=łiL|f*Jʔ(ӟ?h,F*1k6(R6OrL-eℋ<Hi8ע:+m1bldgwE;H:vF.S8L)WPo# ILn3ic hÃĭϗyozyi]aZڶ;rriV|FnW=4R#Ve  ¬y.nD&NўD8&Ӝ`.WfOwPSqqyAr = :ԉwo`n 
'*YrcY[N(=m}s]O71 Jw1oq[,0(B̳.on>a~cZ"3*&1S34ROG7Of>o_}Slo\p&LRP"fq 8RYlmk'v͕ y[讄͗g ڣ6"_ \c(bA؞/6qp(QrϴСa/C0s%er'* T#G`SDDQ- wy ^tswrMn!Fqj!c֞"$IC8sOhqy Ta t}H4Ji~~8T%iɉ>Nrx7*䡳C׋>?QfEѽJp1l#VTS =$6y*L E(ӻ,QzX @Oclw v@Eͮ!N224^)XzJBIUF##d._Ǡrnp`Af˸LzlɉP,Wv=$I<6_4o0R(`+2qq}uJނl8=CT|\5]6*rl&{߰g  P  Ek=U~nF@r@e%@T[[T')m4M-aCZ͛f^bw yA ;RrLEB"&#"DrҘ@֒+C(afk ;݉wmfl{e{f/=K4C4MLΊ 2B{^_ wG=Q+ړZ#(Hm!jf9:f X]6H+s) .O^%eHb֮eo?[ÁtB^ڌh5n;egE[z(̧4ȝ rOՁw *gN8Zsr=_"?S) 6EN E w3I?UcJQnWt8 wt3Wр%Set5d JK/X EMNwdgim_p.KWR䛛@}J˵ }{N:C/eUsc*A{MC"w씹bc|WnLZfݏaBR{?8ثRe(}yZ5S*O > U&8md&⥥lʕ =Co85_m8hSPo=j/sz|paq[_Rً4q)Y"St-^dF&]ЁҤ3Տ#J2q!: m.h!EWYs:E1BI}ԝFv7Vw  Rz:ٜgРOL")?#;#C9~pԽ+G/z)nf֐r\t15ZKn㺽RqPKo@ٶ3Kc) nbSzpA")SfDjHPI(E)sw9!U̖4˳5M+ҢqՉ5 uLλ_E]W5לxE7n%Z9׫}7Qռapѧa GXDޏ4oFe@deMNfPP]'%zlIc`';Ѝ{ބ#{/ Έ<Vx;I'q@J86dj` Mv|i)aSO?_eOv߿|;Ҳ{q?_H*M4iD"@P&q'Ocq"?]\c݊nz'LE1$3!(4b*Y.HU&Wq$t1?Oaۑs;9nDU.зȁ$MXLDDd )}@ d9Jee,F#s@D3TsF$YGc%Y%FZ:[ -1gW?>[s{ʾfv.6)ګ#h9ɿY>\ OMDt ĀɈRNŅPYqrF"C QGӔ'IBSP!3Q dqJM vZ⯷͗}` к4F=ze[<JPbF£e,SniLu][o#G+_X)3Y{$eĉ-{-yrY$-YݺdV#d,;9@`T01Ir 58:d)Okc7qL7wq ZsK7vB/f*b~s}W*yU->\i2Mxb~&+rxI5g^:s91"Q}orU2BkIheN aAҢB 9{dHHKP ѻbL0/ˋ[No'r&Mo^4ߦ1GӟӪ:_Ր})c`$"Ҝn<Ś⿫GbT ف)qۯrSGE k$lqwd6GAtrsg6L7AW;5CP,KˇgF0B.m:݈DlGLEUp̟( C }C*,F^dEN5^i%;-)l;BLӕ%:{qٍ&Ӊn9^4 D6~ z>Y45vjSaeʊ6ӟ3+9!g{g0rcKӺaHKG@ b.D>).ur1w|{cf.po Po5"zTyY_G'jO~Na@haB&==_ v. 92(|N$RFsP $CL^( sIG;g\$sRRߤ˫~}o.x ; Zb?®IpnX Ox傎Ze4DQQ uFHeA]pK:LHaCan|{{3?&C,CvOP|T%4I!MG䧽L^Ǘ > ?u_5QK<MTs4{I'P Բ'xFCal0X>L$dǾgtW_~ [9ȫTz0!CTyu֛ דkfʁ UUv炫 *.઺ Ui*Ǵ1eؤP$]9li2r &夼kN' K[WR}RsL6h%q2rf`C﹐6F>.Wׇ^0b7t7y?;}9c4NG__ܾ??</]ғĈh/p:P֡rĀt.&xR&2"@bZm) >~&~WhbLFV2kH`u$v^)HgxJ(e%K*&z;.!DdXOALJ@ZB< \y@^a|Pܰ'Yt(Epk5VBPM+ЊuPv*FD\K48S27$‚Xh!6B%p6 z=j3_iM^ e:GYfĠIBжJ]"وpH =&* &i 9%mM':[ƄTXrZk͡iٱM8qFj*jCW#@YGvnyBWw~#f3"3w^M}e/ߜpQ=,T}~277]|sBjgz 7~p7onczLpf{H-9{ISRLd;UdՔ^[fx._84? 
x43G漃z?v-#7]}Y""da䛫N0diWg.'%+ug@afW/J8apu{ԍpڏ1jȗo碪~u˂^ǫ4*~>_>) \3;E-zjaogi٥nRsGt9ĩiZ#d~p'΢xʈ8Ҋ32+2S{8bq??m?Ol؉jnQ04t0?i"Ƣ؟%)vfQt^hSYtk`6ٷ)]t!;gn +n3WZLQz7MɑMvR*hruq/.;OrYz7uI]MSm]8(/݃XZ||eђCXHtv.=}/LߏKeևFzn+H_Z4M:mܺ6Ę- V)45tGzڂIF=h>-,vݣ]_9w+$+*s*8S_%j~|&zc<<{N#Tx`[Sf#=s2iRCx~~hFNh0A 0F̐^wPci88,0?=pZZ"O._)7&U+"T|6b&{f-c5V @DH=ZELA9]B NqӚ*u1])oGߢtݱU`*[}!d̒1 VO@#!e: [S?@%|o՞WⴖVtg..A$qŲf*UB89"kVkI@BH@Nʎ#Nmfdf7<412mbua fDDup*k- 2czڿ=.nfNsz\W|g X_|@x7 $Wd J) F+aj8y)(2$?ٞkյ) ҧ,Ԫg7տ6KRl#`pv cS#MlQSm㾼hwˋ6zqyV#ë5\d+񡃬@k7AZx8 P|T`Ւsr[[ ϟ 4jsipriUθ9Ea`l! !RCF[R`>m'~?3Ј.og9svҲEnb9(m褴{(bJY鷝Se2D;8/7d;ˠph$A1jI\.ߚ=䐵t> нia0pH2櫞V%Er\ٿmxp[1ٿ9yݼ8C*Qݫث oOsEVI0]{Cj`%@¡5AH%2o'i҃k9x˔F˭ PB1So9{v%uFN*{>@Q $f *ജVD yso Kr\70XJr2| Օ%+b!!e c|Amp |LAo,DTĕ^'<8ܣlyQV5WD|[!lb jY*'@Ro˵vtV26!? `><9˟&4ͮMCiratZtUF23 Ċc2QE4Ye9YGd WumY"JZH8aPj87$ ֓4:ɨni^͕O^7'ELj&E?7'Lz>rQCBDt>ZaiE ctNDrKgRtiwnO{(5 y齴:͊U8>$@f"?]o/M q\RY&!{P)޷7QIob7l&rpqVVr'*ƁhIJHE#ЌN)ZVkt3'#|\{[.\WfWfgȍޓpJ#=ww_\_9κH~P脴=~N;qX!?0J\ȼt1JhOdr'AƜS1G1 Ȣ"ֻqYD" 94ybe ů,q3)sQtOtǯBzZv>j iqH?%[;qoX#We's"d/4OUb].ڂ!eBSdsS]!O||LTvGr+ m`<>Se$Y#@1l)KY#pO@ sG=Đ$lhu/5R&$q'dWr$mtS*;^t@0(f#DG#}%e2*a,2>!3Z&1mlUkc2_]M'+\(Fp=[+uMZ;.:ː|s$/3!`(LmyrkG]q6i`_]1m8~;Z`.l?|d5F XtJbV)#ƮDRr_YÝ^eAp>Dl!AK42Ik(5C!9ٻq%W*2< :1y4tNld.~);,J")jAOHUGVbK\>eIEП:O$_P4}/2~,T6p5/~w44@Ou'JK"!L}~RO]yZ^ŋIϋFu+x:̆6'ԁ PzC$ ۿ^7IPRcGp +WGZsRo>z]L˚CO;?]~.?o2SԾmlPD-fh I%;ǐZ6Yʰ`攡 E,U0"JǤ".:.lΈ/*'}| Mԙ>Bb_T4n^mDCeAPW'-TLԹCaITK"ge[rZg9IcXkDc {4厐V3tfmw#P$ބ964cYGAj] !YWA(aA-?NCNΌYyt;x{ad}+ HnݔS JD1h<=*,6@.KuAzM1B B[2ZA[սW#q]. OI$ D,k$IJ(YL(FE:R/WE<+HR4cluHJL"L H`㢈WW8A r,eNs"SiaP8cTbabKiFZI]yi)E_B%cHz:vz68}<_ȸZwYaoAAck\mrBe]خ&pkXJ{뺔X"He10D(/f5T累 gKGA RU˺9*y:^a~]|yKix%{x~\x]HպzRg(1 ΰF7H nUJ{k1Ж9^?^DEsZK iN''N!na O`)]6; HgS9_:-жk&}$Mnq#<#JQmIfB?;ַ[u!b}Z)oUZ. 
?)e4*Ĕa ԛy7ۃFRNryi1 ۓ@b`O_⍀FȐ#DB "Wq-C:34ի1v85C2͐BrgΞLJqfCFw(r}k,k`dlA%1{[Ups{ uѷ桮%Fv3f r mξK1k)eb*2BJ5Rn|WI=Z"FWH}\ԭ5RŋZˁE~@('1dr~}Eso_?bX*V߈Q*RMΠLfF 1L?hedM޶L"'W&W 䏡&FKNi~%,@MZܽӪ|y b{+}M8G}šP(q`X- #m k4$Sv+I;wNBF̷_|d1=8\ &]afԭeT>^==nSa}pxt:8Xwp<#q VP8-~J|Ą>l)D=9ѩH'N`MƛfpʾGp[vY׶?paT&pcM5\CBNAý|t7o;Uu#|uwn@Jw BnUc7i9;Bfurtd; ;)*t줨(B@Xؕ6Vپ5npj,ݼ |>Fu}ĉJ r *"%(IR "YJ$YSFc35X $,[ob2R.#*LN <0:ҥ.Uȉ/)3=~':x_ P2~,TN]ok?Lw?5U2˄:hb+Oi!K߼AiOXH&Z9,e{9kjߗ_sLRS `=0#J&(u0{K 9(&cF`k2"i6CwfrVFIDS~>}rvst5gr V$^M)*Fy$#rӔ!ABZ$(&(fH^kС |ATUwO-R,(H-z(A I1[u)qav.lm\pӻ4a~_!!?"Z޸2bѫf촸#jQNt ɚ6N2'u#mrҺ5 0 !$E{#ZAe՛yy ( uab49&%aKd?`@E }m[<|74C`Ijjrjkdm'=lrj-S"|iy|ۢ?"C"0 tIUmgL! EW{[w?>W ^ r^73*Ez5'Jt ,ekUVj. u4Y#10Gfo;| Mo~s5.d_=L?bרeqﻳvPk]@˝t)S0F҈ts0;iW:kohvNG=kCEıg:;Uڋ$+57ܯфfbE;Άwd³T,b]#CxTAl̡N*/MvM:{#Pl[L2ʞ: 9&JOڳp.Cxx8}Y/jMU񚂧xv N(y"0$ЫYJxNth T7Ksb6s، Wr}ɀjV7ay^/7;rlOgK's,- 2?2b& 2˛Uk2(`ٞ8ǩ02"0Xq #eP/ȏB%EÝ{Or`Ӡuх4Ar}%k&ѫ;3pm[Gn AԃRjye$5T#/GPRU5dB  7O0o#&MfE7:1("o(˼&K?5klm1\=8NZ2!2 c0T77[^Y$X4}$YQ=dJ|[l}fzpaUجCe4ўK^Ǝ2[8z]M,`^#_(6^IG_z uk0|աv[yo?;{/Eks>^^I 5/~E\(ރ w%>C@12rF ŔX{@kT3(ՀVk'{Ny Awviv~C9n.R4bd HΠ;&zwGo~ oNͭߴ7cZxss o(ONZlvC\XJ?cZkO YH5]+P#B2~ǴgM>&BʡVFK=3#T6 wtVJ'Vѥbu1WѬk  aA>8YFɽv[:x2"ڬMHsC(GM^F!pZ*k/r+!p~1WssqnSAL#J?:h`{X7Z-mɩUtݏ],R%r-6)U$ 'D:ڕrmSR7-yGX GֻɅ)hef?o/.$@c6FogMrfU[Z\]4'XI ECYŇDM$JRTĨ!o~Υy}x[&'&^.fmI6wb1P;X#ad"13WARbs)*kzC*dA@,cxlu`o&M7_7҄ݱrZޝ Z\H:Bk&<%]POv=Tn @E?~OInOVIJfOg:8>! &P[5XS+,YfF+*V6tƈא0&2h~0JvT=Nmvtٷjʤ' e"J.e]" V1BXmh %r/[[xeDOu+yFFM\0}%.'Oo륕eΆ$iIute.E6JZ TR]۹Q#+3W EyeVL#LAʌ.㍏mY߁]S/fx^yGתɓo-íTMg~(bZ.xŧ+E"}pXrݷm ߜ2 3V6aCIL;=gKUwF'+VsD2%BJ y*T+ǂUH FelX]ngX֍P:AJ2@ cף3W{J:NB@JmNltC+VImÍzPJyx1=AB|2/=umn篴wCqLݞkB qݫ^fRu'v%8܅[(8";/A}׶Jd%ѧ֕DnZb""#hΦ"27}{咟]=k5z/MX_W! !uF^<xP;1X[rU)9Yױ\ %'ㆵ+"k!2 dXI|hF )w@4Pل2X8"RPUsXm!'5%SIIί-\GNc guwo:+y`(o@x ᦷ^Ӻ[3#sOV!ʎ!auӑt)$uJ n-Zs%7v)"aɅ`'H/MڐeGZ`v'^ mcpD7v/XiXomqY4r&X\[ϯbj\Q ZI\/:QB5%RZ4U9ꢊ! F,ׯF#FzBu^/7. 
[binary data: gzip-compressed archive member `var/home/core/zuul-output/logs/kubelet.log.gz` — content is not human-readable and has been omitted]
ySLo3y@Yz؈a_/BMݍRr)RZh*f뽮 [Pt^<K.:4dQHvykNyJ4"p4# |̎}$`X`ˋIud'q\M3'nvְrv_(h͏]~2@s%-~pnOqGɑ,rW*nus^ޖmŏe{<³\qc脩GxZDợ ukiڛFGցZR>WqyѬv0PC(Nf^ "=IGfK"`FKaXC!AEft&ALeގr|"FC-߸.?7?cyK˟[z#dŹ>(Nd\u>8,p]Ǔc{Ua_SќG90}M }pc Uyޭf H1E-TRxD^pVFB[򅄶&!"z%+!|:sg9e:sCX4p Rr[qTHdҳujEъĽWX~ I51*2g:RwemapÉXjU*x"ȧntF ZPIOe>QJ7v :j uFGXaYZ0|=[glf{#g Y7Y(y&slf{w6G-/!^x۟]ipԈ@為4Ȳxsػyt}MA~x8Gx_Lےh.5 u@k\2a'^^3r]ET +gy9EL52s7U'nhMn]iP":]FJgnݳD 2$䅋eJleXeXE.wa!R\یK }"FuOB{J|)kIaQ$ (qz*`2jQ d8s4r/楕u\ue%ϘcʤBB_hSN ?@ {V%oL%hUp <Xg^^lb6/C]r7<۱ BX:yG cyʣc1'J{~ʖ}fUae&=vyY-]z'ON:!%^BvvgOAGVjp6cWIŝPjnx=nhT*v$0C`<@8]3 U+PFyt$޼|[$ӣ%1["¥ְ\ów7'pZO()8QpwF*ˢg\`XG{ϨH-yS{wVYx6HǣHfb5wyoӛdB$h(æ`wNW_d;PFL^D_:h ] d('&4 Nv]6] (ZDXq.D=CtN0; 4Q66dLQ9|A4l@%_o$S t&B"l&rT@G]X3c,m>5G֯fD,ҔY)MҚ)KSM>A׌=fiSSӊ8B* [8qn"rm;c9h ݼ@5W[RM-ע@3u`UpΦb%QD$Jx%\`.T<m$N%+yd)D%B.ktc*@ʚ'KZC!u@EeϏdN;ZeJwm E5<V'gGunqgMP<&>H*x$@d n=%n 'w޿R9rpHy3qwpJq8Ƌ%s!momrG=+=8SKݹފXr{'S]eS}'m8`h ZM4.n9P 0|v98d"HU?ϳ%TLridBqfRZ0S1!L6$.Q~DŽXL3)B@|VFP yIIIȉҁrG Q &,Ұl&N^חS 0QT)H"ͣjP$bAAA?<ąb4|nѰ0Dekf]3v$To.[ FBHc`iZ%s΂2Ao;fBoO!#KmZJE!ޅ`#Pc\1_%ӘU~uf˩!$Oj{W6E!al,(ТǾl@HewC+!EYrI_jfᐜ7~.D,P}|I.<veë}~Z_.~hTm_ }j d^.֣vY{Z5c a5@T`xI&C?cd!~?Ê l<] r6,cQTb&t)BDG&sTV7v;`wHzUɋ_ƫʍez.ٵIn?~ G)`rǂ bV:`aSnz[U"[,۫[2-U._uozgbUi9ůJcd+ $`?6p8^aMg#\z|1k=&g%EXf&6$IrkqS/ vaq%)d'aEEGsA>oBui0FQ|b Q`BMLcfE1H0OZW= XhN0ۉqe[bE[?;uFg(r%9OWhv-֙3n:sƩqSӝ`xVl^l_V(bG9 lպwU>NFGI`#J gt֫}?SꀤaܵFE?l׻пfw;<úF%iy;'&=MWF*h}5_=Ίۿfr2Ӹ<n?t97A?7rlroY[`Dln ܖ9j0l6?q^Ua,kEבMב~"P_Gv}'@S`qɟFY6bNo{bH yTE"mȂS)omҦO+b)ΛZSTܻGu~@DP̈́~i8{jT8""u!?F엽{MR.EuBc݆se`LyMOmƒL[hN Lg[ \T'>m]¢?-<Ѻu!?chW%Κ*ACC 27C.T$0aCȀmpi>w2i],Qfl&U y4-ܔLxds7 З;{0(dzHٕX7ᲛKu,i׋KlF(UZdwM?SrneiʂDE?.BkEz7t}zzGO94PjpSfw *K}ՏZ[26k N2óPq;: 6N!K(= :a,*$UFھv 홐LI{pg~lߙ8z1ABTWt&cH"F,J QL$ ≭e-smĂ-EǨQ";qi56Lۤ'hE;_đ`#i!x?DE']*`I288rKVdd9N $ ,cIqHMD#m#N@: v xfF12S$px:׆d).v#C}E ~m,C=bBd1BatB Ѓ-e캭Koks$bs*:O4Ԝb$oDfwd\S_+JZbE"X<;w(Ikˎpox|$Z_HzokAJ Qx|9J8ʒ4bu&4by< `5!;Wbf97SDqP ʅ]LiSFL4$P`\gL$*)L"3PC9? 
Mar 12 16:50:57 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 12 16:50:58 crc kubenswrapper[5184]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 12 16:50:58 crc kubenswrapper[5184]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 12 16:50:58 crc kubenswrapper[5184]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 12 16:50:58 crc kubenswrapper[5184]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 12 16:50:58 crc kubenswrapper[5184]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Mar 12 16:50:58 crc kubenswrapper[5184]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.050090 5184 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.062725 5184 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.062757 5184 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.062767 5184 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.062778 5184 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.062786 5184 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.062794 5184 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.062803 5184 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.062811 5184 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.062819 5184 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.062827 5184 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.062834 5184 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.062843 5184 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.062851 5184 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.062859 5184 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.062867 5184 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.062875 5184 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.062883 5184 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.062891 5184 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.062900 5184 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.062912 5184 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.062922 5184 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.062933 5184 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.062945 5184 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.062955 5184 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.062965 5184 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.062973 5184 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.062981 5184 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.062990 5184 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.062998 5184 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063006 5184 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063014 5184 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063022 5184 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063029 5184 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063037 5184 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063045 5184 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063054 5184 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063062 5184 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063069 5184 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063078 5184 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063086 5184 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063094 5184 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063102 5184 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063109 5184 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063118 5184 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063125 5184 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063133 5184 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063141 5184 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063148 5184 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063160 5184 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063167 5184 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063175 5184 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063183 5184 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063192 5184 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063200 5184 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063208 5184 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063215 5184 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063223 5184 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063232 5184 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063240 5184 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063250 5184 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063257 5184 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063265 5184 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063275 5184 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063283 5184 feature_gate.go:328] unrecognized feature gate: Example2
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063292 5184 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063301 5184 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063309 5184 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063321 5184 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063331 5184 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063343 5184 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063352 5184 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063360 5184 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063368 5184 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063404 5184 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063415 5184 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063423 5184 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063431 5184 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063438 5184 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063446 5184 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063454 5184 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063462 5184 feature_gate.go:328] unrecognized feature gate: Example
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063470 5184 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063477 5184 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063485 5184 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063498 5184 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.063509 5184 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.067671 5184 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.067711 5184 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.067722 5184 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.067731 5184 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.067744 5184 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.067756 5184 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.067765 5184 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.067776 5184 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.067785 5184 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.067795 5184 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.067804 5184 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.067814 5184 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.067823 5184 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.067832 5184 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.067842 5184 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.067851 5184 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.067863 5184 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.067876 5184 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.067889 5184 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.067902 5184 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.067913 5184 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.067924 5184 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.067947 5184 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.067957 5184 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.067966 5184 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.067976 5184 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.067985 5184 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.067994 5184 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068003 5184 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068013 5184 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068023 5184 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068032 5184 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068041 5184 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068049 5184 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068058 5184 feature_gate.go:328] unrecognized feature gate: Example
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068068 5184 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068077 5184 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068086 5184 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068112 5184 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068121 5184 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068131 5184 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068140 5184 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068150 5184 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068159 5184 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068171 5184 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068180 5184 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068190 5184 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068199 5184 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068209 5184 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068218 5184 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068227 5184 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068238 5184 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068248 5184 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068256 5184 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068267 5184 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068276 5184 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068285 5184 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068294 5184 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068303 5184 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068312 5184 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068321 5184 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068333 5184 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068342 5184 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068352 5184 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068361 5184 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068370 5184 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068417 5184 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068427 5184 feature_gate.go:328] unrecognized feature gate: Example2
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068436 5184 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068445 5184 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068454 5184 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068484 5184 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068496 5184 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068505 5184 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068514 5184 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068523 5184 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068532 5184 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068542 5184 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068550 5184 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068560 5184 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068569 5184 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068580 5184 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068595 5184 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068604 5184 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068614 5184 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.068624 5184 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.068809 5184 flags.go:64] FLAG: --address="0.0.0.0"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.068833 5184 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.068851 5184 flags.go:64] FLAG: --anonymous-auth="true"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.068864 5184 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.068877 5184 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.068889 5184 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.068903 5184 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.068917 5184 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.068928 5184 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.068939 5184 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.068950 5184 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.068964 5184 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.068976 5184 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.068986 5184 flags.go:64] FLAG: --cgroup-root=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.068996 5184 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069007 5184 flags.go:64] FLAG: --client-ca-file=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069017 5184 flags.go:64] FLAG: --cloud-config=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069027 5184 flags.go:64] FLAG: --cloud-provider=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069041 5184 flags.go:64] FLAG: --cluster-dns="[]"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069055 5184 flags.go:64] FLAG: --cluster-domain=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069065 5184 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069076 5184 flags.go:64] FLAG: --config-dir=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069087 5184 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069098 5184 flags.go:64] FLAG: --container-log-max-files="5"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069112 5184 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069123 5184 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069135 5184 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069146 5184 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069157 5184 flags.go:64] FLAG: --contention-profiling="false"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069167 5184 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069178 5184 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069188 5184 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069199 5184 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069213 5184 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069224 5184 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069234 5184 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069244 5184 flags.go:64] FLAG: --enable-load-reader="false"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069255 5184 flags.go:64] FLAG: --enable-server="true"
Mar 12 16:50:58 crc
kubenswrapper[5184]: I0312 16:50:58.069266 5184 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069280 5184 flags.go:64] FLAG: --event-burst="100" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069291 5184 flags.go:64] FLAG: --event-qps="50" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069302 5184 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069312 5184 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069324 5184 flags.go:64] FLAG: --eviction-hard="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069337 5184 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069348 5184 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069359 5184 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069369 5184 flags.go:64] FLAG: --eviction-soft="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069418 5184 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069429 5184 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069440 5184 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069453 5184 flags.go:64] FLAG: --experimental-mounter-path="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069463 5184 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069473 5184 flags.go:64] FLAG: --fail-swap-on="true" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069484 5184 flags.go:64] FLAG: --feature-gates="" Mar 12 16:50:58 crc 
kubenswrapper[5184]: I0312 16:50:58.069496 5184 flags.go:64] FLAG: --file-check-frequency="20s" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069508 5184 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069526 5184 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069536 5184 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069547 5184 flags.go:64] FLAG: --healthz-port="10248" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069558 5184 flags.go:64] FLAG: --help="false" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069599 5184 flags.go:64] FLAG: --hostname-override="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069610 5184 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069620 5184 flags.go:64] FLAG: --http-check-frequency="20s" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069631 5184 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069642 5184 flags.go:64] FLAG: --image-credential-provider-config="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069651 5184 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069662 5184 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069673 5184 flags.go:64] FLAG: --image-service-endpoint="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069682 5184 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069692 5184 flags.go:64] FLAG: --kube-api-burst="100" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069702 5184 flags.go:64] FLAG: 
--kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069714 5184 flags.go:64] FLAG: --kube-api-qps="50" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069724 5184 flags.go:64] FLAG: --kube-reserved="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069735 5184 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069746 5184 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069757 5184 flags.go:64] FLAG: --kubelet-cgroups="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069768 5184 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069779 5184 flags.go:64] FLAG: --lock-file="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069789 5184 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069800 5184 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069811 5184 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069828 5184 flags.go:64] FLAG: --log-json-split-stream="false" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069838 5184 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069850 5184 flags.go:64] FLAG: --log-text-split-stream="false" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069860 5184 flags.go:64] FLAG: --logging-format="text" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069871 5184 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069883 5184 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 
16:50:58.069893 5184 flags.go:64] FLAG: --manifest-url="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069907 5184 flags.go:64] FLAG: --manifest-url-header="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069922 5184 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069933 5184 flags.go:64] FLAG: --max-open-files="1000000" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069947 5184 flags.go:64] FLAG: --max-pods="110" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069958 5184 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069969 5184 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069979 5184 flags.go:64] FLAG: --memory-manager-policy="None" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.069990 5184 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070001 5184 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070012 5184 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070022 5184 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhel" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070048 5184 flags.go:64] FLAG: --node-status-max-images="50" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070059 5184 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070070 5184 flags.go:64] FLAG: --oom-score-adj="-999" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070080 5184 flags.go:64] FLAG: --pod-cidr="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070091 5184 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc2b30e70040205c2536d01ae5c850be1ed2d775cf13249e50328e5085777977" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070110 5184 flags.go:64] FLAG: --pod-manifest-path="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070121 5184 flags.go:64] FLAG: --pod-max-pids="-1" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070132 5184 flags.go:64] FLAG: --pods-per-core="0" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070141 5184 flags.go:64] FLAG: --port="10250" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070152 5184 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070162 5184 flags.go:64] FLAG: --provider-id="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070172 5184 flags.go:64] FLAG: --qos-reserved="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070183 5184 flags.go:64] FLAG: --read-only-port="10255" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070193 5184 flags.go:64] FLAG: --register-node="true" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070203 5184 flags.go:64] FLAG: --register-schedulable="true" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070213 5184 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070232 5184 flags.go:64] FLAG: --registry-burst="10" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070245 5184 flags.go:64] FLAG: --registry-qps="5" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070256 5184 flags.go:64] FLAG: --reserved-cpus="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070266 5184 flags.go:64] FLAG: --reserved-memory="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070278 5184 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 
16:50:58.070289 5184 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070305 5184 flags.go:64] FLAG: --rotate-certificates="false" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070315 5184 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070325 5184 flags.go:64] FLAG: --runonce="false" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070348 5184 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070359 5184 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070370 5184 flags.go:64] FLAG: --seccomp-default="false" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070416 5184 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070425 5184 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070436 5184 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070446 5184 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070457 5184 flags.go:64] FLAG: --storage-driver-password="root" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070467 5184 flags.go:64] FLAG: --storage-driver-secure="false" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070477 5184 flags.go:64] FLAG: --storage-driver-table="stats" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070486 5184 flags.go:64] FLAG: --storage-driver-user="root" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070496 5184 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070507 5184 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 12 
16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070518 5184 flags.go:64] FLAG: --system-cgroups="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070528 5184 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070549 5184 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070559 5184 flags.go:64] FLAG: --tls-cert-file="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070570 5184 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070584 5184 flags.go:64] FLAG: --tls-min-version="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070593 5184 flags.go:64] FLAG: --tls-private-key-file="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070602 5184 flags.go:64] FLAG: --topology-manager-policy="none" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070611 5184 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070622 5184 flags.go:64] FLAG: --topology-manager-scope="container" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070632 5184 flags.go:64] FLAG: --v="2" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070647 5184 flags.go:64] FLAG: --version="false" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070662 5184 flags.go:64] FLAG: --vmodule="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070676 5184 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.070688 5184 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.070933 5184 feature_gate.go:328] unrecognized feature gate: NewOLM Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.070949 5184 feature_gate.go:328] unrecognized feature gate: 
ExternalOIDCWithUIDAndExtraClaimMappings Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.070960 5184 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.070971 5184 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.070984 5184 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.070994 5184 feature_gate.go:328] unrecognized feature gate: GatewayAPI Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071003 5184 feature_gate.go:328] unrecognized feature gate: Example2 Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071014 5184 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071023 5184 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071032 5184 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071042 5184 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071051 5184 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071060 5184 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071069 5184 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071079 5184 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071089 5184 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Mar 12 16:50:58 
crc kubenswrapper[5184]: W0312 16:50:58.071098 5184 feature_gate.go:328] unrecognized feature gate: OVNObservability Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071107 5184 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071116 5184 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071125 5184 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071135 5184 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071144 5184 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071153 5184 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071163 5184 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071172 5184 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071181 5184 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071189 5184 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071196 5184 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071204 5184 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071211 5184 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 
16:50:58.071221 5184 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071229 5184 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071236 5184 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071243 5184 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071250 5184 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071257 5184 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071264 5184 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071271 5184 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071281 5184 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071290 5184 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071298 5184 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071305 5184 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071313 5184 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071320 5184 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071327 5184 feature_gate.go:328] unrecognized feature gate: InsightsConfig Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071334 5184 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071342 5184 feature_gate.go:328] unrecognized feature gate: Example Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071349 5184 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071357 5184 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071364 5184 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071403 5184 feature_gate.go:328] unrecognized feature gate: DualReplica Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071414 5184 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071425 5184 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071434 5184 
feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071443 5184 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071452 5184 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071461 5184 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071472 5184 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071482 5184 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071491 5184 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071500 5184 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071510 5184 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071520 5184 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071532 5184 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071542 5184 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071550 5184 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071560 5184 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071569 5184 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Mar 12 
16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071579 5184 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071588 5184 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071597 5184 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071604 5184 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071611 5184 feature_gate.go:328] unrecognized feature gate: PinnedImages Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071619 5184 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071626 5184 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071633 5184 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071644 5184 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071653 5184 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071662 5184 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071670 5184 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071678 5184 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071686 5184 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071694 5184 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071702 5184 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071710 5184 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.071718 5184 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.073127 5184 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.087793 5184 server.go:530] "Kubelet version" kubeletVersion="v1.33.5"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.087863 5184 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.087986 5184 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088004 5184 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088015 5184 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088029 5184 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088045 5184 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088055 5184 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088065 5184 feature_gate.go:328] unrecognized feature gate: Example
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088075 5184 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088085 5184 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088094 5184 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088103 5184 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088111 5184 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088121 5184 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088131 5184 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088140 5184 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088150 5184 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088159 5184 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088168 5184 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088177 5184 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088187 5184 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088196 5184 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088205 5184 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088214 5184 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088223 5184 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088231 5184 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088240 5184 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088249 5184 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088258 5184 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088267 5184 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088276 5184 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088285 5184 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088299 5184 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088308 5184 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088317 5184 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088330 5184 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088342 5184 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088351 5184 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088362 5184 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088371 5184 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088409 5184 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088418 5184 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088428 5184 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088437 5184 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088447 5184 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088457 5184 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088467 5184 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088476 5184 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088486 5184 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088497 5184 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088505 5184 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088512 5184 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088519 5184 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088527 5184 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088534 5184 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088542 5184 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088549 5184 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088556 5184 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088563 5184 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088571 5184 feature_gate.go:328] unrecognized feature gate: Example2
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088579 5184 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088586 5184 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088593 5184 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088600 5184 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088607 5184 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088617 5184 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088624 5184 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088632 5184 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088639 5184 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088646 5184 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088653 5184 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088660 5184 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088668 5184 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088675 5184 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088682 5184 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088690 5184 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088698 5184 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088705 5184 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088714 5184 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088723 5184 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088730 5184 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088737 5184 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088745 5184 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088752 5184 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088759 5184 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088766 5184 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088774 5184 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.088787 5184 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.088997 5184 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089012 5184 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089021 5184 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089029 5184 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089037 5184 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089045 5184 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089053 5184 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089060 5184 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089067 5184 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089074 5184 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089082 5184 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089090 5184 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089097 5184 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089105 5184 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089112 5184 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089120 5184 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089127 5184 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089134 5184 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089142 5184 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089149 5184 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089156 5184 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089164 5184 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089171 5184 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089179 5184 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089187 5184 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089194 5184 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089202 5184 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089209 5184 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089216 5184 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089223 5184 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089230 5184 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089238 5184 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089246 5184 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089253 5184 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089260 5184 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089267 5184 feature_gate.go:328] unrecognized feature gate: Example2
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089275 5184 feature_gate.go:328] unrecognized feature gate: Example
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089282 5184 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089289 5184 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089297 5184 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089304 5184 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089311 5184 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089318 5184 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089326 5184 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089334 5184 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089341 5184 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089348 5184 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089358 5184 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089367 5184 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089405 5184 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089414 5184 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089424 5184 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089434 5184 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089445 5184 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089455 5184 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089465 5184 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089475 5184 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089483 5184 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089490 5184 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089497 5184 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089505 5184 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089513 5184 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089520 5184 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089527 5184 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089534 5184 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089542 5184 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089549 5184 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089556 5184 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089564 5184 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089593 5184 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089603 5184 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089612 5184 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089621 5184 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089629 5184 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089637 5184 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089644 5184 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089653 5184 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089660 5184 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089667 5184 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089674 5184 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089681 5184 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089689 5184 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089696 5184 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089703 5184 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089712 5184 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.089720 5184 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.089733 5184 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.095908 5184 server.go:962] "Client rotation is on, will bootstrap in background"
Mar 12 16:50:58 crc kubenswrapper[5184]: E0312 16:50:58.102327 5184 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2025-12-03 08:27:53 +0000 UTC" logger="UnhandledError"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.106944 5184 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.107094 5184 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.108941 5184 server.go:1019] "Starting client certificate rotation"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.109102 5184 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.109200 5184 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.138816 5184 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 12 16:50:58 crc kubenswrapper[5184]: E0312 16:50:58.144235 5184 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.144819 5184 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.157886 5184 log.go:25] "Validated CRI v1 runtime API"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.217030 5184 log.go:25] "Validated CRI v1 image API"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.219225 5184 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.225058 5184 fs.go:135] Filesystem UUIDs: map[19e76f87-96b8-4794-9744-0b33dca22d5b:/dev/vda3 2026-03-12-16-44-16-00:/dev/sr0 5eb7c122-420e-4494-80ec-41664070d7b6:/dev/vda4 7B77-95E7:/dev/vda2]
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.225103 5184 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:45 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:46 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.248872 5184 manager.go:217] Machine: {Timestamp:2026-03-12 16:50:58.246709353 +0000 UTC m=+0.788020732 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33649930240 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:80bc4fba336e4ca1bc9d28a8be52a356 SystemUUID:50e372b3-53c9-4d5a-992b-af3198b0aed7 BootID:3ecf36dd-ef58-4e82-ba73-5f8a9b3572a1 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6729986048 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6545408 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16824963072 Type:vfs Inodes:4107657 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16824967168 Type:vfs Inodes:1048576 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:45 Capacity:3364990976 Type:vfs Inodes:821531 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:46 Capacity:1073741824 Type:vfs Inodes:4107657 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:0b:24:05 Speed:0 Mtu:1500} {Name:br-int MacAddress:b2:a9:9f:57:07:84 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:0b:24:05 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:33:8a:b2 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:8a:74:24 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:2b:50:96 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:7f:58:41 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:30:d5:4b Speed:-1 Mtu:1496} {Name:eth10 MacAddress:ea:f9:9b:1c:ee:de Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:72:1d:e8:2c:a1:98 Speed:0 Mtu:1500} {Name:tap0 MacAddress:5a:94:ef:e4:0c:ee Speed:10 Mtu:1500}] Topology:[{Id:0 Memory:33649930240 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.249137 5184 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.249462 5184 manager.go:233] Version: {KernelVersion:5.14.0-570.57.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20251021-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.251281 5184 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.251348 5184 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.251570 5184 topology_manager.go:138] "Creating topology manager with none policy"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.251580 5184 container_manager_linux.go:306] "Creating device plugin manager"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.251605 5184 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.252527 5184 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.252822 5184 state_mem.go:36] "Initialized new in-memory state store"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.252987 5184 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.258559 5184 kubelet.go:491] "Attempting to sync node with API server"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.258592 5184 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.258628 5184 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.258644 5184 kubelet.go:397] "Adding apiserver pod source"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.258659 5184 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 12 16:50:58 crc kubenswrapper[5184]: E0312 16:50:58.261586 5184 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Mar 12 16:50:58 crc kubenswrapper[5184]: E0312 16:50:58.261586 5184 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.261908 5184 state_checkpoint.go:81] "State checkpoint: restored pod resource state from checkpoint"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.261929 5184 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.263076 5184 state_checkpoint.go:81] "State checkpoint: restored pod resource state from checkpoint"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.263092 5184 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.268703 5184 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.5-3.rhaos4.20.gitd0ea985.el9" apiVersion="v1"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.268906 5184 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-server-current.pem"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.269425 5184 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.271727 5184 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.271762 5184 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.271774 5184 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.271785 5184 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.271797 5184 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.271808 5184 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.271820 5184 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.271832 5184 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.271845 5184 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.271863 5184 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.271884 5184 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.272442 5184 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.274497 5184 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.274522 5184 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.276385 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.298812 5184 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.298895 5184 server.go:1295] "Started kubelet"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.299947 5184 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.300535 5184 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.300663 5184 server_v1.go:47] "podresources" method="list" useActivePods=true
Mar 12 16:50:58 crc systemd[1]: Started Kubernetes Kubelet.
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.302132 5184 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.302591 5184 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.302646 5184 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.305349 5184 volume_manager.go:295] "The desired_state_of_world populator starts"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.305391 5184 volume_manager.go:297] "Starting Kubelet Volume Manager"
Mar 12 16:50:58 crc kubenswrapper[5184]: E0312 16:50:58.306438 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.306547 5184 server.go:317] "Adding debug handlers to kubelet server"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.307417 5184 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Mar 12 16:50:58 crc kubenswrapper[5184]: E0312 16:50:58.306316 5184 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.223:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189c261cc5ef0cb2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:58.298850482 +0000 UTC m=+0.840161831,LastTimestamp:2026-03-12 16:50:58.298850482 +0000 UTC m=+0.840161831,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 16:50:58 crc kubenswrapper[5184]: E0312 16:50:58.310757 5184 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="200ms"
Mar 12 16:50:58 crc kubenswrapper[5184]: E0312 16:50:58.310879 5184 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.311170 5184 factory.go:55] Registering systemd factory
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.311227 5184 factory.go:223] Registration of the systemd container factory successfully
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.313859 5184 factory.go:153] Registering CRI-O factory
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.313882 5184 factory.go:223] Registration of the crio container factory successfully
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.313982 5184 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.314011 5184 factory.go:103] Registering Raw factory
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.314032 5184 manager.go:1196] Started watching for new ooms in manager
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.314634 5184 manager.go:319] Starting recovery of all containers
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.344665 5184 manager.go:324] Recovery completed
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.360992 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.365158 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.365235 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.365250 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.366658 5184 cpu_manager.go:222] "Starting CPU manager" policy="none"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.366680 5184 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.366697 5184 state_mem.go:36] "Initialized new in-memory state
store" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.373177 5184 policy_none.go:49] "None policy: Start" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.373216 5184 memory_manager.go:186] "Starting memorymanager" policy="None" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.373234 5184 state_mem.go:35] "Initializing new in-memory state store" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.388676 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f559dfa3-3917-43a2-97f6-61ddfda10e93" volumeName="kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-serving-cert" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.388730 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01080b46-74f1-4191-8755-5152a57b3b25" volumeName="kubernetes.io/projected/01080b46-74f1-4191-8755-5152a57b3b25-kube-api-access-w94wk" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.388747 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7599e0b6-bddf-4def-b7f2-0b32206e8651" volumeName="kubernetes.io/configmap/7599e0b6-bddf-4def-b7f2-0b32206e8651-config" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.388759 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff" volumeName="kubernetes.io/projected/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-kube-api-access-xxfcv" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.388771 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" volumeName="kubernetes.io/projected/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-kube-api-access-5lcfw" seLinuxMountContext="" Mar 12 16:50:58 crc 
kubenswrapper[5184]: I0312 16:50:58.388784 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f65c0ac1-8bca-454d-a2e6-e35cb418beac" volumeName="kubernetes.io/configmap/f65c0ac1-8bca-454d-a2e6-e35cb418beac-config" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.388794 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="18f80adb-c1c3-49ba-8ee4-932c851d3897" volumeName="kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-stats-auth" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.388806 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cc85e424-18b2-4924-920b-bd291a8c4b01" volumeName="kubernetes.io/empty-dir/cc85e424-18b2-4924-920b-bd291a8c4b01-catalog-content" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.388817 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="567683bd-0efc-4f21-b076-e28559628404" volumeName="kubernetes.io/empty-dir/567683bd-0efc-4f21-b076-e28559628404-tmp-dir" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.388827 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" volumeName="kubernetes.io/configmap/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-trusted-ca" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.388838 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" volumeName="kubernetes.io/secret/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-image-registry-operator-tls" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.388856 5184 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="d565531a-ff86-4608-9d19-767de01ac31b" volumeName="kubernetes.io/secret/d565531a-ff86-4608-9d19-767de01ac31b-proxy-tls" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.388866 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e1d2a42d-af1d-4054-9618-ab545e0ed8b7" volumeName="kubernetes.io/configmap/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-mcd-auth-proxy-config" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.388876 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="301e1965-1754-483d-b6cc-bfae7038bbca" volumeName="kubernetes.io/empty-dir/301e1965-1754-483d-b6cc-bfae7038bbca-tmpfs" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.388886 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" volumeName="kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-oauth-serving-cert" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.388894 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7fcc6409-8a0f-44c3-89e7-5aecd7610f8a" volumeName="kubernetes.io/projected/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-kube-api-access-ddlk9" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.388903 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9e9b5059-1b3e-4067-a63d-2952cbe863af" volumeName="kubernetes.io/configmap/9e9b5059-1b3e-4067-a63d-2952cbe863af-trusted-ca" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.388913 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e1d2a42d-af1d-4054-9618-ab545e0ed8b7" 
volumeName="kubernetes.io/projected/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-kube-api-access-9z4sw" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.388922 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="34177974-8d82-49d2-a763-391d0df3bbd8" volumeName="kubernetes.io/secret/34177974-8d82-49d2-a763-391d0df3bbd8-metrics-tls" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.388930 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.388940 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cc85e424-18b2-4924-920b-bd291a8c4b01" volumeName="kubernetes.io/empty-dir/cc85e424-18b2-4924-920b-bd291a8c4b01-utilities" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.388948 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="149b3c48-e17c-4a66-a835-d86dabf6ff13" volumeName="kubernetes.io/projected/149b3c48-e17c-4a66-a835-d86dabf6ff13-kube-api-access-wj4qr" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.388957 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7afa918d-be67-40a6-803c-d3b0ae99d815" volumeName="kubernetes.io/empty-dir/7afa918d-be67-40a6-803c-d3b0ae99d815-tmp" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.388966 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="af33e427-6803-48c2-a76a-dd9deb7cbf9a" 
volumeName="kubernetes.io/secret/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovn-node-metrics-cert" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.388975 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="92dfbade-90b6-4169-8c07-72cff7f2c82b" volumeName="kubernetes.io/projected/92dfbade-90b6-4169-8c07-72cff7f2c82b-kube-api-access-4g8ts" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.388983 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a" volumeName="kubernetes.io/projected/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-kube-api-access-qqbfk" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.388991 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f559dfa3-3917-43a2-97f6-61ddfda10e93" volumeName="kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-trusted-ca-bundle" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389001 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31fa8943-81cc-4750-a0b7-0fa9ab5af883" volumeName="kubernetes.io/projected/31fa8943-81cc-4750-a0b7-0fa9ab5af883-kube-api-access-grwfz" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389012 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="92dfbade-90b6-4169-8c07-72cff7f2c82b" volumeName="kubernetes.io/secret/92dfbade-90b6-4169-8c07-72cff7f2c82b-metrics-tls" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389020 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d45be74c-0d98-4d18-90e4-f7ef1b6daaf7" 
volumeName="kubernetes.io/secret/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-serving-cert" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389029 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="584e1f4a-8205-47d7-8efb-3afc6017c4c9" volumeName="kubernetes.io/empty-dir/584e1f4a-8205-47d7-8efb-3afc6017c4c9-utilities" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389039 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="736c54fe-349c-4bb9-870a-d1c1d1c03831" volumeName="kubernetes.io/configmap/736c54fe-349c-4bb9-870a-d1c1d1c03831-client-ca" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389050 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fc4541ce-7789-4670-bc75-5c2868e52ce0" volumeName="kubernetes.io/projected/fc4541ce-7789-4670-bc75-5c2868e52ce0-kube-api-access-8nt2j" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389059 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3a14caf222afb62aaabdc47808b6f944" volumeName="kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-ca-bundle-dir" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389067 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389075 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d19cb085-0c5b-4810-b654-ce7923221d90" volumeName="kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-encryption-config" 
seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389083 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" volumeName="kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-trusted-ca-bundle" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389091 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f559dfa3-3917-43a2-97f6-61ddfda10e93" volumeName="kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-etcd-client" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389100 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="34177974-8d82-49d2-a763-391d0df3bbd8" volumeName="kubernetes.io/projected/34177974-8d82-49d2-a763-391d0df3bbd8-kube-api-access-m7xz2" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389109 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="94a6e063-3d1a-4d44-875d-185291448c31" volumeName="kubernetes.io/projected/94a6e063-3d1a-4d44-875d-185291448c31-kube-api-access-4hb7m" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389116 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01080b46-74f1-4191-8755-5152a57b3b25" volumeName="kubernetes.io/configmap/01080b46-74f1-4191-8755-5152a57b3b25-config" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389123 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2325ffef-9d5b-447f-b00e-3efc429acefe" volumeName="kubernetes.io/projected/2325ffef-9d5b-447f-b00e-3efc429acefe-kube-api-access-zg8nc" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 
16:50:58.389131 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b605f283-6f2e-42da-a838-54421690f7d0" volumeName="kubernetes.io/projected/b605f283-6f2e-42da-a838-54421690f7d0-kube-api-access-6rmnv" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389138 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="18f80adb-c1c3-49ba-8ee4-932c851d3897" volumeName="kubernetes.io/projected/18f80adb-c1c3-49ba-8ee4-932c851d3897-kube-api-access-wbmqg" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389146 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7fcc6409-8a0f-44c3-89e7-5aecd7610f8a" volumeName="kubernetes.io/empty-dir/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-tmpfs" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389154 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d45be74c-0d98-4d18-90e4-f7ef1b6daaf7" volumeName="kubernetes.io/empty-dir/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-tmp" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389162 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f863fff9-286a-45fa-b8f0-8a86994b8440" volumeName="kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389169 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fc4541ce-7789-4670-bc75-5c2868e52ce0" volumeName="kubernetes.io/configmap/fc4541ce-7789-4670-bc75-5c2868e52ce0-ovnkube-identity-cm" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389177 5184 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="18f80adb-c1c3-49ba-8ee4-932c851d3897" volumeName="kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-metrics-certs" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389184 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" volumeName="kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-service-ca" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389193 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9f0bc7fcb0822a2c13eb2d22cd8c0641" volumeName="kubernetes.io/empty-dir/9f0bc7fcb0822a2c13eb2d22cd8c0641-var-run-kubernetes" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389200 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9f71a554-e414-4bc3-96d2-674060397afe" volumeName="kubernetes.io/configmap/9f71a554-e414-4bc3-96d2-674060397afe-trusted-ca" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389209 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ce090a97-9ab6-4c40-a719-64ff2acd9778" volumeName="kubernetes.io/secret/ce090a97-9ab6-4c40-a719-64ff2acd9778-signing-key" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389219 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a52afe44-fb37-46ed-a1f8-bf39727a3cbe" volumeName="kubernetes.io/projected/a52afe44-fb37-46ed-a1f8-bf39727a3cbe-kube-api-access-rzt4w" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389226 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b605f283-6f2e-42da-a838-54421690f7d0" 
volumeName="kubernetes.io/empty-dir/b605f283-6f2e-42da-a838-54421690f7d0-utilities" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389237 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c5f2bfad-70f6-4185-a3d9-81ce12720767" volumeName="kubernetes.io/configmap/c5f2bfad-70f6-4185-a3d9-81ce12720767-config" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389250 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="869851b9-7ffb-4af0-b166-1d8aa40a5f80" volumeName="kubernetes.io/projected/869851b9-7ffb-4af0-b166-1d8aa40a5f80-kube-api-access-mjwtd" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389257 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a52afe44-fb37-46ed-a1f8-bf39727a3cbe" volumeName="kubernetes.io/secret/a52afe44-fb37-46ed-a1f8-bf39727a3cbe-cert" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389266 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-serving-cert" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389277 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389285 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c5f2bfad-70f6-4185-a3d9-81ce12720767" volumeName="kubernetes.io/empty-dir/c5f2bfad-70f6-4185-a3d9-81ce12720767-tmp-dir" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389293 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d565531a-ff86-4608-9d19-767de01ac31b" volumeName="kubernetes.io/configmap/d565531a-ff86-4608-9d19-767de01ac31b-auth-proxy-config" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389300 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d565531a-ff86-4608-9d19-767de01ac31b" volumeName="kubernetes.io/projected/d565531a-ff86-4608-9d19-767de01ac31b-kube-api-access-99zj9" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389308 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d7e8f42f-dc0e-424b-bb56-5ec849834888" volumeName="kubernetes.io/secret/d7e8f42f-dc0e-424b-bb56-5ec849834888-serving-cert" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389316 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20c5c5b4bed930554494851fe3cb2b2a" volumeName="kubernetes.io/empty-dir/20c5c5b4bed930554494851fe3cb2b2a-tmp-dir" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389324 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31fa8943-81cc-4750-a0b7-0fa9ab5af883" volumeName="kubernetes.io/empty-dir/31fa8943-81cc-4750-a0b7-0fa9ab5af883-utilities" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389332 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="736c54fe-349c-4bb9-870a-d1c1d1c03831" volumeName="kubernetes.io/empty-dir/736c54fe-349c-4bb9-870a-d1c1d1c03831-tmp" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389339 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" volumeName="kubernetes.io/empty-dir/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-ca-trust-extracted-pem" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389347 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09cfa50b-4138-4585-a53e-64dd3ab73335" volumeName="kubernetes.io/configmap/09cfa50b-4138-4585-a53e-64dd3ab73335-config" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389355 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0dd0fbac-8c0d-4228-8faa-abbeedabf7db" volumeName="kubernetes.io/projected/0dd0fbac-8c0d-4228-8faa-abbeedabf7db-kube-api-access-q4smf" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389364 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389389 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ee8fbd3-1f81-4666-96da-5afc70819f1a" volumeName="kubernetes.io/projected/6ee8fbd3-1f81-4666-96da-5afc70819f1a-kube-api-access-d4tqq" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389400 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a7a88189-c967-4640-879e-27665747f20c" volumeName="kubernetes.io/empty-dir/a7a88189-c967-4640-879e-27665747f20c-tmpfs" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389411 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cc85e424-18b2-4924-920b-bd291a8c4b01" volumeName="kubernetes.io/projected/cc85e424-18b2-4924-920b-bd291a8c4b01-kube-api-access-xfp5s" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389422 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e093be35-bb62-4843-b2e8-094545761610" volumeName="kubernetes.io/projected/e093be35-bb62-4843-b2e8-094545761610-kube-api-access-pddnv" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389433 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0effdbcf-dd7d-404d-9d48-77536d665a5d" volumeName="kubernetes.io/projected/0effdbcf-dd7d-404d-9d48-77536d665a5d-kube-api-access-mfzkj" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389444 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" volumeName="kubernetes.io/secret/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-oauth-config" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389456 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-service-ca" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389466 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7afa918d-be67-40a6-803c-d3b0ae99d815" volumeName="kubernetes.io/secret/7afa918d-be67-40a6-803c-d3b0ae99d815-serving-cert" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389477 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7df94c10-441d-4386-93a6-6730fb7bcde0" volumeName="kubernetes.io/configmap/7df94c10-441d-4386-93a6-6730fb7bcde0-ovnkube-config" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389489 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9f71a554-e414-4bc3-96d2-674060397afe" volumeName="kubernetes.io/projected/9f71a554-e414-4bc3-96d2-674060397afe-kube-api-access-ftwb6" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389500 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141" volumeName="kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389511 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7df94c10-441d-4386-93a6-6730fb7bcde0" volumeName="kubernetes.io/secret/7df94c10-441d-4386-93a6-6730fb7bcde0-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389522 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="869851b9-7ffb-4af0-b166-1d8aa40a5f80" volumeName="kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-whereabouts-flatfile-configmap" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389532 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f559dfa3-3917-43a2-97f6-61ddfda10e93" volumeName="kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-audit-policies" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389544 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="736c54fe-349c-4bb9-870a-d1c1d1c03831" volumeName="kubernetes.io/secret/736c54fe-349c-4bb9-870a-d1c1d1c03831-serving-cert" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389555 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b4750666-1362-4001-abd0-6f89964cc621" volumeName="kubernetes.io/secret/b4750666-1362-4001-abd0-6f89964cc621-proxy-tls" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389569 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="301e1965-1754-483d-b6cc-bfae7038bbca" volumeName="kubernetes.io/secret/301e1965-1754-483d-b6cc-bfae7038bbca-srv-cert" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389581 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="584e1f4a-8205-47d7-8efb-3afc6017c4c9" volumeName="kubernetes.io/projected/584e1f4a-8205-47d7-8efb-3afc6017c4c9-kube-api-access-tknt7" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389592 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-audit-policies" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389603 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/projected/6edfcf45-925b-4eff-b940-95b6fc0b85d4-kube-api-access-8nb9c" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389617 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="869851b9-7ffb-4af0-b166-1d8aa40a5f80" volumeName="kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-cni-sysctl-allowlist" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389628 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fc4541ce-7789-4670-bc75-5c2868e52ce0" volumeName="kubernetes.io/configmap/fc4541ce-7789-4670-bc75-5c2868e52ce0-env-overrides" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389664 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" volumeName="kubernetes.io/projected/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-kube-api-access-tkdh6" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389676 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2325ffef-9d5b-447f-b00e-3efc429acefe" volumeName="kubernetes.io/secret/2325ffef-9d5b-447f-b00e-3efc429acefe-serving-cert" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389687 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3a14caf222afb62aaabdc47808b6f944" volumeName="kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-tmp-dir" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389698 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ee8fbd3-1f81-4666-96da-5afc70819f1a" volumeName="kubernetes.io/secret/6ee8fbd3-1f81-4666-96da-5afc70819f1a-samples-operator-tls" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389708 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="736c54fe-349c-4bb9-870a-d1c1d1c03831" volumeName="kubernetes.io/configmap/736c54fe-349c-4bb9-870a-d1c1d1c03831-config" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389716 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="af33e427-6803-48c2-a76a-dd9deb7cbf9a" volumeName="kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-env-overrides" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389725 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a" volumeName="kubernetes.io/configmap/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-marketplace-trusted-ca" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389734 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d19cb085-0c5b-4810-b654-ce7923221d90" volumeName="kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-etcd-client" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389742 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="149b3c48-e17c-4a66-a835-d86dabf6ff13" volumeName="kubernetes.io/empty-dir/149b3c48-e17c-4a66-a835-d86dabf6ff13-catalog-content" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389751 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="16bdd140-dce1-464c-ab47-dd5798d1d256" volumeName="kubernetes.io/projected/16bdd140-dce1-464c-ab47-dd5798d1d256-kube-api-access-94l9h" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389759 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="42a11a02-47e1-488f-b270-2679d3298b0e" volumeName="kubernetes.io/secret/42a11a02-47e1-488f-b270-2679d3298b0e-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389767 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-error" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389776 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9e9b5059-1b3e-4067-a63d-2952cbe863af" volumeName="kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-kube-api-access-ws8zz" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389784 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d45be74c-0d98-4d18-90e4-f7ef1b6daaf7" volumeName="kubernetes.io/projected/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-kube-api-access-dztfv" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389793 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d7e8f42f-dc0e-424b-bb56-5ec849834888" volumeName="kubernetes.io/projected/d7e8f42f-dc0e-424b-bb56-5ec849834888-kube-api-access" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389801 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f7e2c886-118e-43bb-bef1-c78134de392b" volumeName="kubernetes.io/projected/f7e2c886-118e-43bb-bef1-c78134de392b-kube-api-access-6g4lr" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389809 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="593a3561-7760-45c5-8f91-5aaef7475d0f" volumeName="kubernetes.io/secret/593a3561-7760-45c5-8f91-5aaef7475d0f-certs" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389817 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a" volumeName="kubernetes.io/empty-dir/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-utilities" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389825 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c491984c-7d4b-44aa-8c1e-d7974424fa47" volumeName="kubernetes.io/secret/c491984c-7d4b-44aa-8c1e-d7974424fa47-machine-api-operator-tls" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389844 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f65c0ac1-8bca-454d-a2e6-e35cb418beac" volumeName="kubernetes.io/secret/f65c0ac1-8bca-454d-a2e6-e35cb418beac-serving-cert" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389853 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="18f80adb-c1c3-49ba-8ee4-932c851d3897" volumeName="kubernetes.io/configmap/18f80adb-c1c3-49ba-8ee4-932c851d3897-service-ca-bundle" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389861 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09cfa50b-4138-4585-a53e-64dd3ab73335" volumeName="kubernetes.io/projected/09cfa50b-4138-4585-a53e-64dd3ab73335-kube-api-access-zsb9b" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389870 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="42a11a02-47e1-488f-b270-2679d3298b0e" volumeName="kubernetes.io/projected/42a11a02-47e1-488f-b270-2679d3298b0e-kube-api-access-qgrkj" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389879 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6077b63e-53a2-4f96-9d56-1ce0324e4913" volumeName="kubernetes.io/projected/6077b63e-53a2-4f96-9d56-1ce0324e4913-kube-api-access-zth6t" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389886 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a" volumeName="kubernetes.io/empty-dir/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-tmp" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389894 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ce090a97-9ab6-4c40-a719-64ff2acd9778" volumeName="kubernetes.io/projected/ce090a97-9ab6-4c40-a719-64ff2acd9778-kube-api-access-xnxbn" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389902 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d19cb085-0c5b-4810-b654-ce7923221d90" volumeName="kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-trusted-ca-bundle" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389910 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" volumeName="kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-config" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389917 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a555ff2e-0be6-46d5-897d-863bb92ae2b3" volumeName="kubernetes.io/projected/a555ff2e-0be6-46d5-897d-863bb92ae2b3-kube-api-access-8pskd" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389927 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b605f283-6f2e-42da-a838-54421690f7d0" volumeName="kubernetes.io/empty-dir/b605f283-6f2e-42da-a838-54421690f7d0-catalog-content" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389935 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fc8db2c7-859d-47b3-a900-2bd0c0b2973b" volumeName="kubernetes.io/projected/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-kube-api-access-hckvg" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389943 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="16bdd140-dce1-464c-ab47-dd5798d1d256" volumeName="kubernetes.io/empty-dir/16bdd140-dce1-464c-ab47-dd5798d1d256-available-featuregates" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389951 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="af33e427-6803-48c2-a76a-dd9deb7cbf9a" volumeName="kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovnkube-config" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389959 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" volumeName="kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-service-ca-bundle" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389967 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f7e2c886-118e-43bb-bef1-c78134de392b" volumeName="kubernetes.io/empty-dir/f7e2c886-118e-43bb-bef1-c78134de392b-tmp-dir" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389974 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fc8db2c7-859d-47b3-a900-2bd0c0b2973b" volumeName="kubernetes.io/configmap/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-config" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389981 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="16bdd140-dce1-464c-ab47-dd5798d1d256" volumeName="kubernetes.io/secret/16bdd140-dce1-464c-ab47-dd5798d1d256-serving-cert" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389988 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7fcc6409-8a0f-44c3-89e7-5aecd7610f8a" volumeName="kubernetes.io/secret/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-srv-cert" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.389995 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5ebfebf6-3ecd-458e-943f-bb25b52e2718" volumeName="kubernetes.io/configmap/5ebfebf6-3ecd-458e-943f-bb25b52e2718-serviceca" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.390002 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7599e0b6-bddf-4def-b7f2-0b32206e8651" volumeName="kubernetes.io/projected/7599e0b6-bddf-4def-b7f2-0b32206e8651-kube-api-access-ptkcf" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.390011 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a208c9c2-333b-4b4a-be0d-bc32ec38a821" volumeName="kubernetes.io/secret/a208c9c2-333b-4b4a-be0d-bc32ec38a821-package-server-manager-serving-cert" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.390020 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d19cb085-0c5b-4810-b654-ce7923221d90" volumeName="kubernetes.io/projected/d19cb085-0c5b-4810-b654-ce7923221d90-kube-api-access-m5lgh" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.390028 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6077b63e-53a2-4f96-9d56-1ce0324e4913" volumeName="kubernetes.io/secret/6077b63e-53a2-4f96-9d56-1ce0324e4913-metrics-tls" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.390035 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c491984c-7d4b-44aa-8c1e-d7974424fa47" volumeName="kubernetes.io/projected/c491984c-7d4b-44aa-8c1e-d7974424fa47-kube-api-access-9vsz9" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.390044 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fc8db2c7-859d-47b3-a900-2bd0c0b2973b" volumeName="kubernetes.io/configmap/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-auth-proxy-config" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.390052 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.390060 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b4750666-1362-4001-abd0-6f89964cc621" volumeName="kubernetes.io/configmap/b4750666-1362-4001-abd0-6f89964cc621-mcc-auth-proxy-config" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.390068 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f559dfa3-3917-43a2-97f6-61ddfda10e93" volumeName="kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-etcd-serving-ca" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.390076 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a" volumeName="kubernetes.io/empty-dir/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-catalog-content" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.390085 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f65c0ac1-8bca-454d-a2e6-e35cb418beac" volumeName="kubernetes.io/empty-dir/f65c0ac1-8bca-454d-a2e6-e35cb418beac-tmp-dir" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.390094 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6077b63e-53a2-4f96-9d56-1ce0324e4913" volumeName="kubernetes.io/empty-dir/6077b63e-53a2-4f96-9d56-1ce0324e4913-tmp-dir" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.390101 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9e9b5059-1b3e-4067-a63d-2952cbe863af" volumeName="kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-registry-tls" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.390109 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="301e1965-1754-483d-b6cc-bfae7038bbca" volumeName="kubernetes.io/secret/301e1965-1754-483d-b6cc-bfae7038bbca-profile-collector-cert" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.390116 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="81e39f7b-62e4-4fc9-992a-6535ce127a02" volumeName="kubernetes.io/configmap/81e39f7b-62e4-4fc9-992a-6535ce127a02-multus-daemon-config" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.390123 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="428b39f5-eb1c-4f65-b7a4-eeb6e84860cc" volumeName="kubernetes.io/configmap/428b39f5-eb1c-4f65-b7a4-eeb6e84860cc-iptables-alerter-script" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.390132 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0dd0fbac-8c0d-4228-8faa-abbeedabf7db" volumeName="kubernetes.io/secret/0dd0fbac-8c0d-4228-8faa-abbeedabf7db-webhook-certs" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.390139 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141" volumeName="kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.390148 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7fcc6409-8a0f-44c3-89e7-5aecd7610f8a" volumeName="kubernetes.io/secret/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-profile-collector-cert" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.390157 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b638b8f4bb0070e40528db779baf6a2" volumeName="kubernetes.io/empty-dir/0b638b8f4bb0070e40528db779baf6a2-tmp" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.390166 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="18f80adb-c1c3-49ba-8ee4-932c851d3897" volumeName="kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-default-certificate" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.390175 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="567683bd-0efc-4f21-b076-e28559628404" volumeName="kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-config" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.390185 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-login" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.390195 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="af33e427-6803-48c2-a76a-dd9deb7cbf9a" volumeName="kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovnkube-script-lib" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.390203 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c5f2bfad-70f6-4185-a3d9-81ce12720767" volumeName="kubernetes.io/secret/c5f2bfad-70f6-4185-a3d9-81ce12720767-serving-cert" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.390212 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" volumeName="kubernetes.io/secret/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-serving-cert" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.390221 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a555ff2e-0be6-46d5-897d-863bb92ae2b3" volumeName="kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-config" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.390230 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" volumeName="kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-trusted-ca-bundle" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.390238 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7afa918d-be67-40a6-803c-d3b0ae99d815" volumeName="kubernetes.io/projected/7afa918d-be67-40a6-803c-d3b0ae99d815-kube-api-access" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.390247 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9e9b5059-1b3e-4067-a63d-2952cbe863af" volumeName="kubernetes.io/secret/9e9b5059-1b3e-4067-a63d-2952cbe863af-installation-pull-secrets" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.390256 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="af41de71-79cf-4590-bbe9-9e8b848862cb" volumeName="kubernetes.io/projected/af41de71-79cf-4590-bbe9-9e8b848862cb-kube-api-access-d7cps" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.390266 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a" volumeName="kubernetes.io/secret/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-marketplace-operator-metrics" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.390274 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d19cb085-0c5b-4810-b654-ce7923221d90" volumeName="kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-etcd-serving-ca" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.390283 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fc8db2c7-859d-47b3-a900-2bd0c0b2973b" volumeName="kubernetes.io/secret/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-machine-approver-tls" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.390292 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="301e1965-1754-483d-b6cc-bfae7038bbca" volumeName="kubernetes.io/projected/301e1965-1754-483d-b6cc-bfae7038bbca-kube-api-access-7jjkz" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.390301 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="593a3561-7760-45c5-8f91-5aaef7475d0f" volumeName="kubernetes.io/projected/593a3561-7760-45c5-8f91-5aaef7475d0f-kube-api-access-sbc2l" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.390310 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-session" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.390319 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7df94c10-441d-4386-93a6-6730fb7bcde0" volumeName="kubernetes.io/configmap/7df94c10-441d-4386-93a6-6730fb7bcde0-env-overrides" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.390327 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="81e39f7b-62e4-4fc9-992a-6535ce127a02" volumeName="kubernetes.io/projected/81e39f7b-62e4-4fc9-992a-6535ce127a02-kube-api-access-pllx6" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.390337 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9e9b5059-1b3e-4067-a63d-2952cbe863af" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394201 5184 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b1264ac67579ad07e7e9003054d44fe40dd55285a4b2f7dc74e48be1aee0868a/globalmount"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394236 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a208c9c2-333b-4b4a-be0d-bc32ec38a821" volumeName="kubernetes.io/projected/a208c9c2-333b-4b4a-be0d-bc32ec38a821-kube-api-access-26xrl" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394252 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="af33e427-6803-48c2-a76a-dd9deb7cbf9a" volumeName="kubernetes.io/projected/af33e427-6803-48c2-a76a-dd9deb7cbf9a-kube-api-access-z5rsr" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394264 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="593a3561-7760-45c5-8f91-5aaef7475d0f" volumeName="kubernetes.io/secret/593a3561-7760-45c5-8f91-5aaef7475d0f-node-bootstrap-token" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394276 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" volumeName="kubernetes.io/projected/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-kube-api-access-l9stx" seLinuxMountContext=""
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394291 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the
actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394304 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7afa918d-be67-40a6-803c-d3b0ae99d815" volumeName="kubernetes.io/configmap/7afa918d-be67-40a6-803c-d3b0ae99d815-config" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394317 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9e9b5059-1b3e-4067-a63d-2952cbe863af" volumeName="kubernetes.io/configmap/9e9b5059-1b3e-4067-a63d-2952cbe863af-registry-certificates" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394329 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b4750666-1362-4001-abd0-6f89964cc621" volumeName="kubernetes.io/projected/b4750666-1362-4001-abd0-6f89964cc621-kube-api-access-twvbl" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394344 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d565531a-ff86-4608-9d19-767de01ac31b" volumeName="kubernetes.io/configmap/d565531a-ff86-4608-9d19-767de01ac31b-images" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394357 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d19cb085-0c5b-4810-b654-ce7923221d90" volumeName="kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-image-import-ca" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394392 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="149b3c48-e17c-4a66-a835-d86dabf6ff13" volumeName="kubernetes.io/empty-dir/149b3c48-e17c-4a66-a835-d86dabf6ff13-utilities" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394407 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="567683bd-0efc-4f21-b076-e28559628404" volumeName="kubernetes.io/projected/567683bd-0efc-4f21-b076-e28559628404-kube-api-access-m26jq" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394419 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="81e39f7b-62e4-4fc9-992a-6535ce127a02" volumeName="kubernetes.io/configmap/81e39f7b-62e4-4fc9-992a-6535ce127a02-cni-binary-copy" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394433 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a7a88189-c967-4640-879e-27665747f20c" volumeName="kubernetes.io/secret/a7a88189-c967-4640-879e-27665747f20c-webhook-cert" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394446 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" volumeName="kubernetes.io/secret/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-serving-cert" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394458 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4" volumeName="kubernetes.io/projected/f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4-kube-api-access-pgx6b" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394470 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="428b39f5-eb1c-4f65-b7a4-eeb6e84860cc" 
volumeName="kubernetes.io/projected/428b39f5-eb1c-4f65-b7a4-eeb6e84860cc-kube-api-access-dsgwk" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394482 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01080b46-74f1-4191-8755-5152a57b3b25" volumeName="kubernetes.io/secret/01080b46-74f1-4191-8755-5152a57b3b25-serving-cert" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394497 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31fa8943-81cc-4750-a0b7-0fa9ab5af883" volumeName="kubernetes.io/empty-dir/31fa8943-81cc-4750-a0b7-0fa9ab5af883-catalog-content" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394515 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d19cb085-0c5b-4810-b654-ce7923221d90" volumeName="kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-audit" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394527 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d19cb085-0c5b-4810-b654-ce7923221d90" volumeName="kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-serving-cert" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394538 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e1d2a42d-af1d-4054-9618-ab545e0ed8b7" volumeName="kubernetes.io/secret/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-proxy-tls" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394551 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a555ff2e-0be6-46d5-897d-863bb92ae2b3" volumeName="kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-client-ca" seLinuxMountContext="" Mar 
12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394568 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f65c0ac1-8bca-454d-a2e6-e35cb418beac" volumeName="kubernetes.io/projected/f65c0ac1-8bca-454d-a2e6-e35cb418beac-kube-api-access" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394581 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fc4541ce-7789-4670-bc75-5c2868e52ce0" volumeName="kubernetes.io/secret/fc4541ce-7789-4670-bc75-5c2868e52ce0-webhook-cert" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394595 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2325ffef-9d5b-447f-b00e-3efc429acefe" volumeName="kubernetes.io/configmap/2325ffef-9d5b-447f-b00e-3efc429acefe-trusted-ca" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394607 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a" volumeName="kubernetes.io/projected/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-kube-api-access-ks6v2" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394619 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="92dfbade-90b6-4169-8c07-72cff7f2c82b" volumeName="kubernetes.io/configmap/92dfbade-90b6-4169-8c07-72cff7f2c82b-config-volume" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394631 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="92dfbade-90b6-4169-8c07-72cff7f2c82b" volumeName="kubernetes.io/empty-dir/92dfbade-90b6-4169-8c07-72cff7f2c82b-tmp-dir" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394646 5184 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="d7e8f42f-dc0e-424b-bb56-5ec849834888" volumeName="kubernetes.io/configmap/d7e8f42f-dc0e-424b-bb56-5ec849834888-service-ca" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394677 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f559dfa3-3917-43a2-97f6-61ddfda10e93" volumeName="kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-encryption-config" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394691 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d45be74c-0d98-4d18-90e4-f7ef1b6daaf7" volumeName="kubernetes.io/configmap/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-config" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394704 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4" volumeName="kubernetes.io/secret/f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4-metrics-certs" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394718 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="567683bd-0efc-4f21-b076-e28559628404" volumeName="kubernetes.io/secret/567683bd-0efc-4f21-b076-e28559628404-etcd-client" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394730 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="584e1f4a-8205-47d7-8efb-3afc6017c4c9" volumeName="kubernetes.io/empty-dir/584e1f4a-8205-47d7-8efb-3afc6017c4c9-catalog-content" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394742 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" 
volumeName="kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-config" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394754 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7599e0b6-bddf-4def-b7f2-0b32206e8651" volumeName="kubernetes.io/secret/7599e0b6-bddf-4def-b7f2-0b32206e8651-serving-cert" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394766 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff" volumeName="kubernetes.io/empty-dir/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-utilities" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394777 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a555ff2e-0be6-46d5-897d-863bb92ae2b3" volumeName="kubernetes.io/empty-dir/a555ff2e-0be6-46d5-897d-863bb92ae2b3-tmp" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394790 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c491984c-7d4b-44aa-8c1e-d7974424fa47" volumeName="kubernetes.io/configmap/c491984c-7d4b-44aa-8c1e-d7974424fa47-images" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394805 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a555ff2e-0be6-46d5-897d-863bb92ae2b3" volumeName="kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-proxy-ca-bundles" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394819 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a555ff2e-0be6-46d5-897d-863bb92ae2b3" volumeName="kubernetes.io/secret/a555ff2e-0be6-46d5-897d-863bb92ae2b3-serving-cert" seLinuxMountContext="" Mar 12 
16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394832 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="567683bd-0efc-4f21-b076-e28559628404" volumeName="kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-etcd-service-ca" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394844 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="736c54fe-349c-4bb9-870a-d1c1d1c03831" volumeName="kubernetes.io/projected/736c54fe-349c-4bb9-870a-d1c1d1c03831-kube-api-access-6dmhf" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394857 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff" volumeName="kubernetes.io/empty-dir/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-catalog-content" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394870 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="94a6e063-3d1a-4d44-875d-185291448c31" volumeName="kubernetes.io/empty-dir/94a6e063-3d1a-4d44-875d-185291448c31-utilities" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394884 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9f0bc7fcb0822a2c13eb2d22cd8c0641" volumeName="kubernetes.io/empty-dir/9f0bc7fcb0822a2c13eb2d22cd8c0641-tmp-dir" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394896 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c491984c-7d4b-44aa-8c1e-d7974424fa47" volumeName="kubernetes.io/configmap/c491984c-7d4b-44aa-8c1e-d7974424fa47-config" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394911 5184 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="c5f2bfad-70f6-4185-a3d9-81ce12720767" volumeName="kubernetes.io/projected/c5f2bfad-70f6-4185-a3d9-81ce12720767-kube-api-access" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394925 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="17b87002-b798-480a-8e17-83053d698239" volumeName="kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394939 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" volumeName="kubernetes.io/empty-dir/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-tmp" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394953 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" volumeName="kubernetes.io/projected/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-bound-sa-token" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.394993 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9f0bc7fcb0822a2c13eb2d22cd8c0641" volumeName="kubernetes.io/empty-dir/9f0bc7fcb0822a2c13eb2d22cd8c0641-ca-trust-dir" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.395007 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f559dfa3-3917-43a2-97f6-61ddfda10e93" volumeName="kubernetes.io/projected/f559dfa3-3917-43a2-97f6-61ddfda10e93-kube-api-access-hm9x7" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.395020 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="567683bd-0efc-4f21-b076-e28559628404" 
volumeName="kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-etcd-ca" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.395033 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7df94c10-441d-4386-93a6-6730fb7bcde0" volumeName="kubernetes.io/projected/7df94c10-441d-4386-93a6-6730fb7bcde0-kube-api-access-nmmzf" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.395046 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="869851b9-7ffb-4af0-b166-1d8aa40a5f80" volumeName="kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-cni-binary-copy" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.395058 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="94a6e063-3d1a-4d44-875d-185291448c31" volumeName="kubernetes.io/empty-dir/94a6e063-3d1a-4d44-875d-185291448c31-catalog-content" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.395072 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9f71a554-e414-4bc3-96d2-674060397afe" volumeName="kubernetes.io/projected/9f71a554-e414-4bc3-96d2-674060397afe-bound-sa-token" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.395083 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9f71a554-e414-4bc3-96d2-674060397afe" volumeName="kubernetes.io/secret/9f71a554-e414-4bc3-96d2-674060397afe-metrics-tls" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.395097 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a7a88189-c967-4640-879e-27665747f20c" volumeName="kubernetes.io/projected/a7a88189-c967-4640-879e-27665747f20c-kube-api-access-8nspp" 
seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.395110 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="567683bd-0efc-4f21-b076-e28559628404" volumeName="kubernetes.io/secret/567683bd-0efc-4f21-b076-e28559628404-serving-cert" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.395125 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9e9b5059-1b3e-4067-a63d-2952cbe863af" volumeName="kubernetes.io/empty-dir/9e9b5059-1b3e-4067-a63d-2952cbe863af-ca-trust-extracted" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.395137 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d19cb085-0c5b-4810-b654-ce7923221d90" volumeName="kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-config" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.395149 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09cfa50b-4138-4585-a53e-64dd3ab73335" volumeName="kubernetes.io/secret/09cfa50b-4138-4585-a53e-64dd3ab73335-serving-cert" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.395160 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5ebfebf6-3ecd-458e-943f-bb25b52e2718" volumeName="kubernetes.io/projected/5ebfebf6-3ecd-458e-943f-bb25b52e2718-kube-api-access-l87hs" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.395172 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a7a88189-c967-4640-879e-27665747f20c" volumeName="kubernetes.io/secret/a7a88189-c967-4640-879e-27665747f20c-apiservice-cert" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.395185 5184 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ce090a97-9ab6-4c40-a719-64ff2acd9778" volumeName="kubernetes.io/configmap/ce090a97-9ab6-4c40-a719-64ff2acd9778-signing-cabundle" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.395197 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2325ffef-9d5b-447f-b00e-3efc429acefe" volumeName="kubernetes.io/configmap/2325ffef-9d5b-447f-b00e-3efc429acefe-config" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.395207 5184 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9e9b5059-1b3e-4067-a63d-2952cbe863af" volumeName="kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-bound-sa-token" seLinuxMountContext="" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.395218 5184 reconstruct.go:97] "Volume reconstruction finished" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.395226 5184 reconciler.go:26] "Reconciler: start to sync state" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.396421 5184 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.398423 5184 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.398463 5184 status_manager.go:230] "Starting to sync pod status with apiserver" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.398485 5184 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.398496 5184 kubelet.go:2451] "Starting kubelet main sync loop"
Mar 12 16:50:58 crc kubenswrapper[5184]: E0312 16:50:58.398641 5184 kubelet.go:2475] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 12 16:50:58 crc kubenswrapper[5184]: E0312 16:50:58.400674 5184 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Mar 12 16:50:58 crc kubenswrapper[5184]: E0312 16:50:58.406509 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.433770 5184 manager.go:341] "Starting Device Plugin manager"
Mar 12 16:50:58 crc kubenswrapper[5184]: E0312 16:50:58.433833 5184 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.433849 5184 server.go:85] "Starting device plugin registration server"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.434354 5184 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.434424 5184 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.434598 5184 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.434677 5184 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.434689 5184 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 12 16:50:58 crc kubenswrapper[5184]: E0312 16:50:58.439755 5184 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Mar 12 16:50:58 crc kubenswrapper[5184]: E0312 16:50:58.439843 5184 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.499215 5184 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"]
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.499453 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.500346 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.500429 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.500447 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.501580 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.501660 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.501730 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.502230 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.502256 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.502268 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.503061 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.503094 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.503106 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.503783 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.503989 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.504011 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.504894 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.504942 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.504955 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.505849 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.505891 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.505902 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.507902 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.508225 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.508253 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.508769 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.508790 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.508802 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.509589 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.509845 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.509881 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.510176 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.510194 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.510205 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.510884 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.510911 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.511207 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.511231 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.511242 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:50:58 crc kubenswrapper[5184]: E0312 16:50:58.511239 5184 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="400ms"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.511345 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.511366 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.511874 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.512216 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.512236 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312
16:50:58.512245 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:50:58 crc kubenswrapper[5184]: E0312 16:50:58.524258 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 12 16:50:58 crc kubenswrapper[5184]: E0312 16:50:58.529764 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.536672 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.537535 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.537580 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.537595 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.537626 5184 kubelet_node_status.go:78] "Attempting to register node" node="crc" Mar 12 16:50:58 crc kubenswrapper[5184]: E0312 16:50:58.538209 5184 kubelet_node_status.go:110] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.223:6443: connect: connection refused" node="crc" Mar 12 16:50:58 crc kubenswrapper[5184]: E0312 16:50:58.545307 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 12 16:50:58 crc kubenswrapper[5184]: E0312 16:50:58.563593 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" 
err="node \"crc\" not found" node="crc" Mar 12 16:50:58 crc kubenswrapper[5184]: E0312 16:50:58.569411 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.597978 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-run-kubernetes\" (UniqueName: \"kubernetes.io/empty-dir/9f0bc7fcb0822a2c13eb2d22cd8c0641-var-run-kubernetes\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.598039 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-resource-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.598060 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-data-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.598075 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-usr-local-bin\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.598108 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-log-dir\") pod 
\"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.598129 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-auto-backup-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-etcd-auto-backup-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.598151 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/20c5c5b4bed930554494851fe3cb2b2a-tmp-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.598191 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-dir\" (UniqueName: \"kubernetes.io/empty-dir/9f0bc7fcb0822a2c13eb2d22cd8c0641-ca-trust-dir\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.598215 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4e08c320b1e9e2405e6e0107bdf7eeb4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"4e08c320b1e9e2405e6e0107bdf7eeb4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.598230 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/9f0bc7fcb0822a2c13eb2d22cd8c0641-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.598261 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/0b638b8f4bb0070e40528db779baf6a2-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"0b638b8f4bb0070e40528db779baf6a2\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.598277 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4e08c320b1e9e2405e6e0107bdf7eeb4-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"4e08c320b1e9e2405e6e0107bdf7eeb4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.598292 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-static-pod-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.598306 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.598336 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0b638b8f4bb0070e40528db779baf6a2-tmp\") pod \"openshift-kube-scheduler-crc\" (UID: \"0b638b8f4bb0070e40528db779baf6a2\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.598355 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/0b638b8f4bb0070e40528db779baf6a2-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"0b638b8f4bb0070e40528db779baf6a2\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.598390 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-cert-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.598407 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.598424 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.598439 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-tmp-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.598475 5184 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"ca-bundle-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-ca-bundle-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.598490 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/9f0bc7fcb0822a2c13eb2d22cd8c0641-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.598508 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9f0bc7fcb0822a2c13eb2d22cd8c0641-tmp-dir\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.599313 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-run-kubernetes\" (UniqueName: \"kubernetes.io/empty-dir/9f0bc7fcb0822a2c13eb2d22cd8c0641-var-run-kubernetes\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.599704 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-tmp-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.599734 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-bundle-dir\" 
(UniqueName: \"kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-ca-bundle-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.599947 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-dir\" (UniqueName: \"kubernetes.io/empty-dir/9f0bc7fcb0822a2c13eb2d22cd8c0641-ca-trust-dir\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.600191 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0b638b8f4bb0070e40528db779baf6a2-tmp\") pod \"openshift-kube-scheduler-crc\" (UID: \"0b638b8f4bb0070e40528db779baf6a2\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.600262 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9f0bc7fcb0822a2c13eb2d22cd8c0641-tmp-dir\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.600676 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/20c5c5b4bed930554494851fe3cb2b2a-tmp-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.700664 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-resource-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " 
pod="openshift-etcd/etcd-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.700727 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-data-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.700755 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-usr-local-bin\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.700817 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-log-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.700850 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etcd-auto-backup-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-etcd-auto-backup-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.700885 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4e08c320b1e9e2405e6e0107bdf7eeb4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"4e08c320b1e9e2405e6e0107bdf7eeb4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.700912 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/9f0bc7fcb0822a2c13eb2d22cd8c0641-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.700940 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/0b638b8f4bb0070e40528db779baf6a2-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"0b638b8f4bb0070e40528db779baf6a2\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.700940 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-resource-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.701025 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4e08c320b1e9e2405e6e0107bdf7eeb4-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"4e08c320b1e9e2405e6e0107bdf7eeb4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.701075 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-usr-local-bin\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.701134 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etcd-auto-backup-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-etcd-auto-backup-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " 
pod="openshift-etcd/etcd-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.701153 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-log-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.701182 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4e08c320b1e9e2405e6e0107bdf7eeb4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"4e08c320b1e9e2405e6e0107bdf7eeb4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.701190 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-data-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.700967 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4e08c320b1e9e2405e6e0107bdf7eeb4-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"4e08c320b1e9e2405e6e0107bdf7eeb4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.701234 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/9f0bc7fcb0822a2c13eb2d22cd8c0641-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.701242 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-static-pod-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.701276 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.701297 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/0b638b8f4bb0070e40528db779baf6a2-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"0b638b8f4bb0070e40528db779baf6a2\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.701306 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/0b638b8f4bb0070e40528db779baf6a2-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"0b638b8f4bb0070e40528db779baf6a2\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.701338 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-cert-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.701365 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.701455 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/0b638b8f4bb0070e40528db779baf6a2-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"0b638b8f4bb0070e40528db779baf6a2\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.701437 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.701510 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/9f0bc7fcb0822a2c13eb2d22cd8c0641-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.701509 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.701564 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.701608 5184 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-cert-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.701339 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-static-pod-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.701644 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.701687 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/9f0bc7fcb0822a2c13eb2d22cd8c0641-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.738562 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.739766 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.739846 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.739873 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.739917 5184 kubelet_node_status.go:78] "Attempting to register node" node="crc" Mar 12 16:50:58 crc kubenswrapper[5184]: E0312 16:50:58.740487 5184 kubelet_node_status.go:110] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.223:6443: connect: connection refused" node="crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.825946 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.831704 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.846314 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.864906 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.870203 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.875846 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f0bc7fcb0822a2c13eb2d22cd8c0641.slice/crio-b909f0c014fbfff210160b2ce673236c81a4f06ab5115632d8312354d9e8b7e1 WatchSource:0}: Error finding container b909f0c014fbfff210160b2ce673236c81a4f06ab5115632d8312354d9e8b7e1: Status 404 returned error can't find the container with id b909f0c014fbfff210160b2ce673236c81a4f06ab5115632d8312354d9e8b7e1 Mar 12 16:50:58 crc kubenswrapper[5184]: I0312 16:50:58.881963 5184 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.882488 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b638b8f4bb0070e40528db779baf6a2.slice/crio-8858d40584bc49af9c2e9160eda742c1f8d6a9031193c85f98fc6a9427662059 WatchSource:0}: Error finding container 8858d40584bc49af9c2e9160eda742c1f8d6a9031193c85f98fc6a9427662059: Status 404 returned error can't find the container with id 8858d40584bc49af9c2e9160eda742c1f8d6a9031193c85f98fc6a9427662059 Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.893430 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20c5c5b4bed930554494851fe3cb2b2a.slice/crio-2c24c34fc3d0eedfe2e7000b3fcf402c0e414a0e18ee9ba9079dfae1879c243d WatchSource:0}: Error finding container 2c24c34fc3d0eedfe2e7000b3fcf402c0e414a0e18ee9ba9079dfae1879c243d: Status 404 returned error can't find the container with id 2c24c34fc3d0eedfe2e7000b3fcf402c0e414a0e18ee9ba9079dfae1879c243d Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.900145 5184 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a14caf222afb62aaabdc47808b6f944.slice/crio-9d1367b28c2fdfacdc8b9cc7f6614225491c1c8466dcaea4c2fb072427887f66 WatchSource:0}: Error finding container 9d1367b28c2fdfacdc8b9cc7f6614225491c1c8466dcaea4c2fb072427887f66: Status 404 returned error can't find the container with id 9d1367b28c2fdfacdc8b9cc7f6614225491c1c8466dcaea4c2fb072427887f66 Mar 12 16:50:58 crc kubenswrapper[5184]: W0312 16:50:58.901557 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e08c320b1e9e2405e6e0107bdf7eeb4.slice/crio-e88acc26a97bfdb8f398c5dfd3d24eac936afa9aad32804b4612e69369b7abb7 WatchSource:0}: Error finding container e88acc26a97bfdb8f398c5dfd3d24eac936afa9aad32804b4612e69369b7abb7: Status 404 returned error can't find the container with id e88acc26a97bfdb8f398c5dfd3d24eac936afa9aad32804b4612e69369b7abb7 Mar 12 16:50:58 crc kubenswrapper[5184]: E0312 16:50:58.912262 5184 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="800ms" Mar 12 16:50:59 crc kubenswrapper[5184]: E0312 16:50:59.086597 5184 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Mar 12 16:50:59 crc kubenswrapper[5184]: I0312 16:50:59.141225 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 12 16:50:59 crc kubenswrapper[5184]: I0312 16:50:59.142715 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 12 16:50:59 crc kubenswrapper[5184]: I0312 16:50:59.142758 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:50:59 crc kubenswrapper[5184]: I0312 16:50:59.142768 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:50:59 crc kubenswrapper[5184]: I0312 16:50:59.142793 5184 kubelet_node_status.go:78] "Attempting to register node" node="crc" Mar 12 16:50:59 crc kubenswrapper[5184]: E0312 16:50:59.143256 5184 kubelet_node_status.go:110] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.223:6443: connect: connection refused" node="crc" Mar 12 16:50:59 crc kubenswrapper[5184]: I0312 16:50:59.277788 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Mar 12 16:50:59 crc kubenswrapper[5184]: E0312 16:50:59.372477 5184 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Mar 12 16:50:59 crc kubenswrapper[5184]: I0312 16:50:59.401990 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"4e08c320b1e9e2405e6e0107bdf7eeb4","Type":"ContainerStarted","Data":"e88acc26a97bfdb8f398c5dfd3d24eac936afa9aad32804b4612e69369b7abb7"} Mar 12 16:50:59 crc kubenswrapper[5184]: I0312 16:50:59.402759 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerStarted","Data":"9d1367b28c2fdfacdc8b9cc7f6614225491c1c8466dcaea4c2fb072427887f66"} Mar 12 16:50:59 crc kubenswrapper[5184]: I0312 16:50:59.403736 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"20c5c5b4bed930554494851fe3cb2b2a","Type":"ContainerStarted","Data":"2c24c34fc3d0eedfe2e7000b3fcf402c0e414a0e18ee9ba9079dfae1879c243d"} Mar 12 16:50:59 crc kubenswrapper[5184]: I0312 16:50:59.404365 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"0b638b8f4bb0070e40528db779baf6a2","Type":"ContainerStarted","Data":"8858d40584bc49af9c2e9160eda742c1f8d6a9031193c85f98fc6a9427662059"} Mar 12 16:50:59 crc kubenswrapper[5184]: I0312 16:50:59.405191 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"9f0bc7fcb0822a2c13eb2d22cd8c0641","Type":"ContainerStarted","Data":"b909f0c014fbfff210160b2ce673236c81a4f06ab5115632d8312354d9e8b7e1"} Mar 12 16:50:59 crc kubenswrapper[5184]: E0312 16:50:59.663948 5184 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Mar 12 16:50:59 crc kubenswrapper[5184]: E0312 16:50:59.713059 5184 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="1.6s" Mar 12 16:50:59 crc kubenswrapper[5184]: E0312 16:50:59.854695 5184 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get 
\"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Mar 12 16:50:59 crc kubenswrapper[5184]: I0312 16:50:59.943629 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 12 16:50:59 crc kubenswrapper[5184]: I0312 16:50:59.945197 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:50:59 crc kubenswrapper[5184]: I0312 16:50:59.945237 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:50:59 crc kubenswrapper[5184]: I0312 16:50:59.945253 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:50:59 crc kubenswrapper[5184]: I0312 16:50:59.945281 5184 kubelet_node_status.go:78] "Attempting to register node" node="crc" Mar 12 16:50:59 crc kubenswrapper[5184]: E0312 16:50:59.945864 5184 kubelet_node_status.go:110] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.223:6443: connect: connection refused" node="crc" Mar 12 16:51:00 crc kubenswrapper[5184]: I0312 16:51:00.192957 5184 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Mar 12 16:51:00 crc kubenswrapper[5184]: E0312 16:51:00.194249 5184 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Mar 12 16:51:00 crc 
kubenswrapper[5184]: I0312 16:51:00.277589 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Mar 12 16:51:00 crc kubenswrapper[5184]: I0312 16:51:00.409718 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"9f0bc7fcb0822a2c13eb2d22cd8c0641","Type":"ContainerStarted","Data":"d25459743a8e939a6fc0c89681dd4be8f2dbe697494adb6502228b2569ba616f"} Mar 12 16:51:00 crc kubenswrapper[5184]: I0312 16:51:00.409794 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"9f0bc7fcb0822a2c13eb2d22cd8c0641","Type":"ContainerStarted","Data":"d14cb7a12881751803c43b69b2ec33ce99548de0cb9d754e7de2f8fe301dabb1"} Mar 12 16:51:00 crc kubenswrapper[5184]: I0312 16:51:00.411039 5184 generic.go:358] "Generic (PLEG): container finished" podID="4e08c320b1e9e2405e6e0107bdf7eeb4" containerID="19f28bf51e41bc5de4afc2b3209eb8a889c06546b1d5a2e0ceaed8c52ee8867a" exitCode=0 Mar 12 16:51:00 crc kubenswrapper[5184]: I0312 16:51:00.411110 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"4e08c320b1e9e2405e6e0107bdf7eeb4","Type":"ContainerDied","Data":"19f28bf51e41bc5de4afc2b3209eb8a889c06546b1d5a2e0ceaed8c52ee8867a"} Mar 12 16:51:00 crc kubenswrapper[5184]: I0312 16:51:00.411184 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 12 16:51:00 crc kubenswrapper[5184]: I0312 16:51:00.412179 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:51:00 crc kubenswrapper[5184]: I0312 16:51:00.412205 5184 kubelet_node_status.go:736] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:51:00 crc kubenswrapper[5184]: I0312 16:51:00.412216 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:51:00 crc kubenswrapper[5184]: E0312 16:51:00.412399 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 12 16:51:00 crc kubenswrapper[5184]: I0312 16:51:00.412885 5184 generic.go:358] "Generic (PLEG): container finished" podID="3a14caf222afb62aaabdc47808b6f944" containerID="85ac5f92560a2e60d997c4973bd2fd54060e553b853a3288c29cc31c11cad328" exitCode=0 Mar 12 16:51:00 crc kubenswrapper[5184]: I0312 16:51:00.412968 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 12 16:51:00 crc kubenswrapper[5184]: I0312 16:51:00.412947 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerDied","Data":"85ac5f92560a2e60d997c4973bd2fd54060e553b853a3288c29cc31c11cad328"} Mar 12 16:51:00 crc kubenswrapper[5184]: I0312 16:51:00.413679 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:51:00 crc kubenswrapper[5184]: I0312 16:51:00.413709 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:51:00 crc kubenswrapper[5184]: I0312 16:51:00.413721 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:51:00 crc kubenswrapper[5184]: E0312 16:51:00.413882 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 12 16:51:00 crc kubenswrapper[5184]: I0312 16:51:00.414905 5184 kubelet_node_status.go:413] "Setting node annotation to 
enable volume controller attach/detach" Mar 12 16:51:00 crc kubenswrapper[5184]: I0312 16:51:00.415080 5184 generic.go:358] "Generic (PLEG): container finished" podID="20c5c5b4bed930554494851fe3cb2b2a" containerID="372fea65170de54e0c9231f60e0ff1ead89a30cc6fc9d1ab1eec694591f285d0" exitCode=0 Mar 12 16:51:00 crc kubenswrapper[5184]: I0312 16:51:00.415161 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"20c5c5b4bed930554494851fe3cb2b2a","Type":"ContainerDied","Data":"372fea65170de54e0c9231f60e0ff1ead89a30cc6fc9d1ab1eec694591f285d0"} Mar 12 16:51:00 crc kubenswrapper[5184]: I0312 16:51:00.415217 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 12 16:51:00 crc kubenswrapper[5184]: I0312 16:51:00.415326 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:51:00 crc kubenswrapper[5184]: I0312 16:51:00.415593 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:51:00 crc kubenswrapper[5184]: I0312 16:51:00.415605 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:51:00 crc kubenswrapper[5184]: I0312 16:51:00.416423 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:51:00 crc kubenswrapper[5184]: I0312 16:51:00.416456 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:51:00 crc kubenswrapper[5184]: I0312 16:51:00.416469 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:51:00 crc kubenswrapper[5184]: E0312 16:51:00.416630 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 12 
16:51:00 crc kubenswrapper[5184]: I0312 16:51:00.418094 5184 generic.go:358] "Generic (PLEG): container finished" podID="0b638b8f4bb0070e40528db779baf6a2" containerID="ac72b67adf21d403862dd2fc6e1a23c70ce40f83d6700823e7517d3ac39a3313" exitCode=0 Mar 12 16:51:00 crc kubenswrapper[5184]: I0312 16:51:00.418139 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"0b638b8f4bb0070e40528db779baf6a2","Type":"ContainerDied","Data":"ac72b67adf21d403862dd2fc6e1a23c70ce40f83d6700823e7517d3ac39a3313"} Mar 12 16:51:00 crc kubenswrapper[5184]: I0312 16:51:00.418217 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 12 16:51:00 crc kubenswrapper[5184]: I0312 16:51:00.419081 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:51:00 crc kubenswrapper[5184]: I0312 16:51:00.419104 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:51:00 crc kubenswrapper[5184]: I0312 16:51:00.419113 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:51:00 crc kubenswrapper[5184]: E0312 16:51:00.419243 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 12 16:51:01 crc kubenswrapper[5184]: I0312 16:51:01.277505 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Mar 12 16:51:01 crc kubenswrapper[5184]: E0312 16:51:01.313863 5184 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="3.2s" Mar 12 16:51:01 crc kubenswrapper[5184]: I0312 16:51:01.423210 5184 generic.go:358] "Generic (PLEG): container finished" podID="20c5c5b4bed930554494851fe3cb2b2a" containerID="b51a266617c16a330f9314d4f763ccae3a4c157aecabbeec95199db504e6d95e" exitCode=0 Mar 12 16:51:01 crc kubenswrapper[5184]: I0312 16:51:01.423282 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"20c5c5b4bed930554494851fe3cb2b2a","Type":"ContainerDied","Data":"b51a266617c16a330f9314d4f763ccae3a4c157aecabbeec95199db504e6d95e"} Mar 12 16:51:01 crc kubenswrapper[5184]: I0312 16:51:01.423485 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 12 16:51:01 crc kubenswrapper[5184]: I0312 16:51:01.424364 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:51:01 crc kubenswrapper[5184]: I0312 16:51:01.424404 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:51:01 crc kubenswrapper[5184]: I0312 16:51:01.424415 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:51:01 crc kubenswrapper[5184]: E0312 16:51:01.424611 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 12 16:51:01 crc kubenswrapper[5184]: I0312 16:51:01.427549 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"0b638b8f4bb0070e40528db779baf6a2","Type":"ContainerStarted","Data":"2bfb723f8c449cda9730d31e02d633c5bc26368677283970a7d7977e8b14823c"} Mar 12 16:51:01 crc kubenswrapper[5184]: I0312 
16:51:01.427585 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"0b638b8f4bb0070e40528db779baf6a2","Type":"ContainerStarted","Data":"ca237a33c864a01251750a3a9498ffbf76c02d7c465fec64514b916084eb3a4e"} Mar 12 16:51:01 crc kubenswrapper[5184]: I0312 16:51:01.427599 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"0b638b8f4bb0070e40528db779baf6a2","Type":"ContainerStarted","Data":"cb2f9a89582795adf9b1f2e114f29fbcca43ccc0ac07b56136f3c09a99af6c2e"} Mar 12 16:51:01 crc kubenswrapper[5184]: I0312 16:51:01.427616 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 12 16:51:01 crc kubenswrapper[5184]: I0312 16:51:01.428131 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:51:01 crc kubenswrapper[5184]: I0312 16:51:01.428162 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:51:01 crc kubenswrapper[5184]: I0312 16:51:01.428173 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:51:01 crc kubenswrapper[5184]: E0312 16:51:01.428339 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 12 16:51:01 crc kubenswrapper[5184]: I0312 16:51:01.430640 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"9f0bc7fcb0822a2c13eb2d22cd8c0641","Type":"ContainerStarted","Data":"1b42bc08c18390a5c946037634004c7dcb6cb14b92f56220260d8237aeedd629"} Mar 12 16:51:01 crc kubenswrapper[5184]: I0312 16:51:01.430673 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"9f0bc7fcb0822a2c13eb2d22cd8c0641","Type":"ContainerStarted","Data":"a1d4060b0813ec05d1dff25751605cd6ce575df8a1a0788b3331780009447967"} Mar 12 16:51:01 crc kubenswrapper[5184]: I0312 16:51:01.430821 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 12 16:51:01 crc kubenswrapper[5184]: I0312 16:51:01.432689 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:51:01 crc kubenswrapper[5184]: I0312 16:51:01.432719 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:51:01 crc kubenswrapper[5184]: I0312 16:51:01.432730 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:51:01 crc kubenswrapper[5184]: E0312 16:51:01.432961 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 12 16:51:01 crc kubenswrapper[5184]: I0312 16:51:01.435157 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"4e08c320b1e9e2405e6e0107bdf7eeb4","Type":"ContainerStarted","Data":"a528e6a13bd818b7d3bb0ef864934913eb0b6b9e7573f8f7840799a03c87c0b4"} Mar 12 16:51:01 crc kubenswrapper[5184]: I0312 16:51:01.435532 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 12 16:51:01 crc kubenswrapper[5184]: I0312 16:51:01.436041 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:51:01 crc kubenswrapper[5184]: I0312 16:51:01.436074 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:51:01 crc kubenswrapper[5184]: I0312 
16:51:01.436105 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:51:01 crc kubenswrapper[5184]: E0312 16:51:01.436286 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 12 16:51:01 crc kubenswrapper[5184]: E0312 16:51:01.437450 5184 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Mar 12 16:51:01 crc kubenswrapper[5184]: I0312 16:51:01.439701 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerStarted","Data":"965856e85472800697a7882409776407f3dcefaafd9ffc6d31ca6d51466d15f5"} Mar 12 16:51:01 crc kubenswrapper[5184]: I0312 16:51:01.439807 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerStarted","Data":"ba6beb26fb249f80a1c0e7a6faa3577c82429ab6acd0c17cd141a795b06adba4"} Mar 12 16:51:01 crc kubenswrapper[5184]: I0312 16:51:01.439897 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerStarted","Data":"c3a3261bd304f727996d3de0ec8e9372c0f24ee323171fc078f86a529dc3ae51"} Mar 12 16:51:01 crc kubenswrapper[5184]: I0312 16:51:01.439956 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerStarted","Data":"408cf1afe10e6c8bd0bdbe1cc632606b92ab152449ba7113c76692e36ac3f8e5"} Mar 12 16:51:01 crc kubenswrapper[5184]: I0312 16:51:01.546491 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 12 16:51:01 crc kubenswrapper[5184]: I0312 16:51:01.548879 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:51:01 crc kubenswrapper[5184]: I0312 16:51:01.550240 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:51:01 crc kubenswrapper[5184]: I0312 16:51:01.550282 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:51:01 crc kubenswrapper[5184]: I0312 16:51:01.550311 5184 kubelet_node_status.go:78] "Attempting to register node" node="crc" Mar 12 16:51:01 crc kubenswrapper[5184]: E0312 16:51:01.550956 5184 kubelet_node_status.go:110] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.223:6443: connect: connection refused" node="crc" Mar 12 16:51:01 crc kubenswrapper[5184]: I0312 16:51:01.558263 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 12 16:51:02 crc kubenswrapper[5184]: E0312 16:51:02.026368 5184 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Mar 12 16:51:02 crc kubenswrapper[5184]: E0312 16:51:02.251235 5184 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get 
\"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Mar 12 16:51:02 crc kubenswrapper[5184]: I0312 16:51:02.277952 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Mar 12 16:51:02 crc kubenswrapper[5184]: I0312 16:51:02.445755 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerStarted","Data":"2b426fdc267ae84b9325d988ded202ce5e05b47fe9f1deb14507508b74d442db"} Mar 12 16:51:02 crc kubenswrapper[5184]: I0312 16:51:02.445865 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 12 16:51:02 crc kubenswrapper[5184]: I0312 16:51:02.446365 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:51:02 crc kubenswrapper[5184]: I0312 16:51:02.446414 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:51:02 crc kubenswrapper[5184]: I0312 16:51:02.446425 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:51:02 crc kubenswrapper[5184]: E0312 16:51:02.446686 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 12 16:51:02 crc kubenswrapper[5184]: I0312 16:51:02.447905 5184 generic.go:358] "Generic (PLEG): container finished" podID="20c5c5b4bed930554494851fe3cb2b2a" 
containerID="b7d9173a97b6d597333bec66e74940ae9e9effd207401a61fc5c529983637156" exitCode=0 Mar 12 16:51:02 crc kubenswrapper[5184]: I0312 16:51:02.448054 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 12 16:51:02 crc kubenswrapper[5184]: I0312 16:51:02.448088 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"20c5c5b4bed930554494851fe3cb2b2a","Type":"ContainerDied","Data":"b7d9173a97b6d597333bec66e74940ae9e9effd207401a61fc5c529983637156"} Mar 12 16:51:02 crc kubenswrapper[5184]: I0312 16:51:02.448101 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 12 16:51:02 crc kubenswrapper[5184]: I0312 16:51:02.448262 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 12 16:51:02 crc kubenswrapper[5184]: I0312 16:51:02.448578 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:51:02 crc kubenswrapper[5184]: I0312 16:51:02.448611 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:51:02 crc kubenswrapper[5184]: I0312 16:51:02.448625 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:51:02 crc kubenswrapper[5184]: I0312 16:51:02.448939 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 12 16:51:02 crc kubenswrapper[5184]: E0312 16:51:02.449028 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 12 16:51:02 crc kubenswrapper[5184]: I0312 16:51:02.449804 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:51:02 crc kubenswrapper[5184]: I0312 16:51:02.449827 
5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:51:02 crc kubenswrapper[5184]: I0312 16:51:02.449839 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:51:02 crc kubenswrapper[5184]: I0312 16:51:02.449841 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:51:02 crc kubenswrapper[5184]: I0312 16:51:02.449877 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:51:02 crc kubenswrapper[5184]: I0312 16:51:02.449894 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:51:02 crc kubenswrapper[5184]: I0312 16:51:02.449811 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:51:02 crc kubenswrapper[5184]: E0312 16:51:02.450038 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 12 16:51:02 crc kubenswrapper[5184]: I0312 16:51:02.450052 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:51:02 crc kubenswrapper[5184]: I0312 16:51:02.450062 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:51:02 crc kubenswrapper[5184]: E0312 16:51:02.450429 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 12 16:51:02 crc kubenswrapper[5184]: E0312 16:51:02.450545 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 12 16:51:02 crc kubenswrapper[5184]: I0312 16:51:02.472549 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 12 16:51:03 crc kubenswrapper[5184]: I0312 16:51:03.456214 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"20c5c5b4bed930554494851fe3cb2b2a","Type":"ContainerStarted","Data":"850d2c82f02982ba13abc9b9365f5be589329d37001cd14054004a85c6d2e96a"}
Mar 12 16:51:03 crc kubenswrapper[5184]: I0312 16:51:03.456667 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"20c5c5b4bed930554494851fe3cb2b2a","Type":"ContainerStarted","Data":"7901646b9904fb6a100644a0aacd978a71373a764eea536a29abd51530037c97"}
Mar 12 16:51:03 crc kubenswrapper[5184]: I0312 16:51:03.456693 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"20c5c5b4bed930554494851fe3cb2b2a","Type":"ContainerStarted","Data":"e7c8113246b3ae53a45615d0812c94a8ef10af18f75a421afdf0afa3ecb09223"}
Mar 12 16:51:03 crc kubenswrapper[5184]: I0312 16:51:03.456535 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 16:51:03 crc kubenswrapper[5184]: I0312 16:51:03.456335 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 16:51:03 crc kubenswrapper[5184]: I0312 16:51:03.457219 5184 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 12 16:51:03 crc kubenswrapper[5184]: I0312 16:51:03.457269 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 16:51:03 crc kubenswrapper[5184]: I0312 16:51:03.457913 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:51:03 crc kubenswrapper[5184]: I0312 16:51:03.457976 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:51:03 crc kubenswrapper[5184]: I0312 16:51:03.457995 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:51:03 crc kubenswrapper[5184]: I0312 16:51:03.457919 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:51:03 crc kubenswrapper[5184]: I0312 16:51:03.458172 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:51:03 crc kubenswrapper[5184]: I0312 16:51:03.458260 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:51:03 crc kubenswrapper[5184]: E0312 16:51:03.458538 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 12 16:51:03 crc kubenswrapper[5184]: E0312 16:51:03.458837 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 12 16:51:03 crc kubenswrapper[5184]: I0312 16:51:03.459203 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:51:03 crc kubenswrapper[5184]: I0312 16:51:03.459276 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:51:03 crc kubenswrapper[5184]: I0312 16:51:03.459298 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:51:03 crc kubenswrapper[5184]: E0312 16:51:03.460293 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 12 16:51:03 crc kubenswrapper[5184]: I0312 16:51:03.465432 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 12 16:51:04 crc kubenswrapper[5184]: I0312 16:51:04.312718 5184 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Mar 12 16:51:04 crc kubenswrapper[5184]: I0312 16:51:04.463766 5184 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 12 16:51:04 crc kubenswrapper[5184]: I0312 16:51:04.463811 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 16:51:04 crc kubenswrapper[5184]: I0312 16:51:04.464103 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 16:51:04 crc kubenswrapper[5184]: I0312 16:51:04.464217 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"20c5c5b4bed930554494851fe3cb2b2a","Type":"ContainerStarted","Data":"7696ac1bb100db5ea88c53ca29f38064d11d2600b968872de07c222ad6411720"}
Mar 12 16:51:04 crc kubenswrapper[5184]: I0312 16:51:04.464244 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"20c5c5b4bed930554494851fe3cb2b2a","Type":"ContainerStarted","Data":"ad74805d602978f7f836e80d83b2ef81f7c4cbbc65155a2268057928abb2f906"}
Mar 12 16:51:04 crc kubenswrapper[5184]: I0312 16:51:04.464629 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:51:04 crc kubenswrapper[5184]: I0312 16:51:04.464649 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:51:04 crc kubenswrapper[5184]: I0312 16:51:04.464657 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:51:04 crc kubenswrapper[5184]: E0312 16:51:04.464910 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 12 16:51:04 crc kubenswrapper[5184]: I0312 16:51:04.465279 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:51:04 crc kubenswrapper[5184]: I0312 16:51:04.465292 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:51:04 crc kubenswrapper[5184]: I0312 16:51:04.465301 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:51:04 crc kubenswrapper[5184]: E0312 16:51:04.465446 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 12 16:51:04 crc kubenswrapper[5184]: I0312 16:51:04.751692 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 16:51:04 crc kubenswrapper[5184]: I0312 16:51:04.753449 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:51:04 crc kubenswrapper[5184]: I0312 16:51:04.753524 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:51:04 crc kubenswrapper[5184]: I0312 16:51:04.753548 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:51:04 crc kubenswrapper[5184]: I0312 16:51:04.753591 5184 kubelet_node_status.go:78] "Attempting to register node" node="crc"
Mar 12 16:51:05 crc kubenswrapper[5184]: I0312 16:51:05.466123 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 16:51:05 crc kubenswrapper[5184]: I0312 16:51:05.466849 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:51:05 crc kubenswrapper[5184]: I0312 16:51:05.466900 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:51:05 crc kubenswrapper[5184]: I0312 16:51:05.466917 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:51:05 crc kubenswrapper[5184]: E0312 16:51:05.467403 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 12 16:51:05 crc kubenswrapper[5184]: I0312 16:51:05.536868 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 12 16:51:05 crc kubenswrapper[5184]: I0312 16:51:05.537212 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 16:51:05 crc kubenswrapper[5184]: I0312 16:51:05.538268 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:51:05 crc kubenswrapper[5184]: I0312 16:51:05.538328 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:51:05 crc kubenswrapper[5184]: I0312 16:51:05.538343 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:51:05 crc kubenswrapper[5184]: E0312 16:51:05.538763 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 12 16:51:05 crc kubenswrapper[5184]: I0312 16:51:05.544890 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 12 16:51:05 crc kubenswrapper[5184]: I0312 16:51:05.784328 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 12 16:51:05 crc kubenswrapper[5184]: I0312 16:51:05.784539 5184 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 12 16:51:05 crc kubenswrapper[5184]: I0312 16:51:05.784575 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 16:51:05 crc kubenswrapper[5184]: I0312 16:51:05.785277 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:51:05 crc kubenswrapper[5184]: I0312 16:51:05.785416 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:51:05 crc kubenswrapper[5184]: I0312 16:51:05.785579 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:51:05 crc kubenswrapper[5184]: E0312 16:51:05.786015 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 12 16:51:05 crc kubenswrapper[5184]: I0312 16:51:05.865782 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 12 16:51:06 crc kubenswrapper[5184]: I0312 16:51:06.007577 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-etcd/etcd-crc"
Mar 12 16:51:06 crc kubenswrapper[5184]: I0312 16:51:06.190781 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 12 16:51:06 crc kubenswrapper[5184]: I0312 16:51:06.365143 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 12 16:51:06 crc kubenswrapper[5184]: I0312 16:51:06.469157 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 16:51:06 crc kubenswrapper[5184]: I0312 16:51:06.469205 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 16:51:06 crc kubenswrapper[5184]: I0312 16:51:06.469274 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 16:51:06 crc kubenswrapper[5184]: I0312 16:51:06.474771 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:51:06 crc kubenswrapper[5184]: I0312 16:51:06.474841 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:51:06 crc kubenswrapper[5184]: I0312 16:51:06.474874 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:51:06 crc kubenswrapper[5184]: I0312 16:51:06.474952 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:51:06 crc kubenswrapper[5184]: I0312 16:51:06.475023 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:51:06 crc kubenswrapper[5184]: I0312 16:51:06.475057 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:51:06 crc kubenswrapper[5184]: I0312 16:51:06.475371 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:51:06 crc kubenswrapper[5184]: I0312 16:51:06.475686 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:51:06 crc kubenswrapper[5184]: I0312 16:51:06.475778 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:51:06 crc kubenswrapper[5184]: E0312 16:51:06.476106 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 12 16:51:06 crc kubenswrapper[5184]: E0312 16:51:06.476347 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 12 16:51:06 crc kubenswrapper[5184]: E0312 16:51:06.477691 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 12 16:51:07 crc kubenswrapper[5184]: I0312 16:51:07.472716 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 16:51:07 crc kubenswrapper[5184]: I0312 16:51:07.473942 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:51:07 crc kubenswrapper[5184]: I0312 16:51:07.474136 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:51:07 crc kubenswrapper[5184]: I0312 16:51:07.474280 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:51:07 crc kubenswrapper[5184]: E0312 16:51:07.474923 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 12 16:51:08 crc kubenswrapper[5184]: E0312 16:51:08.440111 5184 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 12 16:51:08 crc kubenswrapper[5184]: I0312 16:51:08.497509 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Mar 12 16:51:08 crc kubenswrapper[5184]: I0312 16:51:08.497805 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 16:51:08 crc kubenswrapper[5184]: I0312 16:51:08.498663 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:51:08 crc kubenswrapper[5184]: I0312 16:51:08.498740 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:51:08 crc kubenswrapper[5184]: I0312 16:51:08.498761 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:51:08 crc kubenswrapper[5184]: E0312 16:51:08.499481 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 12 16:51:08 crc kubenswrapper[5184]: I0312 16:51:08.865860 5184 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": context deadline exceeded" start-of-body=
Mar 12 16:51:08 crc kubenswrapper[5184]: I0312 16:51:08.865953 5184 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="9f0bc7fcb0822a2c13eb2d22cd8c0641" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": context deadline exceeded"
Mar 12 16:51:11 crc kubenswrapper[5184]: I0312 16:51:11.805123 5184 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Mar 12 16:51:11 crc kubenswrapper[5184]: I0312 16:51:11.805255 5184 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Mar 12 16:51:13 crc kubenswrapper[5184]: I0312 16:51:13.019443 5184 trace.go:236] Trace[1708181888]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (12-Mar-2026 16:51:03.018) (total time: 10001ms):
Mar 12 16:51:13 crc kubenswrapper[5184]: Trace[1708181888]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (16:51:13.019)
Mar 12 16:51:13 crc kubenswrapper[5184]: Trace[1708181888]: [10.001077295s] [10.001077295s] END
Mar 12 16:51:13 crc kubenswrapper[5184]: E0312 16:51:13.019495 5184 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Mar 12 16:51:13 crc kubenswrapper[5184]: I0312 16:51:13.277989 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Mar 12 16:51:13 crc kubenswrapper[5184]: I0312 16:51:13.979786 5184 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 12 16:51:13 crc kubenswrapper[5184]: I0312 16:51:13.979866 5184 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 12 16:51:13 crc kubenswrapper[5184]: I0312 16:51:13.987918 5184 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 12 16:51:13 crc kubenswrapper[5184]: I0312 16:51:13.988028 5184 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 12 16:51:14 crc kubenswrapper[5184]: E0312 16:51:14.515780 5184 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s"
Mar 12 16:51:15 crc kubenswrapper[5184]: I0312 16:51:15.793549 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 12 16:51:15 crc kubenswrapper[5184]: I0312 16:51:15.793863 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 16:51:15 crc kubenswrapper[5184]: I0312 16:51:15.796088 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:51:15 crc kubenswrapper[5184]: I0312 16:51:15.796159 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:51:15 crc kubenswrapper[5184]: I0312 16:51:15.796179 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:51:15 crc kubenswrapper[5184]: E0312 16:51:15.796766 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 12 16:51:15 crc kubenswrapper[5184]: I0312 16:51:15.801838 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 12 16:51:16 crc kubenswrapper[5184]: I0312 16:51:16.496519 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 16:51:16 crc kubenswrapper[5184]: I0312 16:51:16.497372 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:51:16 crc kubenswrapper[5184]: I0312 16:51:16.497648 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:51:16 crc kubenswrapper[5184]: I0312 16:51:16.497873 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:51:16 crc kubenswrapper[5184]: E0312 16:51:16.498696 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 12 16:51:17 crc kubenswrapper[5184]: I0312 16:51:17.479845 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 12 16:51:17 crc kubenswrapper[5184]: I0312 16:51:17.480028 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 16:51:17 crc kubenswrapper[5184]: I0312 16:51:17.480888 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:51:17 crc kubenswrapper[5184]: I0312 16:51:17.480933 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:51:17 crc kubenswrapper[5184]: I0312 16:51:17.480951 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:51:17 crc kubenswrapper[5184]: E0312 16:51:17.481306 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 12 16:51:18 crc kubenswrapper[5184]: E0312 16:51:18.440575 5184 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 12 16:51:18 crc kubenswrapper[5184]: I0312 16:51:18.539028 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Mar 12 16:51:18 crc kubenswrapper[5184]: I0312 16:51:18.539442 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 16:51:18 crc kubenswrapper[5184]: I0312 16:51:18.540641 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:51:18 crc kubenswrapper[5184]: I0312 16:51:18.540687 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:51:18 crc kubenswrapper[5184]: I0312 16:51:18.540707 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:51:18 crc kubenswrapper[5184]: E0312 16:51:18.541287 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 12 16:51:18 crc kubenswrapper[5184]: I0312 16:51:18.559005 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Mar 12 16:51:18 crc kubenswrapper[5184]: E0312 16:51:18.788170 5184 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Mar 12 16:51:18 crc kubenswrapper[5184]: I0312 16:51:18.866556 5184 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": context deadline exceeded" start-of-body=
Mar 12 16:51:18 crc kubenswrapper[5184]: I0312 16:51:18.866678 5184 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="9f0bc7fcb0822a2c13eb2d22cd8c0641" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": context deadline exceeded"
Mar 12 16:51:18 crc kubenswrapper[5184]: I0312 16:51:18.986063 5184 trace.go:236] Trace[1174279839]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (12-Mar-2026 16:51:05.460) (total time: 13525ms):
Mar 12 16:51:18 crc kubenswrapper[5184]: Trace[1174279839]: ---"Objects listed" error:runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope 13525ms (16:51:18.985)
Mar 12 16:51:18 crc kubenswrapper[5184]: Trace[1174279839]: [13.525589357s] [13.525589357s] END
Mar 12 16:51:18 crc kubenswrapper[5184]: E0312 16:51:18.986115 5184 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Mar 12 16:51:18 crc kubenswrapper[5184]: I0312 16:51:18.986433 5184 trace.go:236] Trace[346428070]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (12-Mar-2026 16:51:06.776) (total time: 12209ms):
Mar 12 16:51:18 crc kubenswrapper[5184]: Trace[346428070]: ---"Objects listed" error:nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope 12209ms (16:51:18.986)
Mar 12 16:51:18 crc kubenswrapper[5184]: Trace[346428070]: [12.209413506s] [12.209413506s] END
Mar 12 16:51:18 crc kubenswrapper[5184]: E0312 16:51:18.986966 5184 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Mar 12 16:51:18 crc kubenswrapper[5184]: E0312 16:51:18.988070 5184 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 12 16:51:18 crc kubenswrapper[5184]: I0312 16:51:18.988148 5184 trace.go:236] Trace[1575576246]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (12-Mar-2026 16:51:05.556) (total time: 13431ms):
Mar 12 16:51:18 crc kubenswrapper[5184]: Trace[1575576246]: ---"Objects listed" error:csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope 13431ms (16:51:18.988)
Mar 12 16:51:18 crc kubenswrapper[5184]: Trace[1575576246]: [13.431470507s] [13.431470507s] END
Mar 12 16:51:18 crc kubenswrapper[5184]: E0312 16:51:18.988421 5184 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Mar 12 16:51:18 crc kubenswrapper[5184]: E0312 16:51:18.988288 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c261cc5ef0cb2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:58.298850482 +0000 UTC m=+0.840161831,LastTimestamp:2026-03-12 16:50:58.298850482 +0000 UTC m=+0.840161831,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 16:51:18 crc kubenswrapper[5184]: E0312 16:51:18.993066 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c261cc9e3cc2c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:58.365221932 +0000 UTC m=+0.906533281,LastTimestamp:2026-03-12 16:50:58.365221932 +0000 UTC m=+0.906533281,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 16:51:18 crc kubenswrapper[5184]: I0312 16:51:18.994269 5184 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Mar 12 16:51:18 crc kubenswrapper[5184]: E0312 16:51:18.997292 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c261cc9e41efb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:58.365243131 +0000 UTC m=+0.906554470,LastTimestamp:2026-03-12 16:50:58.365243131 +0000 UTC m=+0.906554470,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 16:51:18 crc kubenswrapper[5184]: E0312 16:51:18.998989 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c261cc9e44cb9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:58.365254841 +0000 UTC m=+0.906566170,LastTimestamp:2026-03-12 16:50:58.365254841 +0000 UTC m=+0.906566170,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.005603 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c261cce2cb1c1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:58.437108161 +0000 UTC m=+0.978419500,LastTimestamp:2026-03-12 16:50:58.437108161 +0000 UTC m=+0.978419500,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.019034 5184 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c261cc9e3cc2c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c261cc9e3cc2c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:58.365221932 +0000 UTC m=+0.906533281,LastTimestamp:2026-03-12 16:50:58.500408077 +0000 UTC m=+1.041719416,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 16:51:19 crc kubenswrapper[5184]: I0312 16:51:19.026766 5184 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:35376->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Mar 12 16:51:19 crc kubenswrapper[5184]: I0312 16:51:19.026846 5184 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:35376->192.168.126.11:17697: read: connection reset by peer"
Mar 12 16:51:19 crc kubenswrapper[5184]: I0312 16:51:19.027258 5184 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Mar 12 16:51:19 crc kubenswrapper[5184]: I0312 16:51:19.027338 5184 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.031241 5184 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c261cc9e41efb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c261cc9e41efb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:58.365243131 +0000 UTC m=+0.906554470,LastTimestamp:2026-03-12 16:50:58.500438585 +0000 UTC m=+1.041749924,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.057335 5184 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c261cc9e44cb9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c261cc9e44cb9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:58.365254841 +0000 UTC m=+0.906566170,LastTimestamp:2026-03-12 16:50:58.500452745 +0000 UTC m=+1.041764094,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.067974 5184 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c261cc9e3cc2c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c261cc9e3cc2c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:58.365221932 +0000 UTC m=+0.906533281,LastTimestamp:2026-03-12 16:50:58.502247827 +0000 UTC m=+1.043559166,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.073954 5184 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c261cc9e41efb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c261cc9e41efb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:58.365243131 +0000 UTC m=+0.906554470,LastTimestamp:2026-03-12 16:50:58.502263107 +0000 UTC m=+1.043574446,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.078095 5184 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c261cc9e44cb9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c261cc9e44cb9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:58.365254841 +0000 UTC m=+0.906566170,LastTimestamp:2026-03-12 16:50:58.502274826 +0000 UTC m=+1.043586165,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.082753 5184 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c261cc9e3cc2c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in 
API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c261cc9e3cc2c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:58.365221932 +0000 UTC m=+0.906533281,LastTimestamp:2026-03-12 16:50:58.503080735 +0000 UTC m=+1.044392074,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.087777 5184 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c261cc9e41efb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c261cc9e41efb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:58.365243131 +0000 UTC m=+0.906554470,LastTimestamp:2026-03-12 16:50:58.503100084 +0000 UTC m=+1.044411413,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.092851 5184 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c261cc9e44cb9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c261cc9e44cb9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:58.365254841 +0000 UTC m=+0.906566170,LastTimestamp:2026-03-12 16:50:58.503110334 +0000 UTC m=+1.044421673,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.098299 5184 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c261cc9e3cc2c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c261cc9e3cc2c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:58.365221932 +0000 UTC m=+0.906533281,LastTimestamp:2026-03-12 16:50:58.504929335 +0000 UTC m=+1.046240674,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.103246 5184 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c261cc9e41efb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c261cc9e41efb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:58.365243131 +0000 UTC m=+0.906554470,LastTimestamp:2026-03-12 16:50:58.504949815 +0000 UTC m=+1.046261154,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.107668 5184 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c261cc9e44cb9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c261cc9e44cb9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:58.365254841 +0000 UTC m=+0.906566170,LastTimestamp:2026-03-12 16:50:58.504961214 +0000 UTC m=+1.046272553,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.112703 5184 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c261cc9e3cc2c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c261cc9e3cc2c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:58.365221932 +0000 UTC 
m=+0.906533281,LastTimestamp:2026-03-12 16:50:58.505872239 +0000 UTC m=+1.047183578,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.117361 5184 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c261cc9e41efb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c261cc9e41efb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:58.365243131 +0000 UTC m=+0.906554470,LastTimestamp:2026-03-12 16:50:58.505897218 +0000 UTC m=+1.047208557,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.122252 5184 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c261cc9e44cb9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c261cc9e44cb9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:58.365254841 +0000 UTC m=+0.906566170,LastTimestamp:2026-03-12 16:50:58.505906188 +0000 UTC m=+1.047217527,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.127554 5184 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c261cc9e3cc2c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c261cc9e3cc2c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:58.365221932 +0000 UTC m=+0.906533281,LastTimestamp:2026-03-12 16:50:58.508782868 +0000 UTC m=+1.050094207,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.132057 5184 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c261cc9e41efb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c261cc9e41efb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:58.365243131 +0000 UTC m=+0.906554470,LastTimestamp:2026-03-12 16:50:58.508796908 +0000 UTC m=+1.050108247,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.139523 5184 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c261cc9e44cb9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c261cc9e44cb9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:58.365254841 +0000 UTC m=+0.906566170,LastTimestamp:2026-03-12 16:50:58.508808308 +0000 UTC m=+1.050119647,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.145618 5184 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c261cc9e3cc2c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c261cc9e3cc2c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:58.365221932 +0000 UTC m=+0.906533281,LastTimestamp:2026-03-12 16:50:58.510187936 +0000 UTC m=+1.051499275,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.151316 5184 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c261cc9e41efb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" 
in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c261cc9e41efb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:58.365243131 +0000 UTC m=+0.906554470,LastTimestamp:2026-03-12 16:50:58.510199625 +0000 UTC m=+1.051510964,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.158714 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c261ce8ba121e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:58.882581022 +0000 UTC m=+1.423892361,LastTimestamp:2026-03-12 16:50:58.882581022 +0000 UTC m=+1.423892361,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.163163 5184 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c261ce960de4b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:0b638b8f4bb0070e40528db779baf6a2,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:58.893512267 +0000 UTC m=+1.434823626,LastTimestamp:2026-03-12 16:50:58.893512267 +0000 UTC m=+1.434823626,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.169740 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c261ce9a2be38 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:58.897829432 +0000 UTC m=+1.439140771,LastTimestamp:2026-03-12 16:50:58.897829432 +0000 UTC 
m=+1.439140771,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.175852 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c261cea13b750 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:58.905233232 +0000 UTC m=+1.446544581,LastTimestamp:2026-03-12 16:50:58.905233232 +0000 UTC m=+1.446544581,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.179623 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c261cea2cefe6 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:4e08c320b1e9e2405e6e0107bdf7eeb4,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:58.906886118 +0000 UTC m=+1.448197457,LastTimestamp:2026-03-12 16:50:58.906886118 +0000 UTC m=+1.448197457,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.183622 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c261d136833f3 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:0b638b8f4bb0070e40528db779baf6a2,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container: wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:59.598636019 +0000 UTC m=+2.139947348,LastTimestamp:2026-03-12 16:50:59.598636019 +0000 UTC m=+2.139947348,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.192671 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c261d136c0118 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:4e08c320b1e9e2405e6e0107bdf7eeb4,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:59.598885144 +0000 UTC m=+2.140196483,LastTimestamp:2026-03-12 16:50:59.598885144 +0000 UTC m=+2.140196483,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.197735 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c261d136d3a4b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:59.598965323 +0000 UTC m=+2.140276662,LastTimestamp:2026-03-12 16:50:59.598965323 +0000 UTC m=+2.140276662,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.198851 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c261d136e0d44 openshift-kube-controller-manager 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container: kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:59.599019332 +0000 UTC m=+2.140330671,LastTimestamp:2026-03-12 16:50:59.599019332 +0000 UTC m=+2.140330671,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.202070 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c261d137622ff openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:59.599549183 +0000 UTC m=+2.140860522,LastTimestamp:2026-03-12 16:50:59.599549183 +0000 UTC m=+2.140860522,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.205859 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c261d1413e1a8 
openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:0b638b8f4bb0070e40528db779baf6a2,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:59.609887144 +0000 UTC m=+2.151198483,LastTimestamp:2026-03-12 16:50:59.609887144 +0000 UTC m=+2.151198483,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.210261 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c261d14834eb9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:59.617189561 +0000 UTC m=+2.158500900,LastTimestamp:2026-03-12 16:50:59.617189561 +0000 UTC m=+2.158500900,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.215863 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group 
\"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c261d149bb6d0 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:4e08c320b1e9e2405e6e0107bdf7eeb4,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:59.618789072 +0000 UTC m=+2.160100411,LastTimestamp:2026-03-12 16:50:59.618789072 +0000 UTC m=+2.160100411,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.219930 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c261d149be4fc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:59.618800892 +0000 UTC m=+2.160112231,LastTimestamp:2026-03-12 16:50:59.618800892 +0000 UTC m=+2.160112231,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.224264 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot 
create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c261d149c4423 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:59.618825251 +0000 UTC m=+2.160136590,LastTimestamp:2026-03-12 16:50:59.618825251 +0000 UTC m=+2.160136590,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.228856 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c261d14a57f9d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:59.619430301 +0000 UTC m=+2.160741640,LastTimestamp:2026-03-12 16:50:59.619430301 +0000 UTC m=+2.160741640,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc 
kubenswrapper[5184]: E0312 16:51:19.233320 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c261d28215832 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container: cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:59.946313778 +0000 UTC m=+2.487625137,LastTimestamp:2026-03-12 16:50:59.946313778 +0000 UTC m=+2.487625137,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.237955 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c261d29271dfa openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:59.963469306 +0000 UTC m=+2.504780665,LastTimestamp:2026-03-12 16:50:59.963469306 +0000 UTC 
m=+2.504780665,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.241691 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c261d2937a760 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:50:59.964553056 +0000 UTC m=+2.505864405,LastTimestamp:2026-03-12 16:50:59.964553056 +0000 UTC m=+2.505864405,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.245692 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c261d43fda4f1 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:4e08c320b1e9e2405e6e0107bdf7eeb4,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:00.413736177 +0000 UTC m=+2.955047536,LastTimestamp:2026-03-12 16:51:00.413736177 +0000 UTC m=+2.955047536,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.252203 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c261d440dbd61 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:00.414791009 +0000 UTC m=+2.956102348,LastTimestamp:2026-03-12 16:51:00.414791009 +0000 UTC m=+2.956102348,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.259067 5184 event.go:359] "Server rejected 
event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c261d4439856a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:00.417660266 +0000 UTC m=+2.958971685,LastTimestamp:2026-03-12 16:51:00.417660266 +0000 UTC m=+2.958971685,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.265360 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c261d4476e73b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:0b638b8f4bb0070e40528db779baf6a2,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:00.421683003 +0000 UTC m=+2.962994342,LastTimestamp:2026-03-12 
16:51:00.421683003 +0000 UTC m=+2.962994342,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.271437 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c261d4829f053 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container: kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:00.483747923 +0000 UTC m=+3.025059262,LastTimestamp:2026-03-12 16:51:00.483747923 +0000 UTC m=+3.025059262,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.276128 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c261d4923e032 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:00.500127794 +0000 UTC m=+3.041439133,LastTimestamp:2026-03-12 16:51:00.500127794 +0000 UTC m=+3.041439133,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: I0312 16:51:19.282681 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.282885 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c261d4939d06d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:00.501565549 +0000 UTC 
m=+3.042876878,LastTimestamp:2026-03-12 16:51:00.501565549 +0000 UTC m=+3.042876878,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.288332 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c261d536cba76 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:4e08c320b1e9e2405e6e0107bdf7eeb4,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:00.672674422 +0000 UTC m=+3.213985761,LastTimestamp:2026-03-12 16:51:00.672674422 +0000 UTC m=+3.213985761,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.292843 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c261d537df71e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:0b638b8f4bb0070e40528db779baf6a2,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container: 
kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:00.673804062 +0000 UTC m=+3.215115401,LastTimestamp:2026-03-12 16:51:00.673804062 +0000 UTC m=+3.215115401,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.298975 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c261d548ce9c3 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:0b638b8f4bb0070e40528db779baf6a2,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:00.691560899 +0000 UTC m=+3.232872238,LastTimestamp:2026-03-12 16:51:00.691560899 +0000 UTC m=+3.232872238,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.304934 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c261d54a2982b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:0b638b8f4bb0070e40528db779baf6a2,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:00.692981803 +0000 UTC m=+3.234293142,LastTimestamp:2026-03-12 16:51:00.692981803 +0000 UTC m=+3.234293142,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.311847 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c261d54a65272 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:4e08c320b1e9e2405e6e0107bdf7eeb4,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:00.693226098 +0000 UTC m=+3.234537437,LastTimestamp:2026-03-12 16:51:00.693226098 +0000 UTC m=+3.234537437,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.316490 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" 
cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c261d54a789cf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container: kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:00.693305807 +0000 UTC m=+3.234617146,LastTimestamp:2026-03-12 16:51:00.693305807 +0000 UTC m=+3.234617146,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.324795 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c261d54acb511 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container: etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:00.693644561 +0000 UTC m=+3.234955900,LastTimestamp:2026-03-12 16:51:00.693644561 +0000 UTC m=+3.234955900,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.333078 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" 
cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c261d555f6352 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container: kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:00.705354578 +0000 UTC m=+3.246665917,LastTimestamp:2026-03-12 16:51:00.705354578 +0000 UTC m=+3.246665917,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.339116 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c261d55ab7b79 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:00.710341497 +0000 UTC m=+3.251652836,LastTimestamp:2026-03-12 16:51:00.710341497 +0000 UTC m=+3.251652836,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: 
E0312 16:51:19.343303 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c261d55bb8eb6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:00.711394998 +0000 UTC m=+3.252706337,LastTimestamp:2026-03-12 16:51:00.711394998 +0000 UTC m=+3.252706337,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.347175 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c261d56ac387c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 
16:51:00.7271671 +0000 UTC m=+3.268478439,LastTimestamp:2026-03-12 16:51:00.7271671 +0000 UTC m=+3.268478439,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.351188 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c261d56af17e7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:00.727355367 +0000 UTC m=+3.268666726,LastTimestamp:2026-03-12 16:51:00.727355367 +0000 UTC m=+3.268666726,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.355040 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c261d5f6c71dc openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:0b638b8f4bb0070e40528db779baf6a2,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container: 
kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:00.873982428 +0000 UTC m=+3.415293766,LastTimestamp:2026-03-12 16:51:00.873982428 +0000 UTC m=+3.415293766,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.358729 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c261d606bc281 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:0b638b8f4bb0070e40528db779baf6a2,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:00.890714753 +0000 UTC m=+3.432026092,LastTimestamp:2026-03-12 16:51:00.890714753 +0000 UTC m=+3.432026092,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.362975 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c261d60795b38 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:0b638b8f4bb0070e40528db779baf6a2,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:00.891605816 +0000 UTC m=+3.432917155,LastTimestamp:2026-03-12 16:51:00.891605816 +0000 UTC m=+3.432917155,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.368248 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c261d609f515c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container: kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:00.89409366 +0000 UTC m=+3.435404999,LastTimestamp:2026-03-12 16:51:00.89409366 +0000 UTC m=+3.435404999,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.372829 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c261d61f7ca2a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:00.91666897 +0000 UTC m=+3.457980309,LastTimestamp:2026-03-12 16:51:00.91666897 +0000 UTC m=+3.457980309,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.377284 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c261d62084468 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:00.91774884 +0000 UTC m=+3.459060179,LastTimestamp:2026-03-12 16:51:00.91774884 +0000 UTC m=+3.459060179,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.381317 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c261d6b39c982 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:0b638b8f4bb0070e40528db779baf6a2,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container: kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:01.071989122 +0000 UTC m=+3.613300461,LastTimestamp:2026-03-12 16:51:01.071989122 +0000 UTC m=+3.613300461,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.387311 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c261d6c04bb99 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:0b638b8f4bb0070e40528db779baf6a2,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:01.085289369 
+0000 UTC m=+3.626600708,LastTimestamp:2026-03-12 16:51:01.085289369 +0000 UTC m=+3.626600708,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.392042 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c261d6ea92b30 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container: kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:01.129620272 +0000 UTC m=+3.670931611,LastTimestamp:2026-03-12 16:51:01.129620272 +0000 UTC m=+3.670931611,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.399104 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c261d6f82f31a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started 
container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:01.143892762 +0000 UTC m=+3.685204121,LastTimestamp:2026-03-12 16:51:01.143892762 +0000 UTC m=+3.685204121,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.404264 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c261d6f938ce9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:01.144980713 +0000 UTC m=+3.686292052,LastTimestamp:2026-03-12 16:51:01.144980713 +0000 UTC m=+3.686292052,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.410361 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c261d7e60bbdf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container: kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:01.393308639 +0000 UTC m=+3.934620018,LastTimestamp:2026-03-12 16:51:01.393308639 +0000 UTC m=+3.934620018,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.414177 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c261d8067c77e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:01.427324798 +0000 UTC m=+3.968636137,LastTimestamp:2026-03-12 16:51:01.427324798 +0000 UTC m=+3.968636137,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.418101 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c261d809a5d8b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:01.430640011 +0000 UTC m=+3.971951350,LastTimestamp:2026-03-12 16:51:01.430640011 +0000 UTC m=+3.971951350,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.423884 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c261d80b10a14 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:01.432125972 +0000 UTC m=+3.973437311,LastTimestamp:2026-03-12 16:51:01.432125972 +0000 UTC m=+3.973437311,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc 
kubenswrapper[5184]: E0312 16:51:19.430617 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c261d8fdc67d5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container: kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:01.686626261 +0000 UTC m=+4.227937610,LastTimestamp:2026-03-12 16:51:01.686626261 +0000 UTC m=+4.227937610,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.434178 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c261d901d33ef openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container: etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:01.690872815 +0000 UTC m=+4.232184154,LastTimestamp:2026-03-12 16:51:01.690872815 +0000 UTC m=+4.232184154,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.438281 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c261d90dbae65 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:01.703356005 +0000 UTC m=+4.244667344,LastTimestamp:2026-03-12 16:51:01.703356005 +0000 UTC m=+4.244667344,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.441759 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c261d914cb706 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:01.710763782 +0000 UTC m=+4.252075121,LastTimestamp:2026-03-12 16:51:01.710763782 +0000 UTC 
m=+4.252075121,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.449456 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c261dbd765843 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:02.451689539 +0000 UTC m=+4.993000918,LastTimestamp:2026-03-12 16:51:02.451689539 +0000 UTC m=+4.993000918,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.453261 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c261dc9544f4a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container: etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:02.65078561 +0000 UTC 
m=+5.192096949,LastTimestamp:2026-03-12 16:51:02.65078561 +0000 UTC m=+5.192096949,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.457592 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c261dca196ebb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:02.663704251 +0000 UTC m=+5.205015600,LastTimestamp:2026-03-12 16:51:02.663704251 +0000 UTC m=+5.205015600,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.462564 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c261dca29c99c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:02.664776092 +0000 UTC m=+5.206087451,LastTimestamp:2026-03-12 16:51:02.664776092 +0000 UTC m=+5.206087451,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.467268 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c261dd911f67d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container: etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:02.914872957 +0000 UTC m=+5.456184326,LastTimestamp:2026-03-12 16:51:02.914872957 +0000 UTC m=+5.456184326,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.472478 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c261dda3dedeb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:02.934531563 +0000 UTC 
m=+5.475842892,LastTimestamp:2026-03-12 16:51:02.934531563 +0000 UTC m=+5.475842892,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.478278 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c261dda537bbd openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:02.935944125 +0000 UTC m=+5.477255504,LastTimestamp:2026-03-12 16:51:02.935944125 +0000 UTC m=+5.477255504,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.482697 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c261dea034f0d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container: 
etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:03.199125261 +0000 UTC m=+5.740436600,LastTimestamp:2026-03-12 16:51:03.199125261 +0000 UTC m=+5.740436600,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.486964 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c261deb4b5859 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:03.220623449 +0000 UTC m=+5.761934818,LastTimestamp:2026-03-12 16:51:03.220623449 +0000 UTC m=+5.761934818,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.491331 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c261deb5cd379 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:03.221769081 +0000 UTC m=+5.763080420,LastTimestamp:2026-03-12 16:51:03.221769081 +0000 UTC m=+5.763080420,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.497628 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c261dfac9ce81 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container: etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:03.480569473 +0000 UTC m=+6.021880852,LastTimestamp:2026-03-12 16:51:03.480569473 +0000 UTC m=+6.021880852,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.502688 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c261dfbeb50ca openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:03.49954273 +0000 UTC m=+6.040854099,LastTimestamp:2026-03-12 16:51:03.49954273 +0000 UTC m=+6.040854099,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: I0312 16:51:19.505181 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/0.log" Mar 12 16:51:19 crc kubenswrapper[5184]: I0312 16:51:19.506993 5184 generic.go:358] "Generic (PLEG): container finished" podID="3a14caf222afb62aaabdc47808b6f944" containerID="2b426fdc267ae84b9325d988ded202ce5e05b47fe9f1deb14507508b74d442db" exitCode=255 Mar 12 16:51:19 crc kubenswrapper[5184]: I0312 16:51:19.507089 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerDied","Data":"2b426fdc267ae84b9325d988ded202ce5e05b47fe9f1deb14507508b74d442db"} Mar 12 16:51:19 crc kubenswrapper[5184]: I0312 16:51:19.507342 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 12 16:51:19 crc kubenswrapper[5184]: I0312 16:51:19.507436 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 12 16:51:19 crc kubenswrapper[5184]: I0312 16:51:19.508054 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:51:19 crc kubenswrapper[5184]: I0312 16:51:19.508087 5184 kubelet_node_status.go:736] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:51:19 crc kubenswrapper[5184]: I0312 16:51:19.508101 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:51:19 crc kubenswrapper[5184]: I0312 16:51:19.508081 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:51:19 crc kubenswrapper[5184]: I0312 16:51:19.508160 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:51:19 crc kubenswrapper[5184]: I0312 16:51:19.508185 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.508491 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.508870 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 12 16:51:19 crc kubenswrapper[5184]: I0312 16:51:19.509261 5184 scope.go:117] "RemoveContainer" containerID="2b426fdc267ae84b9325d988ded202ce5e05b47fe9f1deb14507508b74d442db" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.509517 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c261dfc0ccb3a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:03.501736762 +0000 UTC m=+6.043048121,LastTimestamp:2026-03-12 16:51:03.501736762 +0000 UTC m=+6.043048121,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.514014 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c261e0a79261b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container: etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:03.743718939 +0000 UTC m=+6.285030278,LastTimestamp:2026-03-12 16:51:03.743718939 +0000 UTC m=+6.285030278,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.519710 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c261e0b7ac4b5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:03.760602293 +0000 UTC m=+6.301913662,LastTimestamp:2026-03-12 16:51:03.760602293 +0000 UTC m=+6.301913662,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.530775 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 12 16:51:19 crc kubenswrapper[5184]: &Event{ObjectMeta:{kube-controller-manager-crc.189c261f3bc7c417 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://localhost:10357/healthz": context deadline exceeded Mar 12 16:51:19 crc kubenswrapper[5184]: body: Mar 12 16:51:19 crc kubenswrapper[5184]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:08.865922071 +0000 UTC m=+11.407233440,LastTimestamp:2026-03-12 16:51:08.865922071 +0000 UTC m=+11.407233440,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 12 16:51:19 crc kubenswrapper[5184]: > Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.534778 5184 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c261f3bc9547c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://localhost:10357/healthz\": context deadline exceeded,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:08.866024572 +0000 UTC m=+11.407335951,LastTimestamp:2026-03-12 16:51:08.866024572 +0000 UTC m=+11.407335951,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.538819 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 12 16:51:19 crc kubenswrapper[5184]: &Event{ObjectMeta:{kube-apiserver-crc.189c261feaf9fb47 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Liveness probe error: Get "https://192.168.126.11:17697/healthz": dial tcp 192.168.126.11:17697: connect: connection refused Mar 12 16:51:19 crc kubenswrapper[5184]: body: Mar 12 16:51:19 crc kubenswrapper[5184]: 
,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:11.805225799 +0000 UTC m=+14.346537178,LastTimestamp:2026-03-12 16:51:11.805225799 +0000 UTC m=+14.346537178,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 12 16:51:19 crc kubenswrapper[5184]: > Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.542814 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c261feafb0749 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Liveness probe failed: Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:11.805294409 +0000 UTC m=+14.346605788,LastTimestamp:2026-03-12 16:51:11.805294409 +0000 UTC m=+14.346605788,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.548575 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 12 16:51:19 crc kubenswrapper[5184]: &Event{ObjectMeta:{kube-apiserver-crc.189c26206c97eee1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 12 16:51:19 crc kubenswrapper[5184]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 12 16:51:19 crc kubenswrapper[5184]: Mar 12 16:51:19 crc kubenswrapper[5184]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:13.979838177 +0000 UTC m=+16.521149536,LastTimestamp:2026-03-12 16:51:13.979838177 +0000 UTC m=+16.521149536,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 12 16:51:19 crc kubenswrapper[5184]: > Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.557144 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c26206c98c61d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:13.979893277 +0000 UTC m=+16.521204636,LastTimestamp:2026-03-12 16:51:13.979893277 +0000 UTC m=+16.521204636,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.562012 5184 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c26206c97eee1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 12 16:51:19 crc kubenswrapper[5184]: &Event{ObjectMeta:{kube-apiserver-crc.189c26206c97eee1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 12 16:51:19 crc kubenswrapper[5184]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 12 16:51:19 crc kubenswrapper[5184]: Mar 12 16:51:19 crc kubenswrapper[5184]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:13.979838177 +0000 UTC m=+16.521149536,LastTimestamp:2026-03-12 16:51:13.987992034 +0000 UTC m=+16.529303373,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 12 16:51:19 crc kubenswrapper[5184]: > Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.567730 5184 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c26206c98c61d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c26206c98c61d openshift-kube-apiserver 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:13.979893277 +0000 UTC m=+16.521204636,LastTimestamp:2026-03-12 16:51:13.988059814 +0000 UTC m=+16.529371153,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.577641 5184 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c261f3bc7c417\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 12 16:51:19 crc kubenswrapper[5184]: &Event{ObjectMeta:{kube-controller-manager-crc.189c261f3bc7c417 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://localhost:10357/healthz": context deadline exceeded Mar 12 16:51:19 crc kubenswrapper[5184]: body: Mar 12 16:51:19 crc kubenswrapper[5184]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:08.865922071 +0000 UTC m=+11.407233440,LastTimestamp:2026-03-12 16:51:18.866639019 +0000 UTC m=+21.407950398,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 12 16:51:19 crc 
kubenswrapper[5184]: > Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.582451 5184 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c261f3bc9547c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c261f3bc9547c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://localhost:10357/healthz\": context deadline exceeded,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:08.866024572 +0000 UTC m=+11.407335951,LastTimestamp:2026-03-12 16:51:18.866716802 +0000 UTC m=+21.408028171,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.596787 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 12 16:51:19 crc kubenswrapper[5184]: &Event{ObjectMeta:{kube-apiserver-crc.189c2621996ac64f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Readiness probe error: Get "https://192.168.126.11:17697/healthz": read tcp 
192.168.126.11:35376->192.168.126.11:17697: read: connection reset by peer Mar 12 16:51:19 crc kubenswrapper[5184]: body: Mar 12 16:51:19 crc kubenswrapper[5184]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:19.026820687 +0000 UTC m=+21.568132026,LastTimestamp:2026-03-12 16:51:19.026820687 +0000 UTC m=+21.568132026,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 12 16:51:19 crc kubenswrapper[5184]: > Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.602495 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c2621996b8928 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Readiness probe failed: Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:35376->192.168.126.11:17697: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:19.026870568 +0000 UTC m=+21.568181907,LastTimestamp:2026-03-12 16:51:19.026870568 +0000 UTC m=+21.568181907,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.607805 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 12 16:51:19 crc 
kubenswrapper[5184]: &Event{ObjectMeta:{kube-apiserver-crc.189c2621997233b2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Readiness probe error: Get "https://192.168.126.11:17697/healthz": dial tcp 192.168.126.11:17697: connect: connection refused Mar 12 16:51:19 crc kubenswrapper[5184]: body: Mar 12 16:51:19 crc kubenswrapper[5184]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:19.027307442 +0000 UTC m=+21.568618801,LastTimestamp:2026-03-12 16:51:19.027307442 +0000 UTC m=+21.568618801,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 12 16:51:19 crc kubenswrapper[5184]: > Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.620734 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c262199730af9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Readiness probe failed: Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:19.027362553 +0000 UTC m=+21.568673902,LastTimestamp:2026-03-12 16:51:19.027362553 +0000 UTC 
m=+21.568673902,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.629801 5184 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c261d80b10a14\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c261d80b10a14 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:01.432125972 +0000 UTC m=+3.973437311,LastTimestamp:2026-03-12 16:51:19.515112094 +0000 UTC m=+22.056423443,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.841900 5184 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c261d8fdc67d5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c261d8fdc67d5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container: kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:01.686626261 +0000 UTC m=+4.227937610,LastTimestamp:2026-03-12 16:51:19.837552042 +0000 UTC m=+22.378863391,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:19 crc kubenswrapper[5184]: E0312 16:51:19.859829 5184 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c261d90dbae65\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c261d90dbae65 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:01.703356005 +0000 UTC m=+4.244667344,LastTimestamp:2026-03-12 16:51:19.852153423 +0000 UTC m=+22.393464762,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:20 crc kubenswrapper[5184]: I0312 16:51:20.284192 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group 
"storage.k8s.io" at the cluster scope Mar 12 16:51:20 crc kubenswrapper[5184]: I0312 16:51:20.510745 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/0.log" Mar 12 16:51:20 crc kubenswrapper[5184]: I0312 16:51:20.512037 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerStarted","Data":"25d43a19faa74723e45d46c5c5efca3cc6c50d971f0ac488ccd11d1ba818dd36"} Mar 12 16:51:20 crc kubenswrapper[5184]: I0312 16:51:20.512246 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 12 16:51:20 crc kubenswrapper[5184]: I0312 16:51:20.512874 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:51:20 crc kubenswrapper[5184]: I0312 16:51:20.512910 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:51:20 crc kubenswrapper[5184]: I0312 16:51:20.512919 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:51:20 crc kubenswrapper[5184]: E0312 16:51:20.513292 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 12 16:51:20 crc kubenswrapper[5184]: E0312 16:51:20.917306 5184 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 12 16:51:21 crc kubenswrapper[5184]: I0312 16:51:21.285262 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is 
forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 16:51:21 crc kubenswrapper[5184]: I0312 16:51:21.516828 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/1.log" Mar 12 16:51:21 crc kubenswrapper[5184]: I0312 16:51:21.517308 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/0.log" Mar 12 16:51:21 crc kubenswrapper[5184]: I0312 16:51:21.520118 5184 generic.go:358] "Generic (PLEG): container finished" podID="3a14caf222afb62aaabdc47808b6f944" containerID="25d43a19faa74723e45d46c5c5efca3cc6c50d971f0ac488ccd11d1ba818dd36" exitCode=255 Mar 12 16:51:21 crc kubenswrapper[5184]: I0312 16:51:21.520174 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerDied","Data":"25d43a19faa74723e45d46c5c5efca3cc6c50d971f0ac488ccd11d1ba818dd36"} Mar 12 16:51:21 crc kubenswrapper[5184]: I0312 16:51:21.520275 5184 scope.go:117] "RemoveContainer" containerID="2b426fdc267ae84b9325d988ded202ce5e05b47fe9f1deb14507508b74d442db" Mar 12 16:51:21 crc kubenswrapper[5184]: I0312 16:51:21.520466 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 12 16:51:21 crc kubenswrapper[5184]: I0312 16:51:21.521245 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:51:21 crc kubenswrapper[5184]: I0312 16:51:21.521314 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:51:21 crc kubenswrapper[5184]: I0312 16:51:21.521337 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 12 16:51:21 crc kubenswrapper[5184]: E0312 16:51:21.522038 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 12 16:51:21 crc kubenswrapper[5184]: I0312 16:51:21.522611 5184 scope.go:117] "RemoveContainer" containerID="25d43a19faa74723e45d46c5c5efca3cc6c50d971f0ac488ccd11d1ba818dd36" Mar 12 16:51:21 crc kubenswrapper[5184]: E0312 16:51:21.523056 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" Mar 12 16:51:21 crc kubenswrapper[5184]: E0312 16:51:21.534980 5184 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c26222e332c63 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:BackOff,Message:Back-off restarting failed container kube-apiserver-check-endpoints in pod kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:21.522981987 +0000 UTC m=+24.064293366,LastTimestamp:2026-03-12 16:51:21.522981987 +0000 UTC m=+24.064293366,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:21 crc kubenswrapper[5184]: I0312 16:51:21.804937 5184 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:51:22 crc kubenswrapper[5184]: I0312 16:51:22.282434 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 16:51:22 crc kubenswrapper[5184]: I0312 16:51:22.524165 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/1.log" Mar 12 16:51:22 crc kubenswrapper[5184]: I0312 16:51:22.526283 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 12 16:51:22 crc kubenswrapper[5184]: I0312 16:51:22.527132 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:51:22 crc kubenswrapper[5184]: I0312 16:51:22.527188 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:51:22 crc kubenswrapper[5184]: I0312 16:51:22.527207 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:51:22 crc kubenswrapper[5184]: E0312 16:51:22.527760 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 12 16:51:22 crc kubenswrapper[5184]: I0312 16:51:22.528149 5184 scope.go:117] "RemoveContainer" containerID="25d43a19faa74723e45d46c5c5efca3cc6c50d971f0ac488ccd11d1ba818dd36" Mar 12 16:51:22 crc kubenswrapper[5184]: E0312 16:51:22.528481 5184 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" Mar 12 16:51:22 crc kubenswrapper[5184]: E0312 16:51:22.533201 5184 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c26222e332c63\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c26222e332c63 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:BackOff,Message:Back-off restarting failed container kube-apiserver-check-endpoints in pod kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:21.522981987 +0000 UTC m=+24.064293366,LastTimestamp:2026-03-12 16:51:22.528439003 +0000 UTC m=+25.069750382,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:23 crc kubenswrapper[5184]: I0312 16:51:23.281293 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 16:51:24 crc kubenswrapper[5184]: I0312 16:51:24.283558 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode 
publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 16:51:25 crc kubenswrapper[5184]: I0312 16:51:25.284862 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 16:51:25 crc kubenswrapper[5184]: I0312 16:51:25.388344 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 16:51:25 crc kubenswrapper[5184]: I0312 16:51:25.389501 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:51:25 crc kubenswrapper[5184]: I0312 16:51:25.389578 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:51:25 crc kubenswrapper[5184]: I0312 16:51:25.389599 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:51:25 crc kubenswrapper[5184]: I0312 16:51:25.389639 5184 kubelet_node_status.go:78] "Attempting to register node" node="crc"
Mar 12 16:51:25 crc kubenswrapper[5184]: E0312 16:51:25.406083 5184 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 12 16:51:25 crc kubenswrapper[5184]: I0312 16:51:25.878355 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 12 16:51:25 crc kubenswrapper[5184]: I0312 16:51:25.878925 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 16:51:25 crc kubenswrapper[5184]: I0312 16:51:25.880240 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:51:25 crc kubenswrapper[5184]: I0312 16:51:25.880291 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:51:25 crc kubenswrapper[5184]: I0312 16:51:25.880310 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:51:25 crc kubenswrapper[5184]: E0312 16:51:25.880833 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 12 16:51:25 crc kubenswrapper[5184]: I0312 16:51:25.884209 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 12 16:51:26 crc kubenswrapper[5184]: I0312 16:51:26.284046 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 16:51:26 crc kubenswrapper[5184]: I0312 16:51:26.535009 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 16:51:26 crc kubenswrapper[5184]: I0312 16:51:26.536326 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:51:26 crc kubenswrapper[5184]: I0312 16:51:26.536415 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:51:26 crc kubenswrapper[5184]: I0312 16:51:26.536439 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:51:26 crc kubenswrapper[5184]: E0312 16:51:26.536946 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 12 16:51:27 crc kubenswrapper[5184]: I0312 16:51:27.284260 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 16:51:27 crc kubenswrapper[5184]: E0312 16:51:27.630899 5184 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Mar 12 16:51:27 crc kubenswrapper[5184]: E0312 16:51:27.924919 5184 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 12 16:51:28 crc kubenswrapper[5184]: I0312 16:51:28.285145 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 16:51:28 crc kubenswrapper[5184]: E0312 16:51:28.440991 5184 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 12 16:51:29 crc kubenswrapper[5184]: I0312 16:51:29.284687 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 16:51:29 crc kubenswrapper[5184]: E0312 16:51:29.990971 5184 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Mar 12 16:51:30 crc kubenswrapper[5184]: E0312 16:51:30.157028 5184 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Mar 12 16:51:30 crc kubenswrapper[5184]: I0312 16:51:30.282746 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 16:51:30 crc kubenswrapper[5184]: E0312 16:51:30.435504 5184 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Mar 12 16:51:30 crc kubenswrapper[5184]: I0312 16:51:30.512940 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 12 16:51:30 crc kubenswrapper[5184]: I0312 16:51:30.513646 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 16:51:30 crc kubenswrapper[5184]: I0312 16:51:30.514814 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:51:30 crc kubenswrapper[5184]: I0312 16:51:30.514860 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:51:30 crc kubenswrapper[5184]: I0312 16:51:30.514870 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:51:30 crc kubenswrapper[5184]: E0312 16:51:30.515291 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 12 16:51:30 crc kubenswrapper[5184]: I0312 16:51:30.515551 5184 scope.go:117] "RemoveContainer" containerID="25d43a19faa74723e45d46c5c5efca3cc6c50d971f0ac488ccd11d1ba818dd36"
Mar 12 16:51:30 crc kubenswrapper[5184]: E0312 16:51:30.525672 5184 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c261d80b10a14\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c261d80b10a14 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:01.432125972 +0000 UTC m=+3.973437311,LastTimestamp:2026-03-12 16:51:30.517098457 +0000 UTC m=+33.058409826,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 16:51:30 crc kubenswrapper[5184]: E0312 16:51:30.802345 5184 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c261d8fdc67d5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c261d8fdc67d5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container: kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:01.686626261 +0000 UTC m=+4.227937610,LastTimestamp:2026-03-12 16:51:30.798629656 +0000 UTC m=+33.339941005,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 16:51:30 crc kubenswrapper[5184]: E0312 16:51:30.868855 5184 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c261d90dbae65\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c261d90dbae65 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:01.703356005 +0000 UTC m=+4.244667344,LastTimestamp:2026-03-12 16:51:30.862972573 +0000 UTC m=+33.404284032,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 16:51:31 crc kubenswrapper[5184]: I0312 16:51:31.284455 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 16:51:31 crc kubenswrapper[5184]: I0312 16:51:31.551173 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/1.log"
Mar 12 16:51:31 crc kubenswrapper[5184]: I0312 16:51:31.553454 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerStarted","Data":"eba7c9363d7ada9654ab6c11858db1b857c8885d94c0f1fd6c52443e7e374509"}
Mar 12 16:51:31 crc kubenswrapper[5184]: I0312 16:51:31.553753 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 16:51:31 crc kubenswrapper[5184]: I0312 16:51:31.554717 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:51:31 crc kubenswrapper[5184]: I0312 16:51:31.554804 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:51:31 crc kubenswrapper[5184]: I0312 16:51:31.554827 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:51:31 crc kubenswrapper[5184]: E0312 16:51:31.555548 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 12 16:51:32 crc kubenswrapper[5184]: I0312 16:51:32.284842 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 16:51:32 crc kubenswrapper[5184]: I0312 16:51:32.406215 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 16:51:32 crc kubenswrapper[5184]: I0312 16:51:32.407174 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:51:32 crc kubenswrapper[5184]: I0312 16:51:32.407210 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:51:32 crc kubenswrapper[5184]: I0312 16:51:32.407222 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:51:32 crc kubenswrapper[5184]: I0312 16:51:32.407245 5184 kubelet_node_status.go:78] "Attempting to register node" node="crc"
Mar 12 16:51:32 crc kubenswrapper[5184]: E0312 16:51:32.417263 5184 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 12 16:51:32 crc kubenswrapper[5184]: I0312 16:51:32.558267 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/2.log"
Mar 12 16:51:32 crc kubenswrapper[5184]: I0312 16:51:32.559933 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/1.log"
Mar 12 16:51:32 crc kubenswrapper[5184]: I0312 16:51:32.562116 5184 generic.go:358] "Generic (PLEG): container finished" podID="3a14caf222afb62aaabdc47808b6f944" containerID="eba7c9363d7ada9654ab6c11858db1b857c8885d94c0f1fd6c52443e7e374509" exitCode=255
Mar 12 16:51:32 crc kubenswrapper[5184]: I0312 16:51:32.562217 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerDied","Data":"eba7c9363d7ada9654ab6c11858db1b857c8885d94c0f1fd6c52443e7e374509"}
Mar 12 16:51:32 crc kubenswrapper[5184]: I0312 16:51:32.562269 5184 scope.go:117] "RemoveContainer" containerID="25d43a19faa74723e45d46c5c5efca3cc6c50d971f0ac488ccd11d1ba818dd36"
Mar 12 16:51:32 crc kubenswrapper[5184]: I0312 16:51:32.562659 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 16:51:32 crc kubenswrapper[5184]: I0312 16:51:32.563547 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:51:32 crc kubenswrapper[5184]: I0312 16:51:32.563726 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:51:32 crc kubenswrapper[5184]: I0312 16:51:32.564322 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:51:32 crc kubenswrapper[5184]: E0312 16:51:32.565013 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 12 16:51:32 crc kubenswrapper[5184]: I0312 16:51:32.565617 5184 scope.go:117] "RemoveContainer" containerID="eba7c9363d7ada9654ab6c11858db1b857c8885d94c0f1fd6c52443e7e374509"
Mar 12 16:51:32 crc kubenswrapper[5184]: E0312 16:51:32.566048 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944"
Mar 12 16:51:32 crc kubenswrapper[5184]: E0312 16:51:32.572350 5184 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c26222e332c63\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c26222e332c63 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:BackOff,Message:Back-off restarting failed container kube-apiserver-check-endpoints in pod kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:21.522981987 +0000 UTC m=+24.064293366,LastTimestamp:2026-03-12 16:51:32.565998931 +0000 UTC m=+35.107310280,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 16:51:33 crc kubenswrapper[5184]: I0312 16:51:33.279346 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 16:51:33 crc kubenswrapper[5184]: I0312 16:51:33.567452 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/2.log"
Mar 12 16:51:34 crc kubenswrapper[5184]: I0312 16:51:34.285454 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 16:51:34 crc kubenswrapper[5184]: E0312 16:51:34.933300 5184 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 12 16:51:35 crc kubenswrapper[5184]: I0312 16:51:35.285295 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 16:51:36 crc kubenswrapper[5184]: I0312 16:51:36.284075 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 16:51:37 crc kubenswrapper[5184]: I0312 16:51:37.283734 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 16:51:38 crc kubenswrapper[5184]: I0312 16:51:38.284246 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 16:51:38 crc kubenswrapper[5184]: E0312 16:51:38.442743 5184 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 12 16:51:39 crc kubenswrapper[5184]: I0312 16:51:39.282944 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 16:51:39 crc kubenswrapper[5184]: I0312 16:51:39.418041 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 16:51:39 crc kubenswrapper[5184]: I0312 16:51:39.419315 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:51:39 crc kubenswrapper[5184]: I0312 16:51:39.419468 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:51:39 crc kubenswrapper[5184]: I0312 16:51:39.419507 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:51:39 crc kubenswrapper[5184]: I0312 16:51:39.419544 5184 kubelet_node_status.go:78] "Attempting to register node" node="crc"
Mar 12 16:51:39 crc kubenswrapper[5184]: E0312 16:51:39.433262 5184 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 12 16:51:40 crc kubenswrapper[5184]: I0312 16:51:40.283103 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 16:51:41 crc kubenswrapper[5184]: I0312 16:51:41.284957 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 16:51:41 crc kubenswrapper[5184]: I0312 16:51:41.554482 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 12 16:51:41 crc kubenswrapper[5184]: I0312 16:51:41.554831 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 16:51:41 crc kubenswrapper[5184]: I0312 16:51:41.556117 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:51:41 crc kubenswrapper[5184]: I0312 16:51:41.556267 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:51:41 crc kubenswrapper[5184]: I0312 16:51:41.556360 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:51:41 crc kubenswrapper[5184]: E0312 16:51:41.557088 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 12 16:51:41 crc kubenswrapper[5184]: I0312 16:51:41.557560 5184 scope.go:117] "RemoveContainer" containerID="eba7c9363d7ada9654ab6c11858db1b857c8885d94c0f1fd6c52443e7e374509"
Mar 12 16:51:41 crc kubenswrapper[5184]: E0312 16:51:41.557916 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944"
Mar 12 16:51:41 crc kubenswrapper[5184]: E0312 16:51:41.563496 5184 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c26222e332c63\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c26222e332c63 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:BackOff,Message:Back-off restarting failed container kube-apiserver-check-endpoints in pod kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:21.522981987 +0000 UTC m=+24.064293366,LastTimestamp:2026-03-12 16:51:41.557874184 +0000 UTC m=+44.099185523,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 16:51:41 crc kubenswrapper[5184]: I0312 16:51:41.805033 5184 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 12 16:51:41 crc kubenswrapper[5184]: I0312 16:51:41.806532 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 16:51:41 crc kubenswrapper[5184]: I0312 16:51:41.810185 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:51:41 crc kubenswrapper[5184]: I0312 16:51:41.810267 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:51:41 crc kubenswrapper[5184]: I0312 16:51:41.810291 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:51:41 crc kubenswrapper[5184]: E0312 16:51:41.811292 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 12 16:51:41 crc kubenswrapper[5184]: I0312 16:51:41.811881 5184 scope.go:117] "RemoveContainer" containerID="eba7c9363d7ada9654ab6c11858db1b857c8885d94c0f1fd6c52443e7e374509"
Mar 12 16:51:41 crc kubenswrapper[5184]: E0312 16:51:41.812280 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944"
Mar 12 16:51:41 crc kubenswrapper[5184]: E0312 16:51:41.823552 5184 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c26222e332c63\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c26222e332c63 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:BackOff,Message:Back-off restarting failed container kube-apiserver-check-endpoints in pod kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:21.522981987 +0000 UTC m=+24.064293366,LastTimestamp:2026-03-12 16:51:41.812227871 +0000 UTC m=+44.353539240,Count:5,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 16:51:41 crc kubenswrapper[5184]: E0312 16:51:41.941976 5184 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 12 16:51:42 crc kubenswrapper[5184]: I0312 16:51:42.279745 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 16:51:43 crc kubenswrapper[5184]: I0312 16:51:43.284708 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 16:51:44 crc kubenswrapper[5184]: I0312 16:51:44.284439 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 16:51:45 crc kubenswrapper[5184]: I0312 16:51:45.283077 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 16:51:45 crc kubenswrapper[5184]: E0312 16:51:45.729592 5184 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Mar 12 16:51:46 crc kubenswrapper[5184]: I0312 16:51:46.283646 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 16:51:46 crc kubenswrapper[5184]: I0312 16:51:46.434273 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 16:51:46 crc kubenswrapper[5184]: I0312 16:51:46.435533 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:51:46 crc kubenswrapper[5184]: I0312 16:51:46.435715 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:51:46 crc kubenswrapper[5184]: I0312 16:51:46.435867 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:51:46 crc kubenswrapper[5184]: I0312 16:51:46.436009 5184 kubelet_node_status.go:78] "Attempting to register node" node="crc"
Mar 12 16:51:46 crc kubenswrapper[5184]: E0312 16:51:46.450793 5184 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 12 16:51:47 crc kubenswrapper[5184]: I0312 16:51:47.283150 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 16:51:48 crc kubenswrapper[5184]: I0312 16:51:48.284779 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 16:51:48 crc kubenswrapper[5184]: E0312 16:51:48.444088 5184 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 12 16:51:48 crc kubenswrapper[5184]: E0312 16:51:48.948432 5184 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 12 16:51:49 crc kubenswrapper[5184]: I0312 16:51:49.284639 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 16:51:49 crc kubenswrapper[5184]: E0312 16:51:49.694577 5184 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Mar 12 16:51:50 crc kubenswrapper[5184]: I0312 16:51:50.280930 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 16:51:51 crc kubenswrapper[5184]: I0312 16:51:51.283978 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 16:51:51 crc kubenswrapper[5184]: E0312 16:51:51.432651 5184 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Mar 12 16:51:52 crc kubenswrapper[5184]: I0312 16:51:52.284618 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 16:51:53 crc kubenswrapper[5184]: I0312 16:51:53.285019 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 16:51:53 crc kubenswrapper[5184]: I0312 16:51:53.451943 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 16:51:53 crc kubenswrapper[5184]: I0312 16:51:53.453268 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:51:53 crc kubenswrapper[5184]: I0312 16:51:53.453520 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:51:53 crc kubenswrapper[5184]: I0312 16:51:53.453682 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:51:53 crc kubenswrapper[5184]: I0312 16:51:53.453883 5184 kubelet_node_status.go:78] "Attempting to register node" node="crc"
Mar 12 16:51:53 crc kubenswrapper[5184]: I0312 16:51:53.464907 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 12 16:51:53 crc kubenswrapper[5184]: I0312 16:51:53.465168 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 16:51:53 crc kubenswrapper[5184]: I0312 16:51:53.466134 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:51:53 crc kubenswrapper[5184]: I0312 16:51:53.466183 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:51:53 crc kubenswrapper[5184]: I0312 16:51:53.466201 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:51:53 crc kubenswrapper[5184]: E0312 16:51:53.466840 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 12 16:51:53 crc kubenswrapper[5184]: E0312 16:51:53.470228 5184 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 12 16:51:54 crc kubenswrapper[5184]: I0312 16:51:54.284929 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 16:51:54 crc kubenswrapper[5184]: E0312 16:51:54.341291 5184 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Mar 12 16:51:54 crc kubenswrapper[5184]: I0312 16:51:54.399271 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 16:51:54 crc kubenswrapper[5184]: I0312 16:51:54.400458 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:51:54 crc kubenswrapper[5184]: I0312 16:51:54.400518 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:51:54 crc kubenswrapper[5184]: I0312 16:51:54.400535 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:51:54 crc kubenswrapper[5184]: E0312 16:51:54.401019 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 12 16:51:54 crc kubenswrapper[5184]: I0312 16:51:54.401312 5184 scope.go:117] "RemoveContainer" containerID="eba7c9363d7ada9654ab6c11858db1b857c8885d94c0f1fd6c52443e7e374509"
Mar 12 16:51:54 crc kubenswrapper[5184]: E0312 16:51:54.411636 5184 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c261d80b10a14\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c261d80b10a14 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:01.432125972 +0000 UTC m=+3.973437311,LastTimestamp:2026-03-12 16:51:54.402643144 +0000 UTC m=+56.943954493,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 16:51:55 crc kubenswrapper[5184]: I0312 16:51:55.285150 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 16:51:55 crc kubenswrapper[5184]: I0312 16:51:55.635181
5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/2.log" Mar 12 16:51:55 crc kubenswrapper[5184]: I0312 16:51:55.641515 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerStarted","Data":"c5c91e816d332f9146ddc05817c56c9d67c53b64a57f352bf2e9af1b2fdb1ba4"} Mar 12 16:51:55 crc kubenswrapper[5184]: I0312 16:51:55.641739 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 12 16:51:55 crc kubenswrapper[5184]: I0312 16:51:55.642524 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:51:55 crc kubenswrapper[5184]: I0312 16:51:55.642572 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:51:55 crc kubenswrapper[5184]: I0312 16:51:55.642586 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:51:55 crc kubenswrapper[5184]: E0312 16:51:55.643079 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 12 16:51:55 crc kubenswrapper[5184]: E0312 16:51:55.956052 5184 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 12 16:51:56 crc kubenswrapper[5184]: I0312 16:51:56.284216 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at 
the cluster scope Mar 12 16:51:56 crc kubenswrapper[5184]: I0312 16:51:56.647139 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/3.log" Mar 12 16:51:56 crc kubenswrapper[5184]: I0312 16:51:56.647961 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/2.log" Mar 12 16:51:56 crc kubenswrapper[5184]: I0312 16:51:56.650408 5184 generic.go:358] "Generic (PLEG): container finished" podID="3a14caf222afb62aaabdc47808b6f944" containerID="c5c91e816d332f9146ddc05817c56c9d67c53b64a57f352bf2e9af1b2fdb1ba4" exitCode=255 Mar 12 16:51:56 crc kubenswrapper[5184]: I0312 16:51:56.650482 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerDied","Data":"c5c91e816d332f9146ddc05817c56c9d67c53b64a57f352bf2e9af1b2fdb1ba4"} Mar 12 16:51:56 crc kubenswrapper[5184]: I0312 16:51:56.650550 5184 scope.go:117] "RemoveContainer" containerID="eba7c9363d7ada9654ab6c11858db1b857c8885d94c0f1fd6c52443e7e374509" Mar 12 16:51:56 crc kubenswrapper[5184]: I0312 16:51:56.651071 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 12 16:51:56 crc kubenswrapper[5184]: I0312 16:51:56.652488 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:51:56 crc kubenswrapper[5184]: I0312 16:51:56.652544 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:51:56 crc kubenswrapper[5184]: I0312 16:51:56.652567 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:51:56 crc kubenswrapper[5184]: E0312 16:51:56.653102 5184 
kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 12 16:51:56 crc kubenswrapper[5184]: I0312 16:51:56.653549 5184 scope.go:117] "RemoveContainer" containerID="c5c91e816d332f9146ddc05817c56c9d67c53b64a57f352bf2e9af1b2fdb1ba4" Mar 12 16:51:56 crc kubenswrapper[5184]: E0312 16:51:56.653885 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" Mar 12 16:51:56 crc kubenswrapper[5184]: E0312 16:51:56.662014 5184 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c26222e332c63\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c26222e332c63 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:BackOff,Message:Back-off restarting failed container kube-apiserver-check-endpoints in pod kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:21.522981987 +0000 UTC m=+24.064293366,LastTimestamp:2026-03-12 16:51:56.653843509 +0000 UTC m=+59.195154878,Count:6,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:51:57 crc 
kubenswrapper[5184]: I0312 16:51:57.284703 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 16:51:57 crc kubenswrapper[5184]: I0312 16:51:57.656164 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/3.log" Mar 12 16:51:58 crc kubenswrapper[5184]: I0312 16:51:58.285594 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 16:51:58 crc kubenswrapper[5184]: E0312 16:51:58.445335 5184 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 12 16:51:59 crc kubenswrapper[5184]: I0312 16:51:59.284464 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 16:52:00 crc kubenswrapper[5184]: I0312 16:52:00.284877 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 16:52:00 crc kubenswrapper[5184]: I0312 16:52:00.470593 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 12 16:52:00 crc kubenswrapper[5184]: I0312 16:52:00.471859 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:00 crc 
kubenswrapper[5184]: I0312 16:52:00.471910 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:00 crc kubenswrapper[5184]: I0312 16:52:00.471929 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:00 crc kubenswrapper[5184]: I0312 16:52:00.471961 5184 kubelet_node_status.go:78] "Attempting to register node" node="crc" Mar 12 16:52:00 crc kubenswrapper[5184]: E0312 16:52:00.486487 5184 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 12 16:52:01 crc kubenswrapper[5184]: I0312 16:52:01.283440 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 16:52:01 crc kubenswrapper[5184]: I0312 16:52:01.804186 5184 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:52:01 crc kubenswrapper[5184]: I0312 16:52:01.804417 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 12 16:52:01 crc kubenswrapper[5184]: I0312 16:52:01.805254 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:01 crc kubenswrapper[5184]: I0312 16:52:01.805309 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:01 crc kubenswrapper[5184]: I0312 16:52:01.805322 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:01 crc kubenswrapper[5184]: E0312 
16:52:01.805747 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 12 16:52:01 crc kubenswrapper[5184]: I0312 16:52:01.806067 5184 scope.go:117] "RemoveContainer" containerID="c5c91e816d332f9146ddc05817c56c9d67c53b64a57f352bf2e9af1b2fdb1ba4" Mar 12 16:52:01 crc kubenswrapper[5184]: E0312 16:52:01.806306 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" Mar 12 16:52:01 crc kubenswrapper[5184]: E0312 16:52:01.810879 5184 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c26222e332c63\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c26222e332c63 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:BackOff,Message:Back-off restarting failed container kube-apiserver-check-endpoints in pod kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:51:21.522981987 +0000 UTC m=+24.064293366,LastTimestamp:2026-03-12 16:52:01.806268882 +0000 UTC m=+64.347580231,Count:7,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 
12 16:52:02 crc kubenswrapper[5184]: I0312 16:52:02.283170 5184 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 16:52:02 crc kubenswrapper[5184]: I0312 16:52:02.540183 5184 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-gsjld" Mar 12 16:52:02 crc kubenswrapper[5184]: I0312 16:52:02.546003 5184 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-gsjld" Mar 12 16:52:02 crc kubenswrapper[5184]: I0312 16:52:02.555629 5184 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 12 16:52:03 crc kubenswrapper[5184]: I0312 16:52:03.108777 5184 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 12 16:52:03 crc kubenswrapper[5184]: I0312 16:52:03.547573 5184 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2026-04-11 16:47:02 +0000 UTC" deadline="2026-04-05 07:30:17.119402682 +0000 UTC" Mar 12 16:52:03 crc kubenswrapper[5184]: I0312 16:52:03.547633 5184 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="566h38m13.571775253s" Mar 12 16:52:05 crc kubenswrapper[5184]: I0312 16:52:05.642213 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:52:05 crc kubenswrapper[5184]: I0312 16:52:05.642428 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 12 16:52:05 crc kubenswrapper[5184]: I0312 16:52:05.643300 5184 kubelet_node_status.go:736] "Recording event message 
for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:05 crc kubenswrapper[5184]: I0312 16:52:05.643321 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:05 crc kubenswrapper[5184]: I0312 16:52:05.643340 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:05 crc kubenswrapper[5184]: E0312 16:52:05.643701 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 12 16:52:05 crc kubenswrapper[5184]: I0312 16:52:05.643900 5184 scope.go:117] "RemoveContainer" containerID="c5c91e816d332f9146ddc05817c56c9d67c53b64a57f352bf2e9af1b2fdb1ba4" Mar 12 16:52:05 crc kubenswrapper[5184]: E0312 16:52:05.644063 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" Mar 12 16:52:07 crc kubenswrapper[5184]: I0312 16:52:07.487524 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 12 16:52:07 crc kubenswrapper[5184]: I0312 16:52:07.488393 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:07 crc kubenswrapper[5184]: I0312 16:52:07.488490 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:07 crc kubenswrapper[5184]: I0312 16:52:07.488527 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:07 crc kubenswrapper[5184]: I0312 16:52:07.488689 5184 
kubelet_node_status.go:78] "Attempting to register node" node="crc" Mar 12 16:52:07 crc kubenswrapper[5184]: I0312 16:52:07.496450 5184 kubelet_node_status.go:127] "Node was previously registered" node="crc" Mar 12 16:52:07 crc kubenswrapper[5184]: I0312 16:52:07.496769 5184 kubelet_node_status.go:81] "Successfully registered node" node="crc" Mar 12 16:52:07 crc kubenswrapper[5184]: E0312 16:52:07.496804 5184 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 12 16:52:07 crc kubenswrapper[5184]: I0312 16:52:07.502575 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:07 crc kubenswrapper[5184]: I0312 16:52:07.502618 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:07 crc kubenswrapper[5184]: I0312 16:52:07.502635 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:07 crc kubenswrapper[5184]: I0312 16:52:07.502660 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:07 crc kubenswrapper[5184]: I0312 16:52:07.502678 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:07Z","lastTransitionTime":"2026-03-12T16:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:07 crc kubenswrapper[5184]: E0312 16:52:07.523005 5184 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400460Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861260Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a39150f0d7\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08
951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4e
fb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141
094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e8
3fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3ecf36dd-ef58-4e82-ba73-5f8a9b3572a1\\\",\\\"systemUUID\\\":\\\"50e372b3-53c9-4d5a-992b-af3198b0aed7\\\"},\\\"runtimeHa
ndlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:07 crc kubenswrapper[5184]: I0312 16:52:07.537352 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:07 crc kubenswrapper[5184]: I0312 16:52:07.537429 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:07 crc kubenswrapper[5184]: I0312 16:52:07.537442 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:07 crc kubenswrapper[5184]: I0312 16:52:07.537462 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:07 crc kubenswrapper[5184]: I0312 16:52:07.537474 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:07Z","lastTransitionTime":"2026-03-12T16:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:07 crc kubenswrapper[5184]: E0312 16:52:07.546166 5184 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400460Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861260Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a39150f0d7\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08
951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4e
fb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141
094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e8
3fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3ecf36dd-ef58-4e82-ba73-5f8a9b3572a1\\\",\\\"systemUUID\\\":\\\"50e372b3-53c9-4d5a-992b-af3198b0aed7\\\"},\\\"runtimeHa
ndlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:07 crc kubenswrapper[5184]: I0312 16:52:07.556112 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:07 crc kubenswrapper[5184]: I0312 16:52:07.556180 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:07 crc kubenswrapper[5184]: I0312 16:52:07.556192 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:07 crc kubenswrapper[5184]: I0312 16:52:07.556213 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:07 crc kubenswrapper[5184]: I0312 16:52:07.556227 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:07Z","lastTransitionTime":"2026-03-12T16:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:07 crc kubenswrapper[5184]: E0312 16:52:07.569010 5184 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400460Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861260Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a39150f0d7\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08
951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4e
fb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141
094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e8
3fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3ecf36dd-ef58-4e82-ba73-5f8a9b3572a1\\\",\\\"systemUUID\\\":\\\"50e372b3-53c9-4d5a-992b-af3198b0aed7\\\"},\\\"runtimeHa
ndlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:07 crc kubenswrapper[5184]: I0312 16:52:07.578159 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:07 crc kubenswrapper[5184]: I0312 16:52:07.578205 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:07 crc kubenswrapper[5184]: I0312 16:52:07.578220 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:07 crc kubenswrapper[5184]: I0312 16:52:07.578238 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:07 crc kubenswrapper[5184]: I0312 16:52:07.578253 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:07Z","lastTransitionTime":"2026-03-12T16:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:07 crc kubenswrapper[5184]: E0312 16:52:07.586631 5184 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400460Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861260Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a39150f0d7\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08
951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4e
fb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141
094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e8
3fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3ecf36dd-ef58-4e82-ba73-5f8a9b3572a1\\\",\\\"systemUUID\\\":\\\"50e372b3-53c9-4d5a-992b-af3198b0aed7\\\"},\\\"runtimeHa
ndlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:07 crc kubenswrapper[5184]: E0312 16:52:07.586977 5184 kubelet_node_status.go:584] "Unable to update node status" err="update node status exceeds retry count" Mar 12 16:52:07 crc kubenswrapper[5184]: E0312 16:52:07.587034 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:07 crc kubenswrapper[5184]: E0312 16:52:07.687950 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:07 crc kubenswrapper[5184]: E0312 16:52:07.788998 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:07 crc kubenswrapper[5184]: E0312 16:52:07.889950 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:07 crc kubenswrapper[5184]: E0312 16:52:07.990117 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:08 crc kubenswrapper[5184]: E0312 16:52:08.090845 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:08 crc kubenswrapper[5184]: E0312 16:52:08.191608 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:08 crc kubenswrapper[5184]: E0312 
16:52:08.292420 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:08 crc kubenswrapper[5184]: E0312 16:52:08.392948 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:08 crc kubenswrapper[5184]: I0312 16:52:08.399630 5184 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 12 16:52:08 crc kubenswrapper[5184]: I0312 16:52:08.400565 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:08 crc kubenswrapper[5184]: I0312 16:52:08.400604 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:08 crc kubenswrapper[5184]: I0312 16:52:08.400618 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:08 crc kubenswrapper[5184]: E0312 16:52:08.400930 5184 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 12 16:52:08 crc kubenswrapper[5184]: E0312 16:52:08.446783 5184 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 12 16:52:08 crc kubenswrapper[5184]: E0312 16:52:08.493623 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:08 crc kubenswrapper[5184]: E0312 16:52:08.594364 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:08 crc kubenswrapper[5184]: E0312 16:52:08.694803 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:08 crc kubenswrapper[5184]: E0312 16:52:08.795782 5184 kubelet_node_status.go:515] "Error getting the 
current node from lister" err="node \"crc\" not found" Mar 12 16:52:08 crc kubenswrapper[5184]: E0312 16:52:08.896888 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:08 crc kubenswrapper[5184]: E0312 16:52:08.997519 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:09 crc kubenswrapper[5184]: E0312 16:52:09.097943 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:09 crc kubenswrapper[5184]: E0312 16:52:09.198527 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:09 crc kubenswrapper[5184]: E0312 16:52:09.298780 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:09 crc kubenswrapper[5184]: E0312 16:52:09.399448 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:09 crc kubenswrapper[5184]: E0312 16:52:09.500471 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:09 crc kubenswrapper[5184]: E0312 16:52:09.601429 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:09 crc kubenswrapper[5184]: E0312 16:52:09.701981 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:09 crc kubenswrapper[5184]: E0312 16:52:09.802109 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:09 crc kubenswrapper[5184]: E0312 16:52:09.902559 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:10 crc kubenswrapper[5184]: E0312 16:52:10.002860 5184 
kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:10 crc kubenswrapper[5184]: E0312 16:52:10.103620 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:10 crc kubenswrapper[5184]: E0312 16:52:10.204635 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:10 crc kubenswrapper[5184]: E0312 16:52:10.304883 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:10 crc kubenswrapper[5184]: E0312 16:52:10.405199 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:10 crc kubenswrapper[5184]: E0312 16:52:10.506024 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:10 crc kubenswrapper[5184]: E0312 16:52:10.606805 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:10 crc kubenswrapper[5184]: E0312 16:52:10.706980 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:10 crc kubenswrapper[5184]: E0312 16:52:10.807113 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:10 crc kubenswrapper[5184]: E0312 16:52:10.908181 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:11 crc kubenswrapper[5184]: E0312 16:52:11.008435 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:11 crc kubenswrapper[5184]: E0312 16:52:11.109477 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:11 crc 
kubenswrapper[5184]: E0312 16:52:11.210701 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:11 crc kubenswrapper[5184]: E0312 16:52:11.311091 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:11 crc kubenswrapper[5184]: E0312 16:52:11.411522 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:11 crc kubenswrapper[5184]: E0312 16:52:11.512015 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:11 crc kubenswrapper[5184]: E0312 16:52:11.612340 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:11 crc kubenswrapper[5184]: E0312 16:52:11.713544 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:11 crc kubenswrapper[5184]: E0312 16:52:11.814748 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:11 crc kubenswrapper[5184]: E0312 16:52:11.915140 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:12 crc kubenswrapper[5184]: E0312 16:52:12.016037 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:12 crc kubenswrapper[5184]: E0312 16:52:12.116211 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:12 crc kubenswrapper[5184]: E0312 16:52:12.216946 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:12 crc kubenswrapper[5184]: E0312 16:52:12.317252 5184 kubelet_node_status.go:515] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 12 16:52:12 crc kubenswrapper[5184]: E0312 16:52:12.418558 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:12 crc kubenswrapper[5184]: E0312 16:52:12.519643 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:12 crc kubenswrapper[5184]: E0312 16:52:12.619847 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:12 crc kubenswrapper[5184]: E0312 16:52:12.720178 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:12 crc kubenswrapper[5184]: E0312 16:52:12.821198 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:12 crc kubenswrapper[5184]: E0312 16:52:12.921534 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:13 crc kubenswrapper[5184]: E0312 16:52:13.022526 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:13 crc kubenswrapper[5184]: E0312 16:52:13.123687 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:13 crc kubenswrapper[5184]: E0312 16:52:13.223850 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:13 crc kubenswrapper[5184]: E0312 16:52:13.324193 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:13 crc kubenswrapper[5184]: E0312 16:52:13.424754 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:13 crc kubenswrapper[5184]: E0312 16:52:13.525326 5184 kubelet_node_status.go:515] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:13 crc kubenswrapper[5184]: E0312 16:52:13.626261 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:13 crc kubenswrapper[5184]: E0312 16:52:13.727061 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:13 crc kubenswrapper[5184]: E0312 16:52:13.827279 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:13 crc kubenswrapper[5184]: E0312 16:52:13.928068 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:14 crc kubenswrapper[5184]: E0312 16:52:14.028546 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:14 crc kubenswrapper[5184]: E0312 16:52:14.129111 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:14 crc kubenswrapper[5184]: E0312 16:52:14.229885 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:14 crc kubenswrapper[5184]: E0312 16:52:14.330755 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:14 crc kubenswrapper[5184]: E0312 16:52:14.431788 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:14 crc kubenswrapper[5184]: E0312 16:52:14.532712 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:14 crc kubenswrapper[5184]: E0312 16:52:14.633823 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:14 crc kubenswrapper[5184]: E0312 
16:52:14.734165 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:14 crc kubenswrapper[5184]: E0312 16:52:14.834764 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:14 crc kubenswrapper[5184]: E0312 16:52:14.935203 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:15 crc kubenswrapper[5184]: E0312 16:52:15.035348 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:15 crc kubenswrapper[5184]: E0312 16:52:15.136133 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:15 crc kubenswrapper[5184]: E0312 16:52:15.236736 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:15 crc kubenswrapper[5184]: E0312 16:52:15.337475 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:15 crc kubenswrapper[5184]: E0312 16:52:15.438438 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:15 crc kubenswrapper[5184]: E0312 16:52:15.538665 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:15 crc kubenswrapper[5184]: E0312 16:52:15.639808 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:15 crc kubenswrapper[5184]: E0312 16:52:15.740314 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:15 crc kubenswrapper[5184]: E0312 16:52:15.841613 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 
16:52:15 crc kubenswrapper[5184]: E0312 16:52:15.942334 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:16 crc kubenswrapper[5184]: E0312 16:52:16.043171 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:16 crc kubenswrapper[5184]: E0312 16:52:16.143588 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:16 crc kubenswrapper[5184]: E0312 16:52:16.244697 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:16 crc kubenswrapper[5184]: E0312 16:52:16.345035 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:16 crc kubenswrapper[5184]: E0312 16:52:16.446232 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:16 crc kubenswrapper[5184]: E0312 16:52:16.547262 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:16 crc kubenswrapper[5184]: E0312 16:52:16.647822 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:16 crc kubenswrapper[5184]: E0312 16:52:16.748817 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:16 crc kubenswrapper[5184]: E0312 16:52:16.849452 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:16 crc kubenswrapper[5184]: E0312 16:52:16.949852 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:17 crc kubenswrapper[5184]: E0312 16:52:17.051045 5184 kubelet_node_status.go:515] "Error getting the current node from 
lister" err="node \"crc\" not found" Mar 12 16:52:17 crc kubenswrapper[5184]: E0312 16:52:17.151452 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:17 crc kubenswrapper[5184]: E0312 16:52:17.251563 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:17 crc kubenswrapper[5184]: E0312 16:52:17.352496 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:17 crc kubenswrapper[5184]: E0312 16:52:17.452661 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:17 crc kubenswrapper[5184]: E0312 16:52:17.553475 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:17 crc kubenswrapper[5184]: E0312 16:52:17.654095 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:17 crc kubenswrapper[5184]: E0312 16:52:17.755226 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:17 crc kubenswrapper[5184]: E0312 16:52:17.792924 5184 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 12 16:52:17 crc kubenswrapper[5184]: I0312 16:52:17.797894 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:17 crc kubenswrapper[5184]: I0312 16:52:17.797927 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:17 crc kubenswrapper[5184]: I0312 16:52:17.797938 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:17 crc kubenswrapper[5184]: I0312 
16:52:17.797954 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:17 crc kubenswrapper[5184]: I0312 16:52:17.797965 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:17Z","lastTransitionTime":"2026-03-12T16:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 16:52:17 crc kubenswrapper[5184]: E0312 16:52:17.812490 5184 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400460Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861260Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a39150f0d7\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\\"names\\\":[\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4efb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8
108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\
\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e83fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3ecf36dd-ef58-4e82-ba73-5f8a9b3572a1\\\",\\\"systemUUID\\\":\\\"50e372b3-53c9-4d5a-992b-af3198b0aed7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:17 crc kubenswrapper[5184]: I0312 16:52:17.825018 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:17 crc kubenswrapper[5184]: I0312 16:52:17.825112 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:17 crc kubenswrapper[5184]: I0312 16:52:17.825140 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:17 crc kubenswrapper[5184]: I0312 16:52:17.825210 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:17 crc kubenswrapper[5184]: I0312 16:52:17.825238 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:17Z","lastTransitionTime":"2026-03-12T16:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:17 crc kubenswrapper[5184]: E0312 16:52:17.842557 5184 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400460Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861260Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a39150f0d7\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08
951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4e
fb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141
094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e8
3fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3ecf36dd-ef58-4e82-ba73-5f8a9b3572a1\\\",\\\"systemUUID\\\":\\\"50e372b3-53c9-4d5a-992b-af3198b0aed7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:17 crc kubenswrapper[5184]: I0312 16:52:17.854697 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:17 crc kubenswrapper[5184]: I0312 16:52:17.854736 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:17 crc kubenswrapper[5184]: I0312 16:52:17.854749 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:17 crc kubenswrapper[5184]: I0312 16:52:17.854765 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:17 crc kubenswrapper[5184]: I0312 16:52:17.854778 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:17Z","lastTransitionTime":"2026-03-12T16:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:17 crc kubenswrapper[5184]: E0312 16:52:17.872774 5184 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400460Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861260Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a39150f0d7\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08
951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4e
fb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141
094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e8
3fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3ecf36dd-ef58-4e82-ba73-5f8a9b3572a1\\\",\\\"systemUUID\\\":\\\"50e372b3-53c9-4d5a-992b-af3198b0aed7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:17 crc kubenswrapper[5184]: I0312 16:52:17.883176 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:17 crc kubenswrapper[5184]: I0312 16:52:17.883225 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:17 crc kubenswrapper[5184]: I0312 16:52:17.883241 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:17 crc kubenswrapper[5184]: I0312 16:52:17.883260 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:17 crc kubenswrapper[5184]: I0312 16:52:17.883271 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:17Z","lastTransitionTime":"2026-03-12T16:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:17 crc kubenswrapper[5184]: E0312 16:52:17.895252 5184 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400460Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861260Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a39150f0d7\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08
951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4e
fb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141
094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e8
3fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3ecf36dd-ef58-4e82-ba73-5f8a9b3572a1\\\",\\\"systemUUID\\\":\\\"50e372b3-53c9-4d5a-992b-af3198b0aed7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:17 crc kubenswrapper[5184]: E0312 16:52:17.895758 5184 kubelet_node_status.go:584] "Unable to update node status" err="update node status exceeds retry count" Mar 12 16:52:17 crc kubenswrapper[5184]: E0312 16:52:17.895812 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:17 crc kubenswrapper[5184]: E0312 16:52:17.996829 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:18 crc kubenswrapper[5184]: E0312 16:52:18.097341 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:18 crc kubenswrapper[5184]: E0312 16:52:18.198140 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:18 crc kubenswrapper[5184]: E0312 16:52:18.298919 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:18 crc kubenswrapper[5184]: E0312 16:52:18.399661 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:18 crc kubenswrapper[5184]: E0312 16:52:18.447808 5184 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 12 16:52:18 crc kubenswrapper[5184]: E0312 16:52:18.500088 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:18 crc kubenswrapper[5184]: E0312 16:52:18.601158 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:18 crc kubenswrapper[5184]: 
E0312 16:52:18.702130 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:18 crc kubenswrapper[5184]: E0312 16:52:18.803297 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:18 crc kubenswrapper[5184]: E0312 16:52:18.903996 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:19 crc kubenswrapper[5184]: E0312 16:52:19.004619 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:19 crc kubenswrapper[5184]: E0312 16:52:19.104933 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:19 crc kubenswrapper[5184]: E0312 16:52:19.206041 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:19 crc kubenswrapper[5184]: E0312 16:52:19.307128 5184 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 16:52:19 crc kubenswrapper[5184]: I0312 16:52:19.336306 5184 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Mar 12 16:52:19 crc kubenswrapper[5184]: I0312 16:52:19.406982 5184 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:52:19 crc kubenswrapper[5184]: I0312 16:52:19.410191 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:19 crc kubenswrapper[5184]: I0312 16:52:19.410274 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:19 crc kubenswrapper[5184]: I0312 16:52:19.410301 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 12 16:52:19 crc kubenswrapper[5184]: I0312 16:52:19.410334 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:19 crc kubenswrapper[5184]: I0312 16:52:19.410352 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:19Z","lastTransitionTime":"2026-03-12T16:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 16:52:19 crc kubenswrapper[5184]: I0312 16:52:19.424904 5184 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 16:52:19 crc kubenswrapper[5184]: I0312 16:52:19.512054 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:19 crc kubenswrapper[5184]: I0312 16:52:19.512092 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:19 crc kubenswrapper[5184]: I0312 16:52:19.512103 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:19 crc kubenswrapper[5184]: I0312 16:52:19.512117 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:19 crc kubenswrapper[5184]: I0312 16:52:19.512125 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:19Z","lastTransitionTime":"2026-03-12T16:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration 
file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 16:52:19 crc kubenswrapper[5184]: I0312 16:52:19.522448 5184 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 12 16:52:19 crc kubenswrapper[5184]: I0312 16:52:19.614139 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:19 crc kubenswrapper[5184]: I0312 16:52:19.614181 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:19 crc kubenswrapper[5184]: I0312 16:52:19.614190 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:19 crc kubenswrapper[5184]: I0312 16:52:19.614203 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:19 crc kubenswrapper[5184]: I0312 16:52:19.614213 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:19Z","lastTransitionTime":"2026-03-12T16:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:19 crc kubenswrapper[5184]: I0312 16:52:19.622820 5184 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 12 16:52:19 crc kubenswrapper[5184]: I0312 16:52:19.716350 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:19 crc kubenswrapper[5184]: I0312 16:52:19.716455 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:19 crc kubenswrapper[5184]: I0312 16:52:19.716470 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:19 crc kubenswrapper[5184]: I0312 16:52:19.716488 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:19 crc kubenswrapper[5184]: I0312 16:52:19.716499 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:19Z","lastTransitionTime":"2026-03-12T16:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:19 crc kubenswrapper[5184]: I0312 16:52:19.724092 5184 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-etcd/etcd-crc" Mar 12 16:52:19 crc kubenswrapper[5184]: I0312 16:52:19.823263 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:19 crc kubenswrapper[5184]: I0312 16:52:19.823337 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:19 crc kubenswrapper[5184]: I0312 16:52:19.823362 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:19 crc kubenswrapper[5184]: I0312 16:52:19.823426 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:19 crc kubenswrapper[5184]: I0312 16:52:19.823447 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:19Z","lastTransitionTime":"2026-03-12T16:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:19 crc kubenswrapper[5184]: I0312 16:52:19.925973 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:19 crc kubenswrapper[5184]: I0312 16:52:19.926086 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:19 crc kubenswrapper[5184]: I0312 16:52:19.926106 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:19 crc kubenswrapper[5184]: I0312 16:52:19.926148 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:19 crc kubenswrapper[5184]: I0312 16:52:19.926171 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:19Z","lastTransitionTime":"2026-03-12T16:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.028887 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.028965 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.028985 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.029009 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.029027 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:20Z","lastTransitionTime":"2026-03-12T16:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.132073 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.132127 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.132138 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.132157 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.132169 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:20Z","lastTransitionTime":"2026-03-12T16:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.234194 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.234516 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.234585 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.234667 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.234728 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:20Z","lastTransitionTime":"2026-03-12T16:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.306336 5184 apiserver.go:52] "Watching apiserver" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.316573 5184 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.317525 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-cp7pt","openshift-multus/multus-99gtj","openshift-ovn-kubernetes/ovnkube-node-6bpj2","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5","openshift-network-node-identity/network-node-identity-dgvkt","openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-wqfhs","openshift-image-registry/node-ca-tnk2c","openshift-multus/multus-additional-cni-plugins-ckfz2","openshift-multus/network-metrics-daemon-vxc4c","openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6","openshift-network-diagnostics/network-check-target-fhkjl","openshift-dns/node-resolver-ggxxl","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-network-operator/iptables-alerter-5jnd7","openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv"] Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.319029 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.319876 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.319975 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" podUID="f863fff9-286a-45fa-b8f0-8a86994b8440" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.321204 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl" Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.321265 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhkjl" podUID="17b87002-b798-480a-8e17-83053d698239" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.321783 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"metrics-tls\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.322501 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.322722 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-dgvkt" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.324055 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-5jnd7" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.324115 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-node-identity\"/\"openshift-service-ca.crt\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.324554 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-node-identity\"/\"network-node-identity-cert\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.325019 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-node-identity\"/\"kube-root-ca.crt\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.325078 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-node-identity\"/\"ovnkube-identity-cm\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.325473 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.325490 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.325559 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" podUID="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.325930 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-node-identity\"/\"env-overrides\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.326202 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.332117 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.334476 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-g6kgg\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.334803 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.335262 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.335429 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.335659 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.337469 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.337500 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.337511 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.337528 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.337538 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:20Z","lastTransitionTime":"2026-03-12T16:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.338828 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-ggxxl" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.338847 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34177974-8d82-49d2-a763-391d0df3bbd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-7bdcf4f5bd-7fjxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.341038 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-tk7bt\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.342182 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.342485 5184 reflector.go:430] "Caches 
populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.345294 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.347001 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-config-operator\"/\"kube-rbac-proxy\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.347994 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-config-operator\"/\"openshift-service-ca.crt\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.348249 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-daemon-dockercfg-w9nzh\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.348647 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-config-operator\"/\"kube-root-ca.crt\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.349690 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"proxy-tls\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.349977 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-tnk2c" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.352683 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.352912 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.352920 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34177974-8d82-49d2-a763-391d0df3bbd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-7bdcf4f5bd-7fjxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.353214 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-tjs74\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.354229 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.354526 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ckfz2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.357712 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.358339 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.365636 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-nwglk\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.382854 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-wqfhs" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.415211 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f863fff9-286a-45fa-b8f0-8a86994b8440\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7w75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-5bb8f5cd97-xdvz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.420142 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.420471 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.420561 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-control-plane-metrics-cert\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.420587 5184 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.420735 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-control-plane-dockercfg-nl8tp\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.423483 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.426329 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxc4c" Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.426457 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxc4c" podUID="024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.426835 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.430188 5184 scope.go:117] "RemoveContainer" containerID="c5c91e816d332f9146ddc05817c56c9d67c53b64a57f352bf2e9af1b2fdb1ba4" Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.430454 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.430515 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-l2v2m\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.430510 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.435912 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.439343 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.439408 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.439424 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.439445 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.439461 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:20Z","lastTransitionTime":"2026-03-12T16:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.446706 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-fhkjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17b87002-b798-480a-8e17-83053d698239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwt8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-fhkjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.458577 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/multus-99gtj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"542903c2-fc88-4085-979a-db3766958392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djfvr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-99gtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.466917 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-dns/node-resolver-ggxxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1239377-fc5d-40f2-b262-0b9c9448a3cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k47hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ggxxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.475900 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-host-run-netns\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.475941 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-host-cni-bin\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.476043 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8nt2j\" (UniqueName: \"kubernetes.io/projected/fc4541ce-7789-4670-bc75-5c2868e52ce0-kube-api-access-8nt2j\") pod \"network-node-identity-dgvkt\" (UID: \"fc4541ce-7789-4670-bc75-5c2868e52ce0\") " pod="openshift-network-node-identity/network-node-identity-dgvkt" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.476091 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k47hx\" (UniqueName: \"kubernetes.io/projected/c1239377-fc5d-40f2-b262-0b9c9448a3cf-kube-api-access-k47hx\") pod \"node-resolver-ggxxl\" (UID: \"c1239377-fc5d-40f2-b262-0b9c9448a3cf\") " pod="openshift-dns/node-resolver-ggxxl" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.476123 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-var-lib-openvswitch\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.476189 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-host-cni-netd\") pod \"ovnkube-node-6bpj2\" (UID: 
\"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.476303 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-env-overrides\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.476434 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/542903c2-fc88-4085-979a-db3766958392-os-release\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.476465 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/766663a7-2c04-43da-a76f-dfacc5b1583a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ckfz2\" (UID: \"766663a7-2c04-43da-a76f-dfacc5b1583a\") " pod="openshift-multus/multus-additional-cni-plugins-ckfz2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.476493 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.476518 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/34177974-8d82-49d2-a763-391d0df3bbd8-host-etc-kube\") 
pod \"network-operator-7bdcf4f5bd-7fjxv\" (UID: \"34177974-8d82-49d2-a763-391d0df3bbd8\") " pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.476539 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-node-log\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.476558 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-log-socket\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.476524 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-dgvkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc4541ce-7789-4670-bc75-5c2868e52ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-dgvkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.476623 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/542903c2-fc88-4085-979a-db3766958392-host-var-lib-kubelet\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.476660 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/428b39f5-eb1c-4f65-b7a4-eeb6e84860cc-host-slash\") pod \"iptables-alerter-5jnd7\" (UID: \"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\") " pod="openshift-network-operator/iptables-alerter-5jnd7" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.476683 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-host-run-ovn-kubernetes\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.476712 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-ovn-node-metrics-cert\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.476587 5184 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.476803 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trmc6\" (UniqueName: \"kubernetes.io/projected/766663a7-2c04-43da-a76f-dfacc5b1583a-kube-api-access-trmc6\") pod \"multus-additional-cni-plugins-ckfz2\" (UID: \"766663a7-2c04-43da-a76f-dfacc5b1583a\") " pod="openshift-multus/multus-additional-cni-plugins-ckfz2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.476834 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c823004-cd7d-4cea-9cdb-b44a806264ab-host\") pod \"node-ca-tnk2c\" (UID: \"9c823004-cd7d-4cea-9cdb-b44a806264ab\") " pod="openshift-image-registry/node-ca-tnk2c" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.476871 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34177974-8d82-49d2-a763-391d0df3bbd8-metrics-tls\") pod 
\"network-operator-7bdcf4f5bd-7fjxv\" (UID: \"34177974-8d82-49d2-a763-391d0df3bbd8\") " pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.476910 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2026-03-12 16:52:20.976876316 +0000 UTC m=+83.518187695 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.476973 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m7xz2\" (UniqueName: \"kubernetes.io/projected/34177974-8d82-49d2-a763-391d0df3bbd8-kube-api-access-m7xz2\") pod \"network-operator-7bdcf4f5bd-7fjxv\" (UID: \"34177974-8d82-49d2-a763-391d0df3bbd8\") " pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.477021 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/542903c2-fc88-4085-979a-db3766958392-cni-binary-copy\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.477159 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/542903c2-fc88-4085-979a-db3766958392-host-run-k8s-cni-cncf-io\") pod \"multus-99gtj\" (UID: 
\"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.477202 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9c823004-cd7d-4cea-9cdb-b44a806264ab-serviceca\") pod \"node-ca-tnk2c\" (UID: \"9c823004-cd7d-4cea-9cdb-b44a806264ab\") " pod="openshift-image-registry/node-ca-tnk2c" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.477239 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7w75\" (UniqueName: \"kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75\") pod \"network-check-source-5bb8f5cd97-xdvz5\" (UID: \"f863fff9-286a-45fa-b8f0-8a86994b8440\") " pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.477271 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fc4541ce-7789-4670-bc75-5c2868e52ce0-env-overrides\") pod \"network-node-identity-dgvkt\" (UID: \"fc4541ce-7789-4670-bc75-5c2868e52ce0\") " pod="openshift-network-node-identity/network-node-identity-dgvkt" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.477302 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c1239377-fc5d-40f2-b262-0b9c9448a3cf-hosts-file\") pod \"node-resolver-ggxxl\" (UID: \"c1239377-fc5d-40f2-b262-0b9c9448a3cf\") " pod="openshift-dns/node-resolver-ggxxl" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.477333 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/766663a7-2c04-43da-a76f-dfacc5b1583a-cni-binary-copy\") pod 
\"multus-additional-cni-plugins-ckfz2\" (UID: \"766663a7-2c04-43da-a76f-dfacc5b1583a\") " pod="openshift-multus/multus-additional-cni-plugins-ckfz2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.477437 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dsgwk\" (UniqueName: \"kubernetes.io/projected/428b39f5-eb1c-4f65-b7a4-eeb6e84860cc-kube-api-access-dsgwk\") pod \"iptables-alerter-5jnd7\" (UID: \"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\") " pod="openshift-network-operator/iptables-alerter-5jnd7" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.477474 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-host-kubelet\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.477501 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-run-openvswitch\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.477528 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7b45c859-3d05-4214-9bd3-2952546f5dea-mcd-auth-proxy-config\") pod \"machine-config-daemon-cp7pt\" (UID: \"7b45c859-3d05-4214-9bd3-2952546f5dea\") " pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.477561 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" 
(UniqueName: \"kubernetes.io/host-path/542903c2-fc88-4085-979a-db3766958392-host-var-lib-cni-multus\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.477590 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/542903c2-fc88-4085-979a-db3766958392-hostroot\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.477629 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fc4541ce-7789-4670-bc75-5c2868e52ce0-webhook-cert\") pod \"network-node-identity-dgvkt\" (UID: \"fc4541ce-7789-4670-bc75-5c2868e52ce0\") " pod="openshift-network-node-identity/network-node-identity-dgvkt" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.477693 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/fc4541ce-7789-4670-bc75-5c2868e52ce0-ovnkube-identity-cm\") pod \"network-node-identity-dgvkt\" (UID: \"fc4541ce-7789-4670-bc75-5c2868e52ce0\") " pod="openshift-network-node-identity/network-node-identity-dgvkt" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.477771 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/542903c2-fc88-4085-979a-db3766958392-multus-conf-dir\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.477822 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/542903c2-fc88-4085-979a-db3766958392-host-run-multus-certs\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.477857 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b45c859-3d05-4214-9bd3-2952546f5dea-proxy-tls\") pod \"machine-config-daemon-cp7pt\" (UID: \"7b45c859-3d05-4214-9bd3-2952546f5dea\") " pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.477890 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/766663a7-2c04-43da-a76f-dfacc5b1583a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ckfz2\" (UID: \"766663a7-2c04-43da-a76f-dfacc5b1583a\") " pod="openshift-multus/multus-additional-cni-plugins-ckfz2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.477924 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/417740d6-e9c9-4fa8-9811-c6704b5b5692-ovnkube-config\") pod \"ovnkube-control-plane-57b78d8988-wqfhs\" (UID: \"417740d6-e9c9-4fa8-9811-c6704b5b5692\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-wqfhs" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.477954 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf2p5\" (UniqueName: \"kubernetes.io/projected/417740d6-e9c9-4fa8-9811-c6704b5b5692-kube-api-access-wf2p5\") pod \"ovnkube-control-plane-57b78d8988-wqfhs\" (UID: \"417740d6-e9c9-4fa8-9811-c6704b5b5692\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-wqfhs" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 
16:52:20.477991 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/542903c2-fc88-4085-979a-db3766958392-host-var-lib-cni-bin\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.478061 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/542903c2-fc88-4085-979a-db3766958392-multus-daemon-config\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.478124 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c1239377-fc5d-40f2-b262-0b9c9448a3cf-tmp-dir\") pod \"node-resolver-ggxxl\" (UID: \"c1239377-fc5d-40f2-b262-0b9c9448a3cf\") " pod="openshift-dns/node-resolver-ggxxl" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.478167 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df-metrics-certs\") pod \"network-metrics-daemon-vxc4c\" (UID: \"024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df\") " pod="openshift-multus/network-metrics-daemon-vxc4c" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.478202 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-host-slash\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.478233 5184 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-etc-openvswitch\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.478268 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/542903c2-fc88-4085-979a-db3766958392-multus-socket-dir-parent\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.478301 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/766663a7-2c04-43da-a76f-dfacc5b1583a-os-release\") pod \"multus-additional-cni-plugins-ckfz2\" (UID: \"766663a7-2c04-43da-a76f-dfacc5b1583a\") " pod="openshift-multus/multus-additional-cni-plugins-ckfz2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.478331 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/542903c2-fc88-4085-979a-db3766958392-system-cni-dir\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.478361 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/542903c2-fc88-4085-979a-db3766958392-multus-cni-dir\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.478422 5184 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7b45c859-3d05-4214-9bd3-2952546f5dea-rootfs\") pod \"machine-config-daemon-cp7pt\" (UID: \"7b45c859-3d05-4214-9bd3-2952546f5dea\") " pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.478431 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fc4541ce-7789-4670-bc75-5c2868e52ce0-env-overrides\") pod \"network-node-identity-dgvkt\" (UID: \"fc4541ce-7789-4670-bc75-5c2868e52ce0\") " pod="openshift-network-node-identity/network-node-identity-dgvkt" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.478456 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/766663a7-2c04-43da-a76f-dfacc5b1583a-cnibin\") pod \"multus-additional-cni-plugins-ckfz2\" (UID: \"766663a7-2c04-43da-a76f-dfacc5b1583a\") " pod="openshift-multus/multus-additional-cni-plugins-ckfz2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.478489 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz4cp\" (UniqueName: \"kubernetes.io/projected/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-kube-api-access-qz4cp\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.478577 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwt8b\" (UniqueName: \"kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b\") pod \"network-check-target-fhkjl\" (UID: \"17b87002-b798-480a-8e17-83053d698239\") " pod="openshift-network-diagnostics/network-check-target-fhkjl" Mar 
12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.478912 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/766663a7-2c04-43da-a76f-dfacc5b1583a-system-cni-dir\") pod \"multus-additional-cni-plugins-ckfz2\" (UID: \"766663a7-2c04-43da-a76f-dfacc5b1583a\") " pod="openshift-multus/multus-additional-cni-plugins-ckfz2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.478966 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-run-ovn\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.479001 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-ovnkube-script-lib\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.479058 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/542903c2-fc88-4085-979a-db3766958392-cnibin\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.479091 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ljt6\" (UniqueName: \"kubernetes.io/projected/7b45c859-3d05-4214-9bd3-2952546f5dea-kube-api-access-8ljt6\") pod \"machine-config-daemon-cp7pt\" (UID: \"7b45c859-3d05-4214-9bd3-2952546f5dea\") " 
pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.479126 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/766663a7-2c04-43da-a76f-dfacc5b1583a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ckfz2\" (UID: \"766663a7-2c04-43da-a76f-dfacc5b1583a\") " pod="openshift-multus/multus-additional-cni-plugins-ckfz2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.479177 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/417740d6-e9c9-4fa8-9811-c6704b5b5692-env-overrides\") pod \"ovnkube-control-plane-57b78d8988-wqfhs\" (UID: \"417740d6-e9c9-4fa8-9811-c6704b5b5692\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-wqfhs" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.479225 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwc2f\" (UniqueName: \"kubernetes.io/projected/024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df-kube-api-access-bwc2f\") pod \"network-metrics-daemon-vxc4c\" (UID: \"024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df\") " pod="openshift-multus/network-metrics-daemon-vxc4c" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.479285 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-ovnkube-config\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.479339 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/542903c2-fc88-4085-979a-db3766958392-host-run-netns\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.479399 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/542903c2-fc88-4085-979a-db3766958392-etc-kubernetes\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.479439 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7jw8\" (UniqueName: \"kubernetes.io/projected/9c823004-cd7d-4cea-9cdb-b44a806264ab-kube-api-access-w7jw8\") pod \"node-ca-tnk2c\" (UID: \"9c823004-cd7d-4cea-9cdb-b44a806264ab\") " pod="openshift-image-registry/node-ca-tnk2c" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.479464 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/417740d6-e9c9-4fa8-9811-c6704b5b5692-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57b78d8988-wqfhs\" (UID: \"417740d6-e9c9-4fa8-9811-c6704b5b5692\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-wqfhs" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.479490 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-systemd-units\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.479537 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.479612 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djfvr\" (UniqueName: \"kubernetes.io/projected/542903c2-fc88-4085-979a-db3766958392-kube-api-access-djfvr\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.479655 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.479690 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/428b39f5-eb1c-4f65-b7a4-eeb6e84860cc-iptables-alerter-script\") pod \"iptables-alerter-5jnd7\" (UID: \"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\") " pod="openshift-network-operator/iptables-alerter-5jnd7" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.479726 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-run-systemd\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc 
kubenswrapper[5184]: E0312 16:52:20.479777 5184 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.479857 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2026-03-12 16:52:20.979835667 +0000 UTC m=+83.521147016 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.482690 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/428b39f5-eb1c-4f65-b7a4-eeb6e84860cc-iptables-alerter-script\") pod \"iptables-alerter-5jnd7\" (UID: \"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\") " pod="openshift-network-operator/iptables-alerter-5jnd7" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.483239 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/fc4541ce-7789-4670-bc75-5c2868e52ce0-ovnkube-identity-cm\") pod \"network-node-identity-dgvkt\" (UID: \"fc4541ce-7789-4670-bc75-5c2868e52ce0\") " pod="openshift-network-node-identity/network-node-identity-dgvkt" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.484537 5184 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.487660 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-5jnd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dsgwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-5jnd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.492076 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34177974-8d82-49d2-a763-391d0df3bbd8-metrics-tls\") pod \"network-operator-7bdcf4f5bd-7fjxv\" (UID: \"34177974-8d82-49d2-a763-391d0df3bbd8\") " pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.492141 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fc4541ce-7789-4670-bc75-5c2868e52ce0-webhook-cert\") pod \"network-node-identity-dgvkt\" (UID: \"fc4541ce-7789-4670-bc75-5c2868e52ce0\") " pod="openshift-network-node-identity/network-node-identity-dgvkt" Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.493158 5184 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.493245 5184 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.493301 5184 projected.go:194] Error preparing data for projected volume kube-api-access-l7w75 for pod openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.493429 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75 podName:f863fff9-286a-45fa-b8f0-8a86994b8440 nodeName:}" failed. No retries permitted until 2026-03-12 16:52:20.993411548 +0000 UTC m=+83.534722887 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-l7w75" (UniqueName: "kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75") pod "network-check-source-5bb8f5cd97-xdvz5" (UID: "f863fff9-286a-45fa-b8f0-8a86994b8440") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.494989 5184 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.495081 5184 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.495135 5184 projected.go:194] Error preparing data for projected volume kube-api-access-gwt8b for pod openshift-network-diagnostics/network-check-target-fhkjl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.495222 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b podName:17b87002-b798-480a-8e17-83053d698239 nodeName:}" failed. No retries permitted until 2026-03-12 16:52:20.995212715 +0000 UTC m=+83.536524054 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-gwt8b" (UniqueName: "kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b") pod "network-check-target-fhkjl" (UID: "17b87002-b798-480a-8e17-83053d698239") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.495436 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsgwk\" (UniqueName: \"kubernetes.io/projected/428b39f5-eb1c-4f65-b7a4-eeb6e84860cc-kube-api-access-dsgwk\") pod \"iptables-alerter-5jnd7\" (UID: \"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\") " pod="openshift-network-operator/iptables-alerter-5jnd7" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.496572 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7xz2\" (UniqueName: \"kubernetes.io/projected/34177974-8d82-49d2-a763-391d0df3bbd8-kube-api-access-m7xz2\") pod \"network-operator-7bdcf4f5bd-7fjxv\" (UID: \"34177974-8d82-49d2-a763-391d0df3bbd8\") " pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.501826 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nt2j\" (UniqueName: \"kubernetes.io/projected/fc4541ce-7789-4670-bc75-5c2868e52ce0-kube-api-access-8nt2j\") pod \"network-node-identity-dgvkt\" (UID: \"fc4541ce-7789-4670-bc75-5c2868e52ce0\") " pod="openshift-network-node-identity/network-node-identity-dgvkt" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.504739 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fbdfe828b092b23e6d4480daf3e0216aada6debaf1ef1b314a0a31e73ebf13c4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-5ff7774fd9-nljh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.511834 5184 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.517165 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34177974-8d82-49d2-a763-391d0df3bbd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-7bdcf4f5bd-7fjxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.529439 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/multus-99gtj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"542903c2-fc88-4085-979a-db3766958392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djfvr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-99gtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.536407 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-dns/node-resolver-ggxxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1239377-fc5d-40f2-b262-0b9c9448a3cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k47hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ggxxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.541366 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.541524 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.541603 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.541684 5184 kubelet_node_status.go:736] "Recording 
event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.541762 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:20Z","lastTransitionTime":"2026-03-12T16:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.545832 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnk2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c823004-cd7d-4cea-9cdb-b44a806264ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7jw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.557466 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-wqfhs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"417740d6-e9c9-4fa8-9811-c6704b5b5692\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wf2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wf2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-57b78d8988-wqfhs\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.565986 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-dgvkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc4541ce-7789-4670-bc75-5c2868e52ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-dgvkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.575086 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b45c859-3d05-4214-9bd3-2952546f5dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ljt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ljt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cp7pt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.580541 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9z4sw\" (UniqueName: \"kubernetes.io/projected/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-kube-api-access-9z4sw\") pod \"e1d2a42d-af1d-4054-9618-ab545e0ed8b7\" (UID: \"e1d2a42d-af1d-4054-9618-ab545e0ed8b7\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.580575 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2325ffef-9d5b-447f-b00e-3efc429acefe-serving-cert\") pod \"2325ffef-9d5b-447f-b00e-3efc429acefe\" (UID: \"2325ffef-9d5b-447f-b00e-3efc429acefe\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.580595 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hb7m\" (UniqueName: \"kubernetes.io/projected/94a6e063-3d1a-4d44-875d-185291448c31-kube-api-access-4hb7m\") pod \"94a6e063-3d1a-4d44-875d-185291448c31\" (UID: \"94a6e063-3d1a-4d44-875d-185291448c31\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.580612 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rmnv\" (UniqueName: \"kubernetes.io/projected/b605f283-6f2e-42da-a838-54421690f7d0-kube-api-access-6rmnv\") pod \"b605f283-6f2e-42da-a838-54421690f7d0\" (UID: \"b605f283-6f2e-42da-a838-54421690f7d0\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.580627 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09cfa50b-4138-4585-a53e-64dd3ab73335-serving-cert\") pod \"09cfa50b-4138-4585-a53e-64dd3ab73335\" (UID: \"09cfa50b-4138-4585-a53e-64dd3ab73335\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.580644 5184 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/736c54fe-349c-4bb9-870a-d1c1d1c03831-tmp\") pod \"736c54fe-349c-4bb9-870a-d1c1d1c03831\" (UID: \"736c54fe-349c-4bb9-870a-d1c1d1c03831\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.580661 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-utilities\") pod \"71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a\" (UID: \"71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.580677 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-config\") pod \"dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9\" (UID: \"dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.580693 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m26jq\" (UniqueName: \"kubernetes.io/projected/567683bd-0efc-4f21-b076-e28559628404-kube-api-access-m26jq\") pod \"567683bd-0efc-4f21-b076-e28559628404\" (UID: \"567683bd-0efc-4f21-b076-e28559628404\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.580712 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/584e1f4a-8205-47d7-8efb-3afc6017c4c9-catalog-content\") pod \"584e1f4a-8205-47d7-8efb-3afc6017c4c9\" (UID: \"584e1f4a-8205-47d7-8efb-3afc6017c4c9\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.580728 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9f71a554-e414-4bc3-96d2-674060397afe-bound-sa-token\") pod \"9f71a554-e414-4bc3-96d2-674060397afe\" (UID: \"9f71a554-e414-4bc3-96d2-674060397afe\") " 
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.580745 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/567683bd-0efc-4f21-b076-e28559628404-tmp-dir\") pod \"567683bd-0efc-4f21-b076-e28559628404\" (UID: \"567683bd-0efc-4f21-b076-e28559628404\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.580760 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-trusted-ca-bundle\") pod \"f559dfa3-3917-43a2-97f6-61ddfda10e93\" (UID: \"f559dfa3-3917-43a2-97f6-61ddfda10e93\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.580779 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-srv-cert\") pod \"7fcc6409-8a0f-44c3-89e7-5aecd7610f8a\" (UID: \"7fcc6409-8a0f-44c3-89e7-5aecd7610f8a\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.580795 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptkcf\" (UniqueName: \"kubernetes.io/projected/7599e0b6-bddf-4def-b7f2-0b32206e8651-kube-api-access-ptkcf\") pod \"7599e0b6-bddf-4def-b7f2-0b32206e8651\" (UID: \"7599e0b6-bddf-4def-b7f2-0b32206e8651\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.580811 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovnkube-script-lib\") pod \"af33e427-6803-48c2-a76a-dd9deb7cbf9a\" (UID: \"af33e427-6803-48c2-a76a-dd9deb7cbf9a\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.580856 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jjkz\" (UniqueName: 
\"kubernetes.io/projected/301e1965-1754-483d-b6cc-bfae7038bbca-kube-api-access-7jjkz\") pod \"301e1965-1754-483d-b6cc-bfae7038bbca\" (UID: \"301e1965-1754-483d-b6cc-bfae7038bbca\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.580870 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/567683bd-0efc-4f21-b076-e28559628404-serving-cert\") pod \"567683bd-0efc-4f21-b076-e28559628404\" (UID: \"567683bd-0efc-4f21-b076-e28559628404\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.580890 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxfcv\" (UniqueName: \"kubernetes.io/projected/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-kube-api-access-xxfcv\") pod \"9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff\" (UID: \"9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.580905 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddlk9\" (UniqueName: \"kubernetes.io/projected/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-kube-api-access-ddlk9\") pod \"7fcc6409-8a0f-44c3-89e7-5aecd7610f8a\" (UID: \"7fcc6409-8a0f-44c3-89e7-5aecd7610f8a\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.580920 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-error\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.580934 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7df94c10-441d-4386-93a6-6730fb7bcde0-env-overrides\") pod \"7df94c10-441d-4386-93a6-6730fb7bcde0\" (UID: 
\"7df94c10-441d-4386-93a6-6730fb7bcde0\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.580949 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5rsr\" (UniqueName: \"kubernetes.io/projected/af33e427-6803-48c2-a76a-dd9deb7cbf9a-kube-api-access-z5rsr\") pod \"af33e427-6803-48c2-a76a-dd9deb7cbf9a\" (UID: \"af33e427-6803-48c2-a76a-dd9deb7cbf9a\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.580963 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-utilities\") pod \"9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff\" (UID: \"9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.580978 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9stx\" (UniqueName: \"kubernetes.io/projected/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-kube-api-access-l9stx\") pod \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\" (UID: \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.580999 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2325ffef-9d5b-447f-b00e-3efc429acefe-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2325ffef-9d5b-447f-b00e-3efc429acefe" (UID: "2325ffef-9d5b-447f-b00e-3efc429acefe"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.581022 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/736c54fe-349c-4bb9-870a-d1c1d1c03831-client-ca\") pod \"736c54fe-349c-4bb9-870a-d1c1d1c03831\" (UID: \"736c54fe-349c-4bb9-870a-d1c1d1c03831\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.581057 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7afa918d-be67-40a6-803c-d3b0ae99d815-config\") pod \"7afa918d-be67-40a6-803c-d3b0ae99d815\" (UID: \"7afa918d-be67-40a6-803c-d3b0ae99d815\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.581072 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-metrics-certs\") pod \"18f80adb-c1c3-49ba-8ee4-932c851d3897\" (UID: \"18f80adb-c1c3-49ba-8ee4-932c851d3897\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.581086 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5f2bfad-70f6-4185-a3d9-81ce12720767-config\") pod \"c5f2bfad-70f6-4185-a3d9-81ce12720767\" (UID: \"c5f2bfad-70f6-4185-a3d9-81ce12720767\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.581110 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f65c0ac1-8bca-454d-a2e6-e35cb418beac-tmp-dir\") pod \"f65c0ac1-8bca-454d-a2e6-e35cb418beac\" (UID: \"f65c0ac1-8bca-454d-a2e6-e35cb418beac\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.581130 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-registry-tls\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.581838 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/567683bd-0efc-4f21-b076-e28559628404-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "567683bd-0efc-4f21-b076-e28559628404" (UID: "567683bd-0efc-4f21-b076-e28559628404"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.582051 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-default-certificate\") pod \"18f80adb-c1c3-49ba-8ee4-932c851d3897\" (UID: \"18f80adb-c1c3-49ba-8ee4-932c851d3897\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.582159 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-cliconfig\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.582224 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-trusted-ca-bundle\") pod \"d19cb085-0c5b-4810-b654-ce7923221d90\" (UID: \"d19cb085-0c5b-4810-b654-ce7923221d90\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.582280 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9e9b5059-1b3e-4067-a63d-2952cbe863af-registry-certificates\") pod 
\"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.582329 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01080b46-74f1-4191-8755-5152a57b3b25-serving-cert\") pod \"01080b46-74f1-4191-8755-5152a57b3b25\" (UID: \"01080b46-74f1-4191-8755-5152a57b3b25\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.582418 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-tmp\") pod \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\" (UID: \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.582755 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/301e1965-1754-483d-b6cc-bfae7038bbca-tmpfs\") pod \"301e1965-1754-483d-b6cc-bfae7038bbca\" (UID: \"301e1965-1754-483d-b6cc-bfae7038bbca\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.582918 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-audit-policies\") pod \"f559dfa3-3917-43a2-97f6-61ddfda10e93\" (UID: \"f559dfa3-3917-43a2-97f6-61ddfda10e93\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.582418 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/301e1965-1754-483d-b6cc-bfae7038bbca-kube-api-access-7jjkz" (OuterVolumeSpecName: "kube-api-access-7jjkz") pod "301e1965-1754-483d-b6cc-bfae7038bbca" (UID: "301e1965-1754-483d-b6cc-bfae7038bbca"). InnerVolumeSpecName "kube-api-access-7jjkz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.582569 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-kube-api-access-xxfcv" (OuterVolumeSpecName: "kube-api-access-xxfcv") pod "9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff" (UID: "9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff"). InnerVolumeSpecName "kube-api-access-xxfcv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.584712 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "7fcc6409-8a0f-44c3-89e7-5aecd7610f8a" (UID: "7fcc6409-8a0f-44c3-89e7-5aecd7610f8a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.582570 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b605f283-6f2e-42da-a838-54421690f7d0-kube-api-access-6rmnv" (OuterVolumeSpecName: "kube-api-access-6rmnv") pod "b605f283-6f2e-42da-a838-54421690f7d0" (UID: "b605f283-6f2e-42da-a838-54421690f7d0"). InnerVolumeSpecName "kube-api-access-6rmnv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.582876 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.583025 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f71a554-e414-4bc3-96d2-674060397afe-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "9f71a554-e414-4bc3-96d2-674060397afe" (UID: "9f71a554-e414-4bc3-96d2-674060397afe"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.583236 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.583473 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7afa918d-be67-40a6-803c-d3b0ae99d815-config" (OuterVolumeSpecName: "config") pod "7afa918d-be67-40a6-803c-d3b0ae99d815" (UID: "7afa918d-be67-40a6-803c-d3b0ae99d815"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.583517 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-kube-api-access-ddlk9" (OuterVolumeSpecName: "kube-api-access-ddlk9") pod "7fcc6409-8a0f-44c3-89e7-5aecd7610f8a" (UID: "7fcc6409-8a0f-44c3-89e7-5aecd7610f8a"). InnerVolumeSpecName "kube-api-access-ddlk9". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.583924 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/567683bd-0efc-4f21-b076-e28559628404-kube-api-access-m26jq" (OuterVolumeSpecName: "kube-api-access-m26jq") pod "567683bd-0efc-4f21-b076-e28559628404" (UID: "567683bd-0efc-4f21-b076-e28559628404"). InnerVolumeSpecName "kube-api-access-m26jq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.584133 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94a6e063-3d1a-4d44-875d-185291448c31-kube-api-access-4hb7m" (OuterVolumeSpecName: "kube-api-access-4hb7m") pod "94a6e063-3d1a-4d44-875d-185291448c31" (UID: "94a6e063-3d1a-4d44-875d-185291448c31"). InnerVolumeSpecName "kube-api-access-4hb7m". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.584262 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-kube-api-access-9z4sw" (OuterVolumeSpecName: "kube-api-access-9z4sw") pod "e1d2a42d-af1d-4054-9618-ab545e0ed8b7" (UID: "e1d2a42d-af1d-4054-9618-ab545e0ed8b7"). InnerVolumeSpecName "kube-api-access-9z4sw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.584480 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/567683bd-0efc-4f21-b076-e28559628404-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "567683bd-0efc-4f21-b076-e28559628404" (UID: "567683bd-0efc-4f21-b076-e28559628404"). InnerVolumeSpecName "tmp-dir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.584535 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af33e427-6803-48c2-a76a-dd9deb7cbf9a-kube-api-access-z5rsr" (OuterVolumeSpecName: "kube-api-access-z5rsr") pod "af33e427-6803-48c2-a76a-dd9deb7cbf9a" (UID: "af33e427-6803-48c2-a76a-dd9deb7cbf9a"). InnerVolumeSpecName "kube-api-access-z5rsr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.585088 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-tmp" (OuterVolumeSpecName: "tmp") pod "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" (UID: "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.585209 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-utilities" (OuterVolumeSpecName: "utilities") pod "9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff" (UID: "9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.585279 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7599e0b6-bddf-4def-b7f2-0b32206e8651-kube-api-access-ptkcf" (OuterVolumeSpecName: "kube-api-access-ptkcf") pod "7599e0b6-bddf-4def-b7f2-0b32206e8651" (UID: "7599e0b6-bddf-4def-b7f2-0b32206e8651"). InnerVolumeSpecName "kube-api-access-ptkcf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.585332 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01080b46-74f1-4191-8755-5152a57b3b25-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01080b46-74f1-4191-8755-5152a57b3b25" (UID: "01080b46-74f1-4191-8755-5152a57b3b25"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.585362 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7df94c10-441d-4386-93a6-6730fb7bcde0-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "7df94c10-441d-4386-93a6-6730fb7bcde0" (UID: "7df94c10-441d-4386-93a6-6730fb7bcde0"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.585653 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09cfa50b-4138-4585-a53e-64dd3ab73335-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09cfa50b-4138-4585-a53e-64dd3ab73335" (UID: "09cfa50b-4138-4585-a53e-64dd3ab73335"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.585713 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/584e1f4a-8205-47d7-8efb-3afc6017c4c9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "584e1f4a-8205-47d7-8efb-3afc6017c4c9" (UID: "584e1f4a-8205-47d7-8efb-3afc6017c4c9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.585837 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/736c54fe-349c-4bb9-870a-d1c1d1c03831-tmp" (OuterVolumeSpecName: "tmp") pod "736c54fe-349c-4bb9-870a-d1c1d1c03831" (UID: "736c54fe-349c-4bb9-870a-d1c1d1c03831"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.585992 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "18f80adb-c1c3-49ba-8ee4-932c851d3897" (UID: "18f80adb-c1c3-49ba-8ee4-932c851d3897"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.586023 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/301e1965-1754-483d-b6cc-bfae7038bbca-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "301e1965-1754-483d-b6cc-bfae7038bbca" (UID: "301e1965-1754-483d-b6cc-bfae7038bbca"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.586227 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-config" (OuterVolumeSpecName: "config") pod "dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" (UID: "dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.586734 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "af33e427-6803-48c2-a76a-dd9deb7cbf9a" (UID: "af33e427-6803-48c2-a76a-dd9deb7cbf9a"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.586625 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0dd0fbac-8c0d-4228-8faa-abbeedabf7db-webhook-certs\") pod \"0dd0fbac-8c0d-4228-8faa-abbeedabf7db\" (UID: \"0dd0fbac-8c0d-4228-8faa-abbeedabf7db\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.586863 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-ocp-branding-template\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.586884 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-utilities" (OuterVolumeSpecName: "utilities") pod "71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a" (UID: "71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.586915 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsb9b\" (UniqueName: \"kubernetes.io/projected/09cfa50b-4138-4585-a53e-64dd3ab73335-kube-api-access-zsb9b\") pod \"09cfa50b-4138-4585-a53e-64dd3ab73335\" (UID: \"09cfa50b-4138-4585-a53e-64dd3ab73335\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.586951 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks6v2\" (UniqueName: \"kubernetes.io/projected/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-kube-api-access-ks6v2\") pod \"71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a\" (UID: \"71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.587011 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "f559dfa3-3917-43a2-97f6-61ddfda10e93" (UID: "f559dfa3-3917-43a2-97f6-61ddfda10e93"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.587035 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/736c54fe-349c-4bb9-870a-d1c1d1c03831-client-ca" (OuterVolumeSpecName: "client-ca") pod "736c54fe-349c-4bb9-870a-d1c1d1c03831" (UID: "736c54fe-349c-4bb9-870a-d1c1d1c03831"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.587091 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-tmpfs\") pod \"7fcc6409-8a0f-44c3-89e7-5aecd7610f8a\" (UID: \"7fcc6409-8a0f-44c3-89e7-5aecd7610f8a\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.587126 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d565531a-ff86-4608-9d19-767de01ac31b-auth-proxy-config\") pod \"d565531a-ff86-4608-9d19-767de01ac31b\" (UID: \"d565531a-ff86-4608-9d19-767de01ac31b\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.587156 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-cni-sysctl-allowlist\") pod \"869851b9-7ffb-4af0-b166-1d8aa40a5f80\" (UID: \"869851b9-7ffb-4af0-b166-1d8aa40a5f80\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.587189 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-config\") pod \"567683bd-0efc-4f21-b076-e28559628404\" (UID: \"567683bd-0efc-4f21-b076-e28559628404\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.587335 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d19cb085-0c5b-4810-b654-ce7923221d90" (UID: "d19cb085-0c5b-4810-b654-ce7923221d90"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.587351 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f559dfa3-3917-43a2-97f6-61ddfda10e93" (UID: "f559dfa3-3917-43a2-97f6-61ddfda10e93"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.587885 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dd0fbac-8c0d-4228-8faa-abbeedabf7db-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "0dd0fbac-8c0d-4228-8faa-abbeedabf7db" (UID: "0dd0fbac-8c0d-4228-8faa-abbeedabf7db"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.587972 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "18f80adb-c1c3-49ba-8ee4-932c851d3897" (UID: "18f80adb-c1c3-49ba-8ee4-932c851d3897"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.587982 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-kube-api-access-ks6v2" (OuterVolumeSpecName: "kube-api-access-ks6v2") pod "71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a" (UID: "71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a"). InnerVolumeSpecName "kube-api-access-ks6v2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.587968 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "7fcc6409-8a0f-44c3-89e7-5aecd7610f8a" (UID: "7fcc6409-8a0f-44c3-89e7-5aecd7610f8a"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.588122 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-kube-api-access-l9stx" (OuterVolumeSpecName: "kube-api-access-l9stx") pod "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" (UID: "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca"). InnerVolumeSpecName "kube-api-access-l9stx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.588673 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "869851b9-7ffb-4af0-b166-1d8aa40a5f80" (UID: "869851b9-7ffb-4af0-b166-1d8aa40a5f80"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.588905 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-etcd-ca\") pod \"567683bd-0efc-4f21-b076-e28559628404\" (UID: \"567683bd-0efc-4f21-b076-e28559628404\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.588926 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09cfa50b-4138-4585-a53e-64dd3ab73335-kube-api-access-zsb9b" (OuterVolumeSpecName: "kube-api-access-zsb9b") pod "09cfa50b-4138-4585-a53e-64dd3ab73335" (UID: "09cfa50b-4138-4585-a53e-64dd3ab73335"). InnerVolumeSpecName "kube-api-access-zsb9b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.588970 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a7a88189-c967-4640-879e-27665747f20c-tmpfs\") pod \"a7a88189-c967-4640-879e-27665747f20c\" (UID: \"a7a88189-c967-4640-879e-27665747f20c\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.588971 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-config" (OuterVolumeSpecName: "config") pod "567683bd-0efc-4f21-b076-e28559628404" (UID: "567683bd-0efc-4f21-b076-e28559628404"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.589030 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftwb6\" (UniqueName: \"kubernetes.io/projected/9f71a554-e414-4bc3-96d2-674060397afe-kube-api-access-ftwb6\") pod \"9f71a554-e414-4bc3-96d2-674060397afe\" (UID: \"9f71a554-e414-4bc3-96d2-674060397afe\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.589064 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d7e8f42f-dc0e-424b-bb56-5ec849834888-kube-api-access\") pod \"d7e8f42f-dc0e-424b-bb56-5ec849834888\" (UID: \"d7e8f42f-dc0e-424b-bb56-5ec849834888\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.589229 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgrkj\" (UniqueName: \"kubernetes.io/projected/42a11a02-47e1-488f-b270-2679d3298b0e-kube-api-access-qgrkj\") pod \"42a11a02-47e1-488f-b270-2679d3298b0e\" (UID: \"42a11a02-47e1-488f-b270-2679d3298b0e\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.589271 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b4750666-1362-4001-abd0-6f89964cc621-mcc-auth-proxy-config\") pod \"b4750666-1362-4001-abd0-6f89964cc621\" (UID: \"b4750666-1362-4001-abd0-6f89964cc621\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.589303 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-image-import-ca\") pod \"d19cb085-0c5b-4810-b654-ce7923221d90\" (UID: \"d19cb085-0c5b-4810-b654-ce7923221d90\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.589335 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31fa8943-81cc-4750-a0b7-0fa9ab5af883-catalog-content\") pod \"31fa8943-81cc-4750-a0b7-0fa9ab5af883\" (UID: \"31fa8943-81cc-4750-a0b7-0fa9ab5af883\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.589367 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c491984c-7d4b-44aa-8c1e-d7974424fa47-config\") pod \"c491984c-7d4b-44aa-8c1e-d7974424fa47\" (UID: \"c491984c-7d4b-44aa-8c1e-d7974424fa47\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.589422 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-serving-cert\") pod \"d45be74c-0d98-4d18-90e4-f7ef1b6daaf7\" (UID: \"d45be74c-0d98-4d18-90e4-f7ef1b6daaf7\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.589457 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-serving-cert\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.589492 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pskd\" (UniqueName: \"kubernetes.io/projected/a555ff2e-0be6-46d5-897d-863bb92ae2b3-kube-api-access-8pskd\") pod \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\" (UID: \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.589529 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7e8f42f-dc0e-424b-bb56-5ec849834888-serving-cert\") pod \"d7e8f42f-dc0e-424b-bb56-5ec849834888\" (UID: 
\"d7e8f42f-dc0e-424b-bb56-5ec849834888\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.589562 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f65c0ac1-8bca-454d-a2e6-e35cb418beac-kube-api-access\") pod \"f65c0ac1-8bca-454d-a2e6-e35cb418beac\" (UID: \"f65c0ac1-8bca-454d-a2e6-e35cb418beac\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.589593 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7cps\" (UniqueName: \"kubernetes.io/projected/af41de71-79cf-4590-bbe9-9e8b848862cb-kube-api-access-d7cps\") pod \"af41de71-79cf-4590-bbe9-9e8b848862cb\" (UID: \"af41de71-79cf-4590-bbe9-9e8b848862cb\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.589624 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-etcd-client\") pod \"f559dfa3-3917-43a2-97f6-61ddfda10e93\" (UID: \"f559dfa3-3917-43a2-97f6-61ddfda10e93\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.589658 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4smf\" (UniqueName: \"kubernetes.io/projected/0dd0fbac-8c0d-4228-8faa-abbeedabf7db-kube-api-access-q4smf\") pod \"0dd0fbac-8c0d-4228-8faa-abbeedabf7db\" (UID: \"0dd0fbac-8c0d-4228-8faa-abbeedabf7db\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.589691 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/149b3c48-e17c-4a66-a835-d86dabf6ff13-utilities\") pod \"149b3c48-e17c-4a66-a835-d86dabf6ff13\" (UID: \"149b3c48-e17c-4a66-a835-d86dabf6ff13\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.589722 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-idp-0-file-data\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.589760 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/42a11a02-47e1-488f-b270-2679d3298b0e-control-plane-machine-set-operator-tls\") pod \"42a11a02-47e1-488f-b270-2679d3298b0e\" (UID: \"42a11a02-47e1-488f-b270-2679d3298b0e\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.589797 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-client-ca\") pod \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\" (UID: \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.589829 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a7a88189-c967-4640-879e-27665747f20c-apiservice-cert\") pod \"a7a88189-c967-4640-879e-27665747f20c\" (UID: \"a7a88189-c967-4640-879e-27665747f20c\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.589863 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted-pem\" (UniqueName: \"kubernetes.io/empty-dir/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-ca-trust-extracted-pem\") pod \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\" (UID: \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.589894 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-audit-policies\") pod 
\"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.589925 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-config\") pod \"fc8db2c7-859d-47b3-a900-2bd0c0b2973b\" (UID: \"fc8db2c7-859d-47b3-a900-2bd0c0b2973b\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.589958 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d565531a-ff86-4608-9d19-767de01ac31b-proxy-tls\") pod \"d565531a-ff86-4608-9d19-767de01ac31b\" (UID: \"d565531a-ff86-4608-9d19-767de01ac31b\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.589990 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/584e1f4a-8205-47d7-8efb-3afc6017c4c9-utilities\") pod \"584e1f4a-8205-47d7-8efb-3afc6017c4c9\" (UID: \"584e1f4a-8205-47d7-8efb-3afc6017c4c9\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.590021 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/736c54fe-349c-4bb9-870a-d1c1d1c03831-config\") pod \"736c54fe-349c-4bb9-870a-d1c1d1c03831\" (UID: \"736c54fe-349c-4bb9-870a-d1c1d1c03831\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.590130 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.590163 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-bound-sa-token\") pod \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\" (UID: \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.590196 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9e9b5059-1b3e-4067-a63d-2952cbe863af-ca-trust-extracted\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.590228 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-tmp\") pod \"d45be74c-0d98-4d18-90e4-f7ef1b6daaf7\" (UID: \"d45be74c-0d98-4d18-90e4-f7ef1b6daaf7\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.590259 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94a6e063-3d1a-4d44-875d-185291448c31-catalog-content\") pod \"94a6e063-3d1a-4d44-875d-185291448c31\" (UID: \"94a6e063-3d1a-4d44-875d-185291448c31\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.590291 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-config\") pod \"d19cb085-0c5b-4810-b654-ce7923221d90\" (UID: \"d19cb085-0c5b-4810-b654-ce7923221d90\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.590322 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-service-ca\") pod \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\" (UID: \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 
16:52:20.590356 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjwtd\" (UniqueName: \"kubernetes.io/projected/869851b9-7ffb-4af0-b166-1d8aa40a5f80-kube-api-access-mjwtd\") pod \"869851b9-7ffb-4af0-b166-1d8aa40a5f80\" (UID: \"869851b9-7ffb-4af0-b166-1d8aa40a5f80\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.590413 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-auth-proxy-config\") pod \"fc8db2c7-859d-47b3-a900-2bd0c0b2973b\" (UID: \"fc8db2c7-859d-47b3-a900-2bd0c0b2973b\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.590448 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26xrl\" (UniqueName: \"kubernetes.io/projected/a208c9c2-333b-4b4a-be0d-bc32ec38a821-kube-api-access-26xrl\") pod \"a208c9c2-333b-4b4a-be0d-bc32ec38a821\" (UID: \"a208c9c2-333b-4b4a-be0d-bc32ec38a821\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.590487 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/593a3561-7760-45c5-8f91-5aaef7475d0f-certs\") pod \"593a3561-7760-45c5-8f91-5aaef7475d0f\" (UID: \"593a3561-7760-45c5-8f91-5aaef7475d0f\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.590519 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twvbl\" (UniqueName: \"kubernetes.io/projected/b4750666-1362-4001-abd0-6f89964cc621-kube-api-access-twvbl\") pod \"b4750666-1362-4001-abd0-6f89964cc621\" (UID: \"b4750666-1362-4001-abd0-6f89964cc621\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.590552 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c491984c-7d4b-44aa-8c1e-d7974424fa47-images\") pod 
\"c491984c-7d4b-44aa-8c1e-d7974424fa47\" (UID: \"c491984c-7d4b-44aa-8c1e-d7974424fa47\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.590585 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g8ts\" (UniqueName: \"kubernetes.io/projected/92dfbade-90b6-4169-8c07-72cff7f2c82b-kube-api-access-4g8ts\") pod \"92dfbade-90b6-4169-8c07-72cff7f2c82b\" (UID: \"92dfbade-90b6-4169-8c07-72cff7f2c82b\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.590619 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grwfz\" (UniqueName: \"kubernetes.io/projected/31fa8943-81cc-4750-a0b7-0fa9ab5af883-kube-api-access-grwfz\") pod \"31fa8943-81cc-4750-a0b7-0fa9ab5af883\" (UID: \"31fa8943-81cc-4750-a0b7-0fa9ab5af883\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.589130 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d565531a-ff86-4608-9d19-767de01ac31b-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "d565531a-ff86-4608-9d19-767de01ac31b" (UID: "d565531a-ff86-4608-9d19-767de01ac31b"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.589786 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f71a554-e414-4bc3-96d2-674060397afe-kube-api-access-ftwb6" (OuterVolumeSpecName: "kube-api-access-ftwb6") pod "9f71a554-e414-4bc3-96d2-674060397afe" (UID: "9f71a554-e414-4bc3-96d2-674060397afe"). InnerVolumeSpecName "kube-api-access-ftwb6". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.589895 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7e8f42f-dc0e-424b-bb56-5ec849834888-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d7e8f42f-dc0e-424b-bb56-5ec849834888" (UID: "d7e8f42f-dc0e-424b-bb56-5ec849834888"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.589995 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e9b5059-1b3e-4067-a63d-2952cbe863af-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.590475 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a555ff2e-0be6-46d5-897d-863bb92ae2b3-kube-api-access-8pskd" (OuterVolumeSpecName: "kube-api-access-8pskd") pod "a555ff2e-0be6-46d5-897d-863bb92ae2b3" (UID: "a555ff2e-0be6-46d5-897d-863bb92ae2b3"). InnerVolumeSpecName "kube-api-access-8pskd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.590672 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42a11a02-47e1-488f-b270-2679d3298b0e-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "42a11a02-47e1-488f-b270-2679d3298b0e" (UID: "42a11a02-47e1-488f-b270-2679d3298b0e"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.590991 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42a11a02-47e1-488f-b270-2679d3298b0e-kube-api-access-qgrkj" (OuterVolumeSpecName: "kube-api-access-qgrkj") pod "42a11a02-47e1-488f-b270-2679d3298b0e" (UID: "42a11a02-47e1-488f-b270-2679d3298b0e"). InnerVolumeSpecName "kube-api-access-qgrkj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.591233 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "f559dfa3-3917-43a2-97f6-61ddfda10e93" (UID: "f559dfa3-3917-43a2-97f6-61ddfda10e93"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.591528 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7e8f42f-dc0e-424b-bb56-5ec849834888-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d7e8f42f-dc0e-424b-bb56-5ec849834888" (UID: "d7e8f42f-dc0e-424b-bb56-5ec849834888"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.592065 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f65c0ac1-8bca-454d-a2e6-e35cb418beac-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "f65c0ac1-8bca-454d-a2e6-e35cb418beac" (UID: "f65c0ac1-8bca-454d-a2e6-e35cb418beac"). InnerVolumeSpecName "tmp-dir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.591649 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:21.091605735 +0000 UTC m=+83.632917124 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.592343 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-config" (OuterVolumeSpecName: "config") pod "fc8db2c7-859d-47b3-a900-2bd0c0b2973b" (UID: "fc8db2c7-859d-47b3-a900-2bd0c0b2973b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.591670 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.591807 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.592244 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" (UID: "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.592454 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f65c0ac1-8bca-454d-a2e6-e35cb418beac-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f65c0ac1-8bca-454d-a2e6-e35cb418beac" (UID: "f65c0ac1-8bca-454d-a2e6-e35cb418beac"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.592363 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09cfa50b-4138-4585-a53e-64dd3ab73335-config\") pod \"09cfa50b-4138-4585-a53e-64dd3ab73335\" (UID: \"09cfa50b-4138-4585-a53e-64dd3ab73335\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.592943 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4750666-1362-4001-abd0-6f89964cc621-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "b4750666-1362-4001-abd0-6f89964cc621" (UID: "b4750666-1362-4001-abd0-6f89964cc621"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.593123 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "567683bd-0efc-4f21-b076-e28559628404" (UID: "567683bd-0efc-4f21-b076-e28559628404"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.593488 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/736c54fe-349c-4bb9-870a-d1c1d1c03831-config" (OuterVolumeSpecName: "config") pod "736c54fe-349c-4bb9-870a-d1c1d1c03831" (UID: "736c54fe-349c-4bb9-870a-d1c1d1c03831"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.593725 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-ca-trust-extracted-pem" (OuterVolumeSpecName: "ca-trust-extracted-pem") pod "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" (UID: "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e"). InnerVolumeSpecName "ca-trust-extracted-pem". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.593800 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dd0fbac-8c0d-4228-8faa-abbeedabf7db-kube-api-access-q4smf" (OuterVolumeSpecName: "kube-api-access-q4smf") pod "0dd0fbac-8c0d-4228-8faa-abbeedabf7db" (UID: "0dd0fbac-8c0d-4228-8faa-abbeedabf7db"). InnerVolumeSpecName "kube-api-access-q4smf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.593845 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c491984c-7d4b-44aa-8c1e-d7974424fa47-config" (OuterVolumeSpecName: "config") pod "c491984c-7d4b-44aa-8c1e-d7974424fa47" (UID: "c491984c-7d4b-44aa-8c1e-d7974424fa47"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.593917 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "d19cb085-0c5b-4810-b654-ce7923221d90" (UID: "d19cb085-0c5b-4810-b654-ce7923221d90"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.594252 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4750666-1362-4001-abd0-6f89964cc621-kube-api-access-twvbl" (OuterVolumeSpecName: "kube-api-access-twvbl") pod "b4750666-1362-4001-abd0-6f89964cc621" (UID: "b4750666-1362-4001-abd0-6f89964cc621"). InnerVolumeSpecName "kube-api-access-twvbl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.594519 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af41de71-79cf-4590-bbe9-9e8b848862cb-kube-api-access-d7cps" (OuterVolumeSpecName: "kube-api-access-d7cps") pod "af41de71-79cf-4590-bbe9-9e8b848862cb" (UID: "af41de71-79cf-4590-bbe9-9e8b848862cb"). InnerVolumeSpecName "kube-api-access-d7cps". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.595034 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.595087 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18f80adb-c1c3-49ba-8ee4-932c851d3897-service-ca-bundle\") pod \"18f80adb-c1c3-49ba-8ee4-932c851d3897\" (UID: \"18f80adb-c1c3-49ba-8ee4-932c851d3897\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.595117 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-config\") pod \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\" (UID: \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.595141 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94a6e063-3d1a-4d44-875d-185291448c31-utilities\") pod \"94a6e063-3d1a-4d44-875d-185291448c31\" (UID: \"94a6e063-3d1a-4d44-875d-185291448c31\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.595172 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9f71a554-e414-4bc3-96d2-674060397afe-metrics-tls\") pod \"9f71a554-e414-4bc3-96d2-674060397afe\" (UID: \"9f71a554-e414-4bc3-96d2-674060397afe\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.595195 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-router-certs\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.595219 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-tknt7\" (UniqueName: \"kubernetes.io/projected/584e1f4a-8205-47d7-8efb-3afc6017c4c9-kube-api-access-tknt7\") pod \"584e1f4a-8205-47d7-8efb-3afc6017c4c9\" (UID: \"584e1f4a-8205-47d7-8efb-3afc6017c4c9\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.595243 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-env-overrides\") pod \"af33e427-6803-48c2-a76a-dd9deb7cbf9a\" (UID: \"af33e427-6803-48c2-a76a-dd9deb7cbf9a\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.595291 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09cfa50b-4138-4585-a53e-64dd3ab73335-config" (OuterVolumeSpecName: "config") pod "09cfa50b-4138-4585-a53e-64dd3ab73335" (UID: "09cfa50b-4138-4585-a53e-64dd3ab73335"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.594395 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/149b3c48-e17c-4a66-a835-d86dabf6ff13-utilities" (OuterVolumeSpecName: "utilities") pod "149b3c48-e17c-4a66-a835-d86dabf6ff13" (UID: "149b3c48-e17c-4a66-a835-d86dabf6ff13"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.595607 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/593a3561-7760-45c5-8f91-5aaef7475d0f-certs" (OuterVolumeSpecName: "certs") pod "593a3561-7760-45c5-8f91-5aaef7475d0f" (UID: "593a3561-7760-45c5-8f91-5aaef7475d0f"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.595615 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4-metrics-certs\") pod \"f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4\" (UID: \"f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.595693 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ce090a97-9ab6-4c40-a719-64ff2acd9778-signing-cabundle\") pod \"ce090a97-9ab6-4c40-a719-64ff2acd9778\" (UID: \"ce090a97-9ab6-4c40-a719-64ff2acd9778\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.595730 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d565531a-ff86-4608-9d19-767de01ac31b-images\") pod \"d565531a-ff86-4608-9d19-767de01ac31b\" (UID: \"d565531a-ff86-4608-9d19-767de01ac31b\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.595755 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-config" (OuterVolumeSpecName: "config") pod "d19cb085-0c5b-4810-b654-ce7923221d90" (UID: "d19cb085-0c5b-4810-b654-ce7923221d90"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.595775 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-oauth-serving-cert\") pod \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\" (UID: \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.595881 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7df94c10-441d-4386-93a6-6730fb7bcde0-ovn-control-plane-metrics-cert\") pod \"7df94c10-441d-4386-93a6-6730fb7bcde0\" (UID: \"7df94c10-441d-4386-93a6-6730fb7bcde0\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.595933 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/736c54fe-349c-4bb9-870a-d1c1d1c03831-serving-cert\") pod \"736c54fe-349c-4bb9-870a-d1c1d1c03831\" (UID: \"736c54fe-349c-4bb9-870a-d1c1d1c03831\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.595967 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g4lr\" (UniqueName: \"kubernetes.io/projected/f7e2c886-118e-43bb-bef1-c78134de392b-kube-api-access-6g4lr\") pod \"f7e2c886-118e-43bb-bef1-c78134de392b\" (UID: \"f7e2c886-118e-43bb-bef1-c78134de392b\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.595975 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d565531a-ff86-4608-9d19-767de01ac31b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d565531a-ff86-4608-9d19-767de01ac31b" (UID: "d565531a-ff86-4608-9d19-767de01ac31b"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.596003 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-trusted-ca-bundle\") pod \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\" (UID: \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.596037 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-serving-cert\") pod \"d19cb085-0c5b-4810-b654-ce7923221d90\" (UID: \"d19cb085-0c5b-4810-b654-ce7923221d90\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.596069 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nspp\" (UniqueName: \"kubernetes.io/projected/a7a88189-c967-4640-879e-27665747f20c-kube-api-access-8nspp\") pod \"a7a88189-c967-4640-879e-27665747f20c\" (UID: \"a7a88189-c967-4640-879e-27665747f20c\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.596113 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b605f283-6f2e-42da-a838-54421690f7d0-utilities\") pod \"b605f283-6f2e-42da-a838-54421690f7d0\" (UID: \"b605f283-6f2e-42da-a838-54421690f7d0\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.596147 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31fa8943-81cc-4750-a0b7-0fa9ab5af883-utilities\") pod \"31fa8943-81cc-4750-a0b7-0fa9ab5af883\" (UID: \"31fa8943-81cc-4750-a0b7-0fa9ab5af883\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.596183 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zth6t\" 
(UniqueName: \"kubernetes.io/projected/6077b63e-53a2-4f96-9d56-1ce0324e4913-kube-api-access-zth6t\") pod \"6077b63e-53a2-4f96-9d56-1ce0324e4913\" (UID: \"6077b63e-53a2-4f96-9d56-1ce0324e4913\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.596217 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbc2l\" (UniqueName: \"kubernetes.io/projected/593a3561-7760-45c5-8f91-5aaef7475d0f-kube-api-access-sbc2l\") pod \"593a3561-7760-45c5-8f91-5aaef7475d0f\" (UID: \"593a3561-7760-45c5-8f91-5aaef7475d0f\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.596250 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-cni-binary-copy\") pod \"869851b9-7ffb-4af0-b166-1d8aa40a5f80\" (UID: \"869851b9-7ffb-4af0-b166-1d8aa40a5f80\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.596320 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-config\") pod \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\" (UID: \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.596361 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7599e0b6-bddf-4def-b7f2-0b32206e8651-config\") pod \"7599e0b6-bddf-4def-b7f2-0b32206e8651\" (UID: \"7599e0b6-bddf-4def-b7f2-0b32206e8651\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.596420 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-machine-approver-tls\") pod \"fc8db2c7-859d-47b3-a900-2bd0c0b2973b\" (UID: \"fc8db2c7-859d-47b3-a900-2bd0c0b2973b\") " Mar 12 16:52:20 crc kubenswrapper[5184]: 
I0312 16:52:20.596458 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lcfw\" (UniqueName: \"kubernetes.io/projected/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-kube-api-access-5lcfw\") pod \"dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9\" (UID: \"dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.596493 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/149b3c48-e17c-4a66-a835-d86dabf6ff13-catalog-content\") pod \"149b3c48-e17c-4a66-a835-d86dabf6ff13\" (UID: \"149b3c48-e17c-4a66-a835-d86dabf6ff13\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.596534 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9e9b5059-1b3e-4067-a63d-2952cbe863af-trusted-ca\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.596568 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-etcd-serving-ca\") pod \"f559dfa3-3917-43a2-97f6-61ddfda10e93\" (UID: \"f559dfa3-3917-43a2-97f6-61ddfda10e93\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.596606 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/567683bd-0efc-4f21-b076-e28559628404-etcd-client\") pod \"567683bd-0efc-4f21-b076-e28559628404\" (UID: \"567683bd-0efc-4f21-b076-e28559628404\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.596690 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmmzf\" (UniqueName: 
\"kubernetes.io/projected/7df94c10-441d-4386-93a6-6730fb7bcde0-kube-api-access-nmmzf\") pod \"7df94c10-441d-4386-93a6-6730fb7bcde0\" (UID: \"7df94c10-441d-4386-93a6-6730fb7bcde0\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.596728 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f7e2c886-118e-43bb-bef1-c78134de392b-tmp-dir\") pod \"f7e2c886-118e-43bb-bef1-c78134de392b\" (UID: \"f7e2c886-118e-43bb-bef1-c78134de392b\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.596004 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d45be74c-0d98-4d18-90e4-f7ef1b6daaf7" (UID: "d45be74c-0d98-4d18-90e4-f7ef1b6daaf7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.596180 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4" (UID: "f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.596213 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c491984c-7d4b-44aa-8c1e-d7974424fa47-images" (OuterVolumeSpecName: "images") pod "c491984c-7d4b-44aa-8c1e-d7974424fa47" (UID: "c491984c-7d4b-44aa-8c1e-d7974424fa47"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.596275 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.596486 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-tmp" (OuterVolumeSpecName: "tmp") pod "d45be74c-0d98-4d18-90e4-f7ef1b6daaf7" (UID: "d45be74c-0d98-4d18-90e4-f7ef1b6daaf7"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.596678 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/584e1f4a-8205-47d7-8efb-3afc6017c4c9-utilities" (OuterVolumeSpecName: "utilities") pod "584e1f4a-8205-47d7-8efb-3afc6017c4c9" (UID: "584e1f4a-8205-47d7-8efb-3afc6017c4c9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.596704 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a208c9c2-333b-4b4a-be0d-bc32ec38a821-kube-api-access-26xrl" (OuterVolumeSpecName: "kube-api-access-26xrl") pod "a208c9c2-333b-4b4a-be0d-bc32ec38a821" (UID: "a208c9c2-333b-4b4a-be0d-bc32ec38a821"). InnerVolumeSpecName "kube-api-access-26xrl". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.597262 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f71a554-e414-4bc3-96d2-674060397afe-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "9f71a554-e414-4bc3-96d2-674060397afe" (UID: "9f71a554-e414-4bc3-96d2-674060397afe"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.597528 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-client-ca" (OuterVolumeSpecName: "client-ca") pod "a555ff2e-0be6-46d5-897d-863bb92ae2b3" (UID: "a555ff2e-0be6-46d5-897d-863bb92ae2b3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.597571 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/584e1f4a-8205-47d7-8efb-3afc6017c4c9-kube-api-access-tknt7" (OuterVolumeSpecName: "kube-api-access-tknt7") pod "584e1f4a-8205-47d7-8efb-3afc6017c4c9" (UID: "584e1f4a-8205-47d7-8efb-3afc6017c4c9"). InnerVolumeSpecName "kube-api-access-tknt7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.597831 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92dfbade-90b6-4169-8c07-72cff7f2c82b-kube-api-access-4g8ts" (OuterVolumeSpecName: "kube-api-access-4g8ts") pod "92dfbade-90b6-4169-8c07-72cff7f2c82b" (UID: "92dfbade-90b6-4169-8c07-72cff7f2c82b"). InnerVolumeSpecName "kube-api-access-4g8ts". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.597825 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-service-ca" (OuterVolumeSpecName: "service-ca") pod "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" (UID: "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.597927 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "fc8db2c7-859d-47b3-a900-2bd0c0b2973b" (UID: "fc8db2c7-859d-47b3-a900-2bd0c0b2973b"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.598129 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "af33e427-6803-48c2-a76a-dd9deb7cbf9a" (UID: "af33e427-6803-48c2-a76a-dd9deb7cbf9a"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.598563 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a208c9c2-333b-4b4a-be0d-bc32ec38a821-package-server-manager-serving-cert\") pod \"a208c9c2-333b-4b4a-be0d-bc32ec38a821\" (UID: \"a208c9c2-333b-4b4a-be0d-bc32ec38a821\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.598613 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/301e1965-1754-483d-b6cc-bfae7038bbca-profile-collector-cert\") pod \"301e1965-1754-483d-b6cc-bfae7038bbca\" (UID: \"301e1965-1754-483d-b6cc-bfae7038bbca\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.598649 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a555ff2e-0be6-46d5-897d-863bb92ae2b3-tmp\") pod \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\" (UID: \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.598681 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01080b46-74f1-4191-8755-5152a57b3b25-config\") pod \"01080b46-74f1-4191-8755-5152a57b3b25\" (UID: \"01080b46-74f1-4191-8755-5152a57b3b25\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.598717 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9f71a554-e414-4bc3-96d2-674060397afe-trusted-ca\") pod \"9f71a554-e414-4bc3-96d2-674060397afe\" (UID: \"9f71a554-e414-4bc3-96d2-674060397afe\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.598749 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-whereabouts-flatfile-configmap\") pod \"869851b9-7ffb-4af0-b166-1d8aa40a5f80\" (UID: \"869851b9-7ffb-4af0-b166-1d8aa40a5f80\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.598780 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5f2bfad-70f6-4185-a3d9-81ce12720767-serving-cert\") pod \"c5f2bfad-70f6-4185-a3d9-81ce12720767\" (UID: \"c5f2bfad-70f6-4185-a3d9-81ce12720767\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.598817 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-service-ca\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.598855 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94l9h\" (UniqueName: \"kubernetes.io/projected/16bdd140-dce1-464c-ab47-dd5798d1d256-kube-api-access-94l9h\") pod \"16bdd140-dce1-464c-ab47-dd5798d1d256\" (UID: \"16bdd140-dce1-464c-ab47-dd5798d1d256\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.598905 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-catalog-content\") pod \"71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a\" (UID: \"71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.598942 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5f2bfad-70f6-4185-a3d9-81ce12720767-kube-api-access\") pod 
\"c5f2bfad-70f6-4185-a3d9-81ce12720767\" (UID: \"c5f2bfad-70f6-4185-a3d9-81ce12720767\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.598976 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqbfk\" (UniqueName: \"kubernetes.io/projected/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-kube-api-access-qqbfk\") pod \"b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a\" (UID: \"b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.599011 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-mcd-auth-proxy-config\") pod \"e1d2a42d-af1d-4054-9618-ab545e0ed8b7\" (UID: \"e1d2a42d-af1d-4054-9618-ab545e0ed8b7\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.599042 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/92dfbade-90b6-4169-8c07-72cff7f2c82b-metrics-tls\") pod \"92dfbade-90b6-4169-8c07-72cff7f2c82b\" (UID: \"92dfbade-90b6-4169-8c07-72cff7f2c82b\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.599074 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b605f283-6f2e-42da-a838-54421690f7d0-catalog-content\") pod \"b605f283-6f2e-42da-a838-54421690f7d0\" (UID: \"b605f283-6f2e-42da-a838-54421690f7d0\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.599108 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-service-ca-bundle\") pod \"dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9\" (UID: \"dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.599140 5184 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-etcd-service-ca\") pod \"567683bd-0efc-4f21-b076-e28559628404\" (UID: \"567683bd-0efc-4f21-b076-e28559628404\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.599174 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2325ffef-9d5b-447f-b00e-3efc429acefe-config\") pod \"2325ffef-9d5b-447f-b00e-3efc429acefe\" (UID: \"2325ffef-9d5b-447f-b00e-3efc429acefe\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.599214 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-serving-cert\") pod \"f559dfa3-3917-43a2-97f6-61ddfda10e93\" (UID: \"f559dfa3-3917-43a2-97f6-61ddfda10e93\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.599247 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-image-registry-operator-tls\") pod \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\" (UID: \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.599286 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nb9c\" (UniqueName: \"kubernetes.io/projected/6edfcf45-925b-4eff-b940-95b6fc0b85d4-kube-api-access-8nb9c\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.599320 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-catalog-content\") pod \"9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff\" 
(UID: \"9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.599361 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfzkj\" (UniqueName: \"kubernetes.io/projected/0effdbcf-dd7d-404d-9d48-77536d665a5d-kube-api-access-mfzkj\") pod \"0effdbcf-dd7d-404d-9d48-77536d665a5d\" (UID: \"0effdbcf-dd7d-404d-9d48-77536d665a5d\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.599424 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7afa918d-be67-40a6-803c-d3b0ae99d815-kube-api-access\") pod \"7afa918d-be67-40a6-803c-d3b0ae99d815\" (UID: \"7afa918d-be67-40a6-803c-d3b0ae99d815\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.599462 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-trusted-ca-bundle\") pod \"dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9\" (UID: \"dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.599497 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-tmp\") pod \"b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a\" (UID: \"b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.599535 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/16bdd140-dce1-464c-ab47-dd5798d1d256-available-featuregates\") pod \"16bdd140-dce1-464c-ab47-dd5798d1d256\" (UID: \"16bdd140-dce1-464c-ab47-dd5798d1d256\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.599574 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" 
(UniqueName: \"kubernetes.io/secret/a7a88189-c967-4640-879e-27665747f20c-webhook-cert\") pod \"a7a88189-c967-4640-879e-27665747f20c\" (UID: \"a7a88189-c967-4640-879e-27665747f20c\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.599607 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wj4qr\" (UniqueName: \"kubernetes.io/projected/149b3c48-e17c-4a66-a835-d86dabf6ff13-kube-api-access-wj4qr\") pod \"149b3c48-e17c-4a66-a835-d86dabf6ff13\" (UID: \"149b3c48-e17c-4a66-a835-d86dabf6ff13\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.599640 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbmqg\" (UniqueName: \"kubernetes.io/projected/18f80adb-c1c3-49ba-8ee4-932c851d3897-kube-api-access-wbmqg\") pod \"18f80adb-c1c3-49ba-8ee4-932c851d3897\" (UID: \"18f80adb-c1c3-49ba-8ee4-932c851d3897\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.599681 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pddnv\" (UniqueName: \"kubernetes.io/projected/e093be35-bb62-4843-b2e8-094545761610-kube-api-access-pddnv\") pod \"e093be35-bb62-4843-b2e8-094545761610\" (UID: \"e093be35-bb62-4843-b2e8-094545761610\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.599714 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7afa918d-be67-40a6-803c-d3b0ae99d815-serving-cert\") pod \"7afa918d-be67-40a6-803c-d3b0ae99d815\" (UID: \"7afa918d-be67-40a6-803c-d3b0ae99d815\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.599755 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgx6b\" (UniqueName: \"kubernetes.io/projected/f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4-kube-api-access-pgx6b\") pod \"f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4\" (UID: 
\"f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.599789 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hm9x7\" (UniqueName: \"kubernetes.io/projected/f559dfa3-3917-43a2-97f6-61ddfda10e93-kube-api-access-hm9x7\") pod \"f559dfa3-3917-43a2-97f6-61ddfda10e93\" (UID: \"f559dfa3-3917-43a2-97f6-61ddfda10e93\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.599828 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4tqq\" (UniqueName: \"kubernetes.io/projected/6ee8fbd3-1f81-4666-96da-5afc70819f1a-kube-api-access-d4tqq\") pod \"6ee8fbd3-1f81-4666-96da-5afc70819f1a\" (UID: \"6ee8fbd3-1f81-4666-96da-5afc70819f1a\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.599881 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfp5s\" (UniqueName: \"kubernetes.io/projected/cc85e424-18b2-4924-920b-bd291a8c4b01-kube-api-access-xfp5s\") pod \"cc85e424-18b2-4924-920b-bd291a8c4b01\" (UID: \"cc85e424-18b2-4924-920b-bd291a8c4b01\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.599938 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pllx6\" (UniqueName: \"kubernetes.io/projected/81e39f7b-62e4-4fc9-992a-6535ce127a02-kube-api-access-pllx6\") pod \"81e39f7b-62e4-4fc9-992a-6535ce127a02\" (UID: \"81e39f7b-62e4-4fc9-992a-6535ce127a02\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.599987 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2325ffef-9d5b-447f-b00e-3efc429acefe-trusted-ca\") pod \"2325ffef-9d5b-447f-b00e-3efc429acefe\" (UID: \"2325ffef-9d5b-447f-b00e-3efc429acefe\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.600021 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-config\") pod \"d45be74c-0d98-4d18-90e4-f7ef1b6daaf7\" (UID: \"d45be74c-0d98-4d18-90e4-f7ef1b6daaf7\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.600058 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7599e0b6-bddf-4def-b7f2-0b32206e8651-serving-cert\") pod \"7599e0b6-bddf-4def-b7f2-0b32206e8651\" (UID: \"7599e0b6-bddf-4def-b7f2-0b32206e8651\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.600091 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-oauth-config\") pod \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\" (UID: \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.600125 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovnkube-config\") pod \"af33e427-6803-48c2-a76a-dd9deb7cbf9a\" (UID: \"af33e427-6803-48c2-a76a-dd9deb7cbf9a\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.600166 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/81e39f7b-62e4-4fc9-992a-6535ce127a02-multus-daemon-config\") pod \"81e39f7b-62e4-4fc9-992a-6535ce127a02\" (UID: \"81e39f7b-62e4-4fc9-992a-6535ce127a02\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.600234 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-etcd-serving-ca\") pod \"d19cb085-0c5b-4810-b654-ce7923221d90\" (UID: \"d19cb085-0c5b-4810-b654-ce7923221d90\") " Mar 12 
16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.600271 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovn-node-metrics-cert\") pod \"af33e427-6803-48c2-a76a-dd9deb7cbf9a\" (UID: \"af33e427-6803-48c2-a76a-dd9deb7cbf9a\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.600349 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ws8zz\" (UniqueName: \"kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-kube-api-access-ws8zz\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.600410 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-session\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.600444 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-serving-cert\") pod \"dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9\" (UID: \"dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.600479 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l87hs\" (UniqueName: \"kubernetes.io/projected/5ebfebf6-3ecd-458e-943f-bb25b52e2718-kube-api-access-l87hs\") pod \"5ebfebf6-3ecd-458e-943f-bb25b52e2718\" (UID: \"5ebfebf6-3ecd-458e-943f-bb25b52e2718\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.600512 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/cc85e424-18b2-4924-920b-bd291a8c4b01-utilities\") pod \"cc85e424-18b2-4924-920b-bd291a8c4b01\" (UID: \"cc85e424-18b2-4924-920b-bd291a8c4b01\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.600547 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-profile-collector-cert\") pod \"7fcc6409-8a0f-44c3-89e7-5aecd7610f8a\" (UID: \"7fcc6409-8a0f-44c3-89e7-5aecd7610f8a\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.600581 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-serving-cert\") pod \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\" (UID: \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.600617 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9e9b5059-1b3e-4067-a63d-2952cbe863af-installation-pull-secrets\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.600655 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-marketplace-operator-metrics\") pod \"b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a\" (UID: \"b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.600689 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/92dfbade-90b6-4169-8c07-72cff7f2c82b-config-volume\") pod \"92dfbade-90b6-4169-8c07-72cff7f2c82b\" (UID: 
\"92dfbade-90b6-4169-8c07-72cff7f2c82b\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.600722 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-bound-sa-token\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.600756 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c5f2bfad-70f6-4185-a3d9-81ce12720767-tmp-dir\") pod \"c5f2bfad-70f6-4185-a3d9-81ce12720767\" (UID: \"c5f2bfad-70f6-4185-a3d9-81ce12720767\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.601211 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31fa8943-81cc-4750-a0b7-0fa9ab5af883-kube-api-access-grwfz" (OuterVolumeSpecName: "kube-api-access-grwfz") pod "31fa8943-81cc-4750-a0b7-0fa9ab5af883" (UID: "31fa8943-81cc-4750-a0b7-0fa9ab5af883"). InnerVolumeSpecName "kube-api-access-grwfz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.601271 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d7e8f42f-dc0e-424b-bb56-5ec849834888-service-ca\") pod \"d7e8f42f-dc0e-424b-bb56-5ec849834888\" (UID: \"d7e8f42f-dc0e-424b-bb56-5ec849834888\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.601323 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a555ff2e-0be6-46d5-897d-863bb92ae2b3-serving-cert\") pod \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\" (UID: \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.601344 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.601470 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6077b63e-53a2-4f96-9d56-1ce0324e4913-tmp-dir\") pod \"6077b63e-53a2-4f96-9d56-1ce0324e4913\" (UID: \"6077b63e-53a2-4f96-9d56-1ce0324e4913\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.601575 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-login\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.601623 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w94wk\" (UniqueName: \"kubernetes.io/projected/01080b46-74f1-4191-8755-5152a57b3b25-kube-api-access-w94wk\") pod \"01080b46-74f1-4191-8755-5152a57b3b25\" (UID: \"01080b46-74f1-4191-8755-5152a57b3b25\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.601658 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-stats-auth\") pod \"18f80adb-c1c3-49ba-8ee4-932c851d3897\" (UID: \"18f80adb-c1c3-49ba-8ee4-932c851d3897\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.601693 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99zj9\" (UniqueName: \"kubernetes.io/projected/d565531a-ff86-4608-9d19-767de01ac31b-kube-api-access-99zj9\") pod \"d565531a-ff86-4608-9d19-767de01ac31b\" (UID: \"d565531a-ff86-4608-9d19-767de01ac31b\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.601985 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for 
volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4750666-1362-4001-abd0-6f89964cc621-proxy-tls\") pod \"b4750666-1362-4001-abd0-6f89964cc621\" (UID: \"b4750666-1362-4001-abd0-6f89964cc621\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.602035 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c491984c-7d4b-44aa-8c1e-d7974424fa47-machine-api-operator-tls\") pod \"c491984c-7d4b-44aa-8c1e-d7974424fa47\" (UID: \"c491984c-7d4b-44aa-8c1e-d7974424fa47\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.602362 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f65c0ac1-8bca-454d-a2e6-e35cb418beac-serving-cert\") pod \"f65c0ac1-8bca-454d-a2e6-e35cb418beac\" (UID: \"f65c0ac1-8bca-454d-a2e6-e35cb418beac\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.602463 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6077b63e-53a2-4f96-9d56-1ce0324e4913-metrics-tls\") pod \"6077b63e-53a2-4f96-9d56-1ce0324e4913\" (UID: \"6077b63e-53a2-4f96-9d56-1ce0324e4913\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.602537 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vsz9\" (UniqueName: \"kubernetes.io/projected/c491984c-7d4b-44aa-8c1e-d7974424fa47-kube-api-access-9vsz9\") pod \"c491984c-7d4b-44aa-8c1e-d7974424fa47\" (UID: \"c491984c-7d4b-44aa-8c1e-d7974424fa47\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.602578 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-audit\") pod \"d19cb085-0c5b-4810-b654-ce7923221d90\" (UID: \"d19cb085-0c5b-4810-b654-ce7923221d90\") " Mar 12 16:52:20 crc 
kubenswrapper[5184]: I0312 16:52:20.602613 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-proxy-tls\") pod \"e1d2a42d-af1d-4054-9618-ab545e0ed8b7\" (UID: \"e1d2a42d-af1d-4054-9618-ab545e0ed8b7\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.602650 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a52afe44-fb37-46ed-a1f8-bf39727a3cbe-cert\") pod \"a52afe44-fb37-46ed-a1f8-bf39727a3cbe\" (UID: \"a52afe44-fb37-46ed-a1f8-bf39727a3cbe\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.602688 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzt4w\" (UniqueName: \"kubernetes.io/projected/a52afe44-fb37-46ed-a1f8-bf39727a3cbe-kube-api-access-rzt4w\") pod \"a52afe44-fb37-46ed-a1f8-bf39727a3cbe\" (UID: \"a52afe44-fb37-46ed-a1f8-bf39727a3cbe\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.602725 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnxbn\" (UniqueName: \"kubernetes.io/projected/ce090a97-9ab6-4c40-a719-64ff2acd9778-kube-api-access-xnxbn\") pod \"ce090a97-9ab6-4c40-a719-64ff2acd9778\" (UID: \"ce090a97-9ab6-4c40-a719-64ff2acd9778\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.602766 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7afa918d-be67-40a6-803c-d3b0ae99d815-tmp\") pod \"7afa918d-be67-40a6-803c-d3b0ae99d815\" (UID: \"7afa918d-be67-40a6-803c-d3b0ae99d815\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.602811 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zg8nc\" (UniqueName: \"kubernetes.io/projected/2325ffef-9d5b-447f-b00e-3efc429acefe-kube-api-access-zg8nc\") pod 
\"2325ffef-9d5b-447f-b00e-3efc429acefe\" (UID: \"2325ffef-9d5b-447f-b00e-3efc429acefe\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.602851 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7df94c10-441d-4386-93a6-6730fb7bcde0-ovnkube-config\") pod \"7df94c10-441d-4386-93a6-6730fb7bcde0\" (UID: \"7df94c10-441d-4386-93a6-6730fb7bcde0\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.602893 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dztfv\" (UniqueName: \"kubernetes.io/projected/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-kube-api-access-dztfv\") pod \"d45be74c-0d98-4d18-90e4-f7ef1b6daaf7\" (UID: \"d45be74c-0d98-4d18-90e4-f7ef1b6daaf7\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.602935 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5ebfebf6-3ecd-458e-943f-bb25b52e2718-serviceca\") pod \"5ebfebf6-3ecd-458e-943f-bb25b52e2718\" (UID: \"5ebfebf6-3ecd-458e-943f-bb25b52e2718\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.603412 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-proxy-ca-bundles\") pod \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\" (UID: \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.603502 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc85e424-18b2-4924-920b-bd291a8c4b01-catalog-content\") pod \"cc85e424-18b2-4924-920b-bd291a8c4b01\" (UID: \"cc85e424-18b2-4924-920b-bd291a8c4b01\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.603562 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume 
started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-trusted-ca\") pod \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\" (UID: \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.603640 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkdh6\" (UniqueName: \"kubernetes.io/projected/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-kube-api-access-tkdh6\") pod \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\" (UID: \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.603725 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hckvg\" (UniqueName: \"kubernetes.io/projected/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-kube-api-access-hckvg\") pod \"fc8db2c7-859d-47b3-a900-2bd0c0b2973b\" (UID: \"fc8db2c7-859d-47b3-a900-2bd0c0b2973b\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.603768 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-provider-selection\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.603797 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dmhf\" (UniqueName: \"kubernetes.io/projected/736c54fe-349c-4bb9-870a-d1c1d1c03831-kube-api-access-6dmhf\") pod \"736c54fe-349c-4bb9-870a-d1c1d1c03831\" (UID: \"736c54fe-349c-4bb9-870a-d1c1d1c03831\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.603835 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f65c0ac1-8bca-454d-a2e6-e35cb418beac-config\") pod 
\"f65c0ac1-8bca-454d-a2e6-e35cb418beac\" (UID: \"f65c0ac1-8bca-454d-a2e6-e35cb418beac\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.603869 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/301e1965-1754-483d-b6cc-bfae7038bbca-srv-cert\") pod \"301e1965-1754-483d-b6cc-bfae7038bbca\" (UID: \"301e1965-1754-483d-b6cc-bfae7038bbca\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.603911 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-encryption-config\") pod \"d19cb085-0c5b-4810-b654-ce7923221d90\" (UID: \"d19cb085-0c5b-4810-b654-ce7923221d90\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.603945 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ce090a97-9ab6-4c40-a719-64ff2acd9778-signing-key\") pod \"ce090a97-9ab6-4c40-a719-64ff2acd9778\" (UID: \"ce090a97-9ab6-4c40-a719-64ff2acd9778\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.603983 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6ee8fbd3-1f81-4666-96da-5afc70819f1a-samples-operator-tls\") pod \"6ee8fbd3-1f81-4666-96da-5afc70819f1a\" (UID: \"6ee8fbd3-1f81-4666-96da-5afc70819f1a\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.604031 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-marketplace-trusted-ca\") pod \"b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a\" (UID: \"b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.604075 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started 
for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-trusted-ca-bundle\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.604121 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/92dfbade-90b6-4169-8c07-72cff7f2c82b-tmp-dir\") pod \"92dfbade-90b6-4169-8c07-72cff7f2c82b\" (UID: \"92dfbade-90b6-4169-8c07-72cff7f2c82b\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.604170 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16bdd140-dce1-464c-ab47-dd5798d1d256-serving-cert\") pod \"16bdd140-dce1-464c-ab47-dd5798d1d256\" (UID: \"16bdd140-dce1-464c-ab47-dd5798d1d256\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.604208 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-etcd-client\") pod \"d19cb085-0c5b-4810-b654-ce7923221d90\" (UID: \"d19cb085-0c5b-4810-b654-ce7923221d90\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.604261 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5lgh\" (UniqueName: \"kubernetes.io/projected/d19cb085-0c5b-4810-b654-ce7923221d90-kube-api-access-m5lgh\") pod \"d19cb085-0c5b-4810-b654-ce7923221d90\" (UID: \"d19cb085-0c5b-4810-b654-ce7923221d90\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.604347 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/593a3561-7760-45c5-8f91-5aaef7475d0f-node-bootstrap-token\") pod \"593a3561-7760-45c5-8f91-5aaef7475d0f\" (UID: 
\"593a3561-7760-45c5-8f91-5aaef7475d0f\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.604419 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/81e39f7b-62e4-4fc9-992a-6535ce127a02-cni-binary-copy\") pod \"81e39f7b-62e4-4fc9-992a-6535ce127a02\" (UID: \"81e39f7b-62e4-4fc9-992a-6535ce127a02\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.604461 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-encryption-config\") pod \"f559dfa3-3917-43a2-97f6-61ddfda10e93\" (UID: \"f559dfa3-3917-43a2-97f6-61ddfda10e93\") " Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.604563 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/542903c2-fc88-4085-979a-db3766958392-system-cni-dir\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.604615 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/542903c2-fc88-4085-979a-db3766958392-multus-cni-dir\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.604657 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7b45c859-3d05-4214-9bd3-2952546f5dea-rootfs\") pod \"machine-config-daemon-cp7pt\" (UID: \"7b45c859-3d05-4214-9bd3-2952546f5dea\") " pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.604694 5184 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/766663a7-2c04-43da-a76f-dfacc5b1583a-cnibin\") pod \"multus-additional-cni-plugins-ckfz2\" (UID: \"766663a7-2c04-43da-a76f-dfacc5b1583a\") " pod="openshift-multus/multus-additional-cni-plugins-ckfz2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.604732 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qz4cp\" (UniqueName: \"kubernetes.io/projected/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-kube-api-access-qz4cp\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.604819 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/766663a7-2c04-43da-a76f-dfacc5b1583a-system-cni-dir\") pod \"multus-additional-cni-plugins-ckfz2\" (UID: \"766663a7-2c04-43da-a76f-dfacc5b1583a\") " pod="openshift-multus/multus-additional-cni-plugins-ckfz2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.604868 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-run-ovn\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.604908 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-ovnkube-script-lib\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.604955 5184 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/542903c2-fc88-4085-979a-db3766958392-cnibin\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.604998 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8ljt6\" (UniqueName: \"kubernetes.io/projected/7b45c859-3d05-4214-9bd3-2952546f5dea-kube-api-access-8ljt6\") pod \"machine-config-daemon-cp7pt\" (UID: \"7b45c859-3d05-4214-9bd3-2952546f5dea\") " pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.605041 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/766663a7-2c04-43da-a76f-dfacc5b1583a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ckfz2\" (UID: \"766663a7-2c04-43da-a76f-dfacc5b1583a\") " pod="openshift-multus/multus-additional-cni-plugins-ckfz2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.605093 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/417740d6-e9c9-4fa8-9811-c6704b5b5692-env-overrides\") pod \"ovnkube-control-plane-57b78d8988-wqfhs\" (UID: \"417740d6-e9c9-4fa8-9811-c6704b5b5692\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-wqfhs" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.605136 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bwc2f\" (UniqueName: \"kubernetes.io/projected/024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df-kube-api-access-bwc2f\") pod \"network-metrics-daemon-vxc4c\" (UID: \"024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df\") " pod="openshift-multus/network-metrics-daemon-vxc4c" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.605214 5184 
status_manager.go:919] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f335ad31-84ab-4bea-b0f2-75eca434a55d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"300m\\\",\\\"memory\\\":\\\"600Mi\\\"},\\\"containerID\\\":\\\"cri-o://7901646b9904fb6a100644a0aacd978a71373a764eea536a29abd51530037c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"300m\\\",\\\"memory\\\":\\\"600Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:02Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/stati
c-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"40m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://850d2c82f02982ba13abc9b9365f5be589329d37001cd14054004a85c6d2e96a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"40m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:03Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://ad74805d602978f7f836e80d83b2ef81f7c4cbbc65155a2268057928abb2f906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\
",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:03Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://7696ac1bb100db5ea88c53ca29f38064d11d2600b968872de07c222ad6411720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:03Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://e7c8113246b3ae53a45615d0812c94a8ef10af18f75a421afdf0afa3ecb09223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:02Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://372fea65170de54e0c9231f60e0ff1ead89a30cc6fc9d1ab1eec694591f285d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372fea65170de54e0c9231f60e0ff1ead89a30cc6fc9d1ab1eec694591f285d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:50:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd-auto-backup\\\",\\\"name\\\":\\\"etcd-auto-backup-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://b51a266617c16a330f9314d4f763ccae3a4c157aecabbeec95199db504e6d95e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b51a266617c16a330f9314d4f763ccae3a4c157aecabbeec95199db504e6d95e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://b7d9173a97b6d597333bec66e74940ae9e9effd207401a61fc5c529983637156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi
\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d9173a97b6d597333bec66e74940ae9e9effd207401a61fc5c529983637156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:51:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:51:01Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.608581 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-ovnkube-config\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.609765 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/542903c2-fc88-4085-979a-db3766958392-host-run-netns\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc 
kubenswrapper[5184]: I0312 16:52:20.609806 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/542903c2-fc88-4085-979a-db3766958392-etc-kubernetes\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.609838 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w7jw8\" (UniqueName: \"kubernetes.io/projected/9c823004-cd7d-4cea-9cdb-b44a806264ab-kube-api-access-w7jw8\") pod \"node-ca-tnk2c\" (UID: \"9c823004-cd7d-4cea-9cdb-b44a806264ab\") " pod="openshift-image-registry/node-ca-tnk2c" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.610242 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/417740d6-e9c9-4fa8-9811-c6704b5b5692-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57b78d8988-wqfhs\" (UID: \"417740d6-e9c9-4fa8-9811-c6704b5b5692\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-wqfhs" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.610357 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-systemd-units\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.610413 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.610452 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-djfvr\" (UniqueName: \"kubernetes.io/projected/542903c2-fc88-4085-979a-db3766958392-kube-api-access-djfvr\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.610561 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-run-systemd\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.610611 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-host-run-netns\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.610870 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-host-cni-bin\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.610944 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k47hx\" (UniqueName: \"kubernetes.io/projected/c1239377-fc5d-40f2-b262-0b9c9448a3cf-kube-api-access-k47hx\") pod \"node-resolver-ggxxl\" (UID: \"c1239377-fc5d-40f2-b262-0b9c9448a3cf\") " pod="openshift-dns/node-resolver-ggxxl" Mar 12 16:52:20 crc kubenswrapper[5184]: 
I0312 16:52:20.610982 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-var-lib-openvswitch\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.611027 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-host-cni-netd\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.611067 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-env-overrides\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.611114 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/542903c2-fc88-4085-979a-db3766958392-os-release\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.611227 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/766663a7-2c04-43da-a76f-dfacc5b1583a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ckfz2\" (UID: \"766663a7-2c04-43da-a76f-dfacc5b1583a\") " pod="openshift-multus/multus-additional-cni-plugins-ckfz2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.611273 5184 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/34177974-8d82-49d2-a763-391d0df3bbd8-host-etc-kube\") pod \"network-operator-7bdcf4f5bd-7fjxv\" (UID: \"34177974-8d82-49d2-a763-391d0df3bbd8\") " pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.611299 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-node-log\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.611325 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-log-socket\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.611351 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/542903c2-fc88-4085-979a-db3766958392-host-var-lib-kubelet\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.611403 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/428b39f5-eb1c-4f65-b7a4-eeb6e84860cc-host-slash\") pod \"iptables-alerter-5jnd7\" (UID: \"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\") " pod="openshift-network-operator/iptables-alerter-5jnd7" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.611444 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-host-run-ovn-kubernetes\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.611483 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-ovn-node-metrics-cert\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.611690 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-run-ovn\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.611781 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-trmc6\" (UniqueName: \"kubernetes.io/projected/766663a7-2c04-43da-a76f-dfacc5b1583a-kube-api-access-trmc6\") pod \"multus-additional-cni-plugins-ckfz2\" (UID: \"766663a7-2c04-43da-a76f-dfacc5b1583a\") " pod="openshift-multus/multus-additional-cni-plugins-ckfz2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.611979 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c823004-cd7d-4cea-9cdb-b44a806264ab-host\") pod \"node-ca-tnk2c\" (UID: \"9c823004-cd7d-4cea-9cdb-b44a806264ab\") " pod="openshift-image-registry/node-ca-tnk2c" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.611432 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/766663a7-2c04-43da-a76f-dfacc5b1583a-system-cni-dir\") pod \"multus-additional-cni-plugins-ckfz2\" (UID: \"766663a7-2c04-43da-a76f-dfacc5b1583a\") " pod="openshift-multus/multus-additional-cni-plugins-ckfz2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.612135 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/542903c2-fc88-4085-979a-db3766958392-cni-binary-copy\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.612214 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/542903c2-fc88-4085-979a-db3766958392-host-run-k8s-cni-cncf-io\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.612316 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9c823004-cd7d-4cea-9cdb-b44a806264ab-serviceca\") pod \"node-ca-tnk2c\" (UID: \"9c823004-cd7d-4cea-9cdb-b44a806264ab\") " pod="openshift-image-registry/node-ca-tnk2c" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.612477 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c1239377-fc5d-40f2-b262-0b9c9448a3cf-hosts-file\") pod \"node-resolver-ggxxl\" (UID: \"c1239377-fc5d-40f2-b262-0b9c9448a3cf\") " pod="openshift-dns/node-resolver-ggxxl" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.612533 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/766663a7-2c04-43da-a76f-dfacc5b1583a-cni-binary-copy\") pod 
\"multus-additional-cni-plugins-ckfz2\" (UID: \"766663a7-2c04-43da-a76f-dfacc5b1583a\") " pod="openshift-multus/multus-additional-cni-plugins-ckfz2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.612607 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-host-kubelet\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.612645 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-run-openvswitch\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.612674 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7b45c859-3d05-4214-9bd3-2952546f5dea-mcd-auth-proxy-config\") pod \"machine-config-daemon-cp7pt\" (UID: \"7b45c859-3d05-4214-9bd3-2952546f5dea\") " pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.612709 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/542903c2-fc88-4085-979a-db3766958392-host-var-lib-cni-multus\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.612735 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/542903c2-fc88-4085-979a-db3766958392-hostroot\") pod \"multus-99gtj\" (UID: 
\"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.612775 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/542903c2-fc88-4085-979a-db3766958392-multus-conf-dir\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.612802 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/542903c2-fc88-4085-979a-db3766958392-host-run-multus-certs\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.612832 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b45c859-3d05-4214-9bd3-2952546f5dea-proxy-tls\") pod \"machine-config-daemon-cp7pt\" (UID: \"7b45c859-3d05-4214-9bd3-2952546f5dea\") " pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.612858 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/766663a7-2c04-43da-a76f-dfacc5b1583a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ckfz2\" (UID: \"766663a7-2c04-43da-a76f-dfacc5b1583a\") " pod="openshift-multus/multus-additional-cni-plugins-ckfz2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.612899 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/417740d6-e9c9-4fa8-9811-c6704b5b5692-ovnkube-config\") pod \"ovnkube-control-plane-57b78d8988-wqfhs\" (UID: \"417740d6-e9c9-4fa8-9811-c6704b5b5692\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-wqfhs" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.612930 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wf2p5\" (UniqueName: \"kubernetes.io/projected/417740d6-e9c9-4fa8-9811-c6704b5b5692-kube-api-access-wf2p5\") pod \"ovnkube-control-plane-57b78d8988-wqfhs\" (UID: \"417740d6-e9c9-4fa8-9811-c6704b5b5692\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-wqfhs" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.612977 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/542903c2-fc88-4085-979a-db3766958392-host-var-lib-cni-bin\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.613017 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/542903c2-fc88-4085-979a-db3766958392-multus-daemon-config\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.613069 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c1239377-fc5d-40f2-b262-0b9c9448a3cf-tmp-dir\") pod \"node-resolver-ggxxl\" (UID: \"c1239377-fc5d-40f2-b262-0b9c9448a3cf\") " pod="openshift-dns/node-resolver-ggxxl" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.613097 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df-metrics-certs\") pod \"network-metrics-daemon-vxc4c\" (UID: \"024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df\") " 
pod="openshift-multus/network-metrics-daemon-vxc4c" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.613122 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-host-slash\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.613151 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-etc-openvswitch\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.613181 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/542903c2-fc88-4085-979a-db3766958392-multus-socket-dir-parent\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.613210 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/766663a7-2c04-43da-a76f-dfacc5b1583a-os-release\") pod \"multus-additional-cni-plugins-ckfz2\" (UID: \"766663a7-2c04-43da-a76f-dfacc5b1583a\") " pod="openshift-multus/multus-additional-cni-plugins-ckfz2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.613307 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4hb7m\" (UniqueName: \"kubernetes.io/projected/94a6e063-3d1a-4d44-875d-185291448c31-kube-api-access-4hb7m\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.613331 5184 reconciler_common.go:299] "Volume detached for 
volume \"kube-api-access-6rmnv\" (UniqueName: \"kubernetes.io/projected/b605f283-6f2e-42da-a838-54421690f7d0-kube-api-access-6rmnv\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.613347 5184 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09cfa50b-4138-4585-a53e-64dd3ab73335-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.613348 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-ovnkube-script-lib\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.613360 5184 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/736c54fe-349c-4bb9-870a-d1c1d1c03831-tmp\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.613493 5184 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.613537 5184 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-config\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.613578 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m26jq\" (UniqueName: \"kubernetes.io/projected/567683bd-0efc-4f21-b076-e28559628404-kube-api-access-m26jq\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.613611 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"os-release\" (UniqueName: \"kubernetes.io/host-path/766663a7-2c04-43da-a76f-dfacc5b1583a-os-release\") pod \"multus-additional-cni-plugins-ckfz2\" (UID: \"766663a7-2c04-43da-a76f-dfacc5b1583a\") " pod="openshift-multus/multus-additional-cni-plugins-ckfz2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.613628 5184 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/584e1f4a-8205-47d7-8efb-3afc6017c4c9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.613652 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-ovnkube-config\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.613664 5184 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9f71a554-e414-4bc3-96d2-674060397afe-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.613698 5184 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/567683bd-0efc-4f21-b076-e28559628404-tmp-dir\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.613728 5184 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.613758 5184 reconciler_common.go:299] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 
16:52:20.613790 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ptkcf\" (UniqueName: \"kubernetes.io/projected/7599e0b6-bddf-4def-b7f2-0b32206e8651-kube-api-access-ptkcf\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.613821 5184 reconciler_common.go:299] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.613853 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7jjkz\" (UniqueName: \"kubernetes.io/projected/301e1965-1754-483d-b6cc-bfae7038bbca-kube-api-access-7jjkz\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.613884 5184 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/567683bd-0efc-4f21-b076-e28559628404-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.613916 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xxfcv\" (UniqueName: \"kubernetes.io/projected/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-kube-api-access-xxfcv\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.613948 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ddlk9\" (UniqueName: \"kubernetes.io/projected/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-kube-api-access-ddlk9\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.613980 5184 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.613995 5184 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/542903c2-fc88-4085-979a-db3766958392-hostroot\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.614013 5184 reconciler_common.go:299] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7df94c10-441d-4386-93a6-6730fb7bcde0-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.614046 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z5rsr\" (UniqueName: \"kubernetes.io/projected/af33e427-6803-48c2-a76a-dd9deb7cbf9a-kube-api-access-z5rsr\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.614080 5184 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.614117 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l9stx\" (UniqueName: \"kubernetes.io/projected/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-kube-api-access-l9stx\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.614147 5184 reconciler_common.go:299] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/736c54fe-349c-4bb9-870a-d1c1d1c03831-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.614176 5184 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7afa918d-be67-40a6-803c-d3b0ae99d815-config\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.614207 5184 reconciler_common.go:299] "Volume detached for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.614235 5184 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f65c0ac1-8bca-454d-a2e6-e35cb418beac-tmp-dir\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.614264 5184 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.614295 5184 reconciler_common.go:299] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.614327 5184 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.614366 5184 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.614439 5184 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9e9b5059-1b3e-4067-a63d-2952cbe863af-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.614474 5184 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/01080b46-74f1-4191-8755-5152a57b3b25-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.614504 5184 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-tmp\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.614534 5184 reconciler_common.go:299] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/301e1965-1754-483d-b6cc-bfae7038bbca-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.614570 5184 reconciler_common.go:299] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.614599 5184 reconciler_common.go:299] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0dd0fbac-8c0d-4228-8faa-abbeedabf7db-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.614634 5184 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.614667 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zsb9b\" (UniqueName: \"kubernetes.io/projected/09cfa50b-4138-4585-a53e-64dd3ab73335-kube-api-access-zsb9b\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.614700 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ks6v2\" (UniqueName: \"kubernetes.io/projected/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-kube-api-access-ks6v2\") on node \"crc\" 
DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.614731 5184 reconciler_common.go:299] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.614763 5184 reconciler_common.go:299] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d565531a-ff86-4608-9d19-767de01ac31b-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.614793 5184 reconciler_common.go:299] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.614824 5184 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-config\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.614852 5184 reconciler_common.go:299] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.614885 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ftwb6\" (UniqueName: \"kubernetes.io/projected/9f71a554-e414-4bc3-96d2-674060397afe-kube-api-access-ftwb6\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.614916 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d7e8f42f-dc0e-424b-bb56-5ec849834888-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.633844 5184 reconciler_common.go:299] "Volume detached for 
volume \"kube-api-access-qgrkj\" (UniqueName: \"kubernetes.io/projected/42a11a02-47e1-488f-b270-2679d3298b0e-kube-api-access-qgrkj\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.633891 5184 reconciler_common.go:299] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b4750666-1362-4001-abd0-6f89964cc621-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.633907 5184 reconciler_common.go:299] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.633962 5184 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c491984c-7d4b-44aa-8c1e-d7974424fa47-config\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.633984 5184 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.634000 5184 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.634017 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8pskd\" (UniqueName: \"kubernetes.io/projected/a555ff2e-0be6-46d5-897d-863bb92ae2b3-kube-api-access-8pskd\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.634030 5184 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d7e8f42f-dc0e-424b-bb56-5ec849834888-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.634041 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f65c0ac1-8bca-454d-a2e6-e35cb418beac-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.634055 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d7cps\" (UniqueName: \"kubernetes.io/projected/af41de71-79cf-4590-bbe9-9e8b848862cb-kube-api-access-d7cps\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.634067 5184 reconciler_common.go:299] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.634066 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/766663a7-2c04-43da-a76f-dfacc5b1583a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ckfz2\" (UID: \"766663a7-2c04-43da-a76f-dfacc5b1583a\") " pod="openshift-multus/multus-additional-cni-plugins-ckfz2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.614445 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c823004-cd7d-4cea-9cdb-b44a806264ab-host\") pod \"node-ca-tnk2c\" (UID: \"9c823004-cd7d-4cea-9cdb-b44a806264ab\") " pod="openshift-image-registry/node-ca-tnk2c" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.634078 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q4smf\" (UniqueName: \"kubernetes.io/projected/0dd0fbac-8c0d-4228-8faa-abbeedabf7db-kube-api-access-q4smf\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 
crc kubenswrapper[5184]: I0312 16:52:20.634120 5184 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/149b3c48-e17c-4a66-a835-d86dabf6ff13-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.634135 5184 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.634151 5184 reconciler_common.go:299] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/42a11a02-47e1-488f-b270-2679d3298b0e-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.633466 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/542903c2-fc88-4085-979a-db3766958392-multus-socket-dir-parent\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.634173 5184 reconciler_common.go:299] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.629803 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/542903c2-fc88-4085-979a-db3766958392-host-run-k8s-cni-cncf-io\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.629916 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/542903c2-fc88-4085-979a-db3766958392-cnibin\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.629976 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/542903c2-fc88-4085-979a-db3766958392-multus-cni-dir\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.630011 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7b45c859-3d05-4214-9bd3-2952546f5dea-rootfs\") pod \"machine-config-daemon-cp7pt\" (UID: \"7b45c859-3d05-4214-9bd3-2952546f5dea\") " pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.630302 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/766663a7-2c04-43da-a76f-dfacc5b1583a-cnibin\") pod \"multus-additional-cni-plugins-ckfz2\" (UID: \"766663a7-2c04-43da-a76f-dfacc5b1583a\") " pod="openshift-multus/multus-additional-cni-plugins-ckfz2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.630737 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/542903c2-fc88-4085-979a-db3766958392-system-cni-dir\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.631041 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9c823004-cd7d-4cea-9cdb-b44a806264ab-serviceca\") pod \"node-ca-tnk2c\" (UID: 
\"9c823004-cd7d-4cea-9cdb-b44a806264ab\") " pod="openshift-image-registry/node-ca-tnk2c" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.631103 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c1239377-fc5d-40f2-b262-0b9c9448a3cf-hosts-file\") pod \"node-resolver-ggxxl\" (UID: \"c1239377-fc5d-40f2-b262-0b9c9448a3cf\") " pod="openshift-dns/node-resolver-ggxxl" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.601577 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7e2c886-118e-43bb-bef1-c78134de392b-kube-api-access-6g4lr" (OuterVolumeSpecName: "kube-api-access-6g4lr") pod "f7e2c886-118e-43bb-bef1-c78134de392b" (UID: "f7e2c886-118e-43bb-bef1-c78134de392b"). InnerVolumeSpecName "kube-api-access-6g4lr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.601925 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.601949 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7df94c10-441d-4386-93a6-6730fb7bcde0-kube-api-access-nmmzf" (OuterVolumeSpecName: "kube-api-access-nmmzf") pod "7df94c10-441d-4386-93a6-6730fb7bcde0" (UID: "7df94c10-441d-4386-93a6-6730fb7bcde0"). InnerVolumeSpecName "kube-api-access-nmmzf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.602242 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff" (UID: "9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.602552 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b605f283-6f2e-42da-a838-54421690f7d0-utilities" (OuterVolumeSpecName: "utilities") pod "b605f283-6f2e-42da-a838-54421690f7d0" (UID: "b605f283-6f2e-42da-a838-54421690f7d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.602772 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31fa8943-81cc-4750-a0b7-0fa9ab5af883-utilities" (OuterVolumeSpecName: "utilities") pod "31fa8943-81cc-4750-a0b7-0fa9ab5af883" (UID: "31fa8943-81cc-4750-a0b7-0fa9ab5af883"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.602781 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "f559dfa3-3917-43a2-97f6-61ddfda10e93" (UID: "f559dfa3-3917-43a2-97f6-61ddfda10e93"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.602936 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18f80adb-c1c3-49ba-8ee4-932c851d3897-kube-api-access-wbmqg" (OuterVolumeSpecName: "kube-api-access-wbmqg") pod "18f80adb-c1c3-49ba-8ee4-932c851d3897" (UID: "18f80adb-c1c3-49ba-8ee4-932c851d3897"). InnerVolumeSpecName "kube-api-access-wbmqg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.603155 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92dfbade-90b6-4169-8c07-72cff7f2c82b-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "92dfbade-90b6-4169-8c07-72cff7f2c82b" (UID: "92dfbade-90b6-4169-8c07-72cff7f2c82b"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.603163 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b605f283-6f2e-42da-a838-54421690f7d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b605f283-6f2e-42da-a838-54421690f7d0" (UID: "b605f283-6f2e-42da-a838-54421690f7d0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.603302 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2325ffef-9d5b-447f-b00e-3efc429acefe-config" (OuterVolumeSpecName: "config") pod "2325ffef-9d5b-447f-b00e-3efc429acefe" (UID: "2325ffef-9d5b-447f-b00e-3efc429acefe"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.603321 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "18f80adb-c1c3-49ba-8ee4-932c851d3897" (UID: "18f80adb-c1c3-49ba-8ee4-932c851d3897"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.603671 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ebfebf6-3ecd-458e-943f-bb25b52e2718-kube-api-access-l87hs" (OuterVolumeSpecName: "kube-api-access-l87hs") pod "5ebfebf6-3ecd-458e-943f-bb25b52e2718" (UID: "5ebfebf6-3ecd-458e-943f-bb25b52e2718"). InnerVolumeSpecName "kube-api-access-l87hs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.604106 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94a6e063-3d1a-4d44-875d-185291448c31-utilities" (OuterVolumeSpecName: "utilities") pod "94a6e063-3d1a-4d44-875d-185291448c31" (UID: "94a6e063-3d1a-4d44-875d-185291448c31"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.604469 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-kube-api-access-5lcfw" (OuterVolumeSpecName: "kube-api-access-5lcfw") pod "dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" (UID: "dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9"). InnerVolumeSpecName "kube-api-access-5lcfw". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.604982 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" (UID: "dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.605258 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce090a97-9ab6-4c40-a719-64ff2acd9778-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "ce090a97-9ab6-4c40-a719-64ff2acd9778" (UID: "ce090a97-9ab6-4c40-a719-64ff2acd9778"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.605628 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ebfebf6-3ecd-458e-943f-bb25b52e2718-serviceca" (OuterVolumeSpecName: "serviceca") pod "5ebfebf6-3ecd-458e-943f-bb25b52e2718" (UID: "5ebfebf6-3ecd-458e-943f-bb25b52e2718"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.605724 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" (UID: "dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.605889 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" (UID: "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.606682 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a208c9c2-333b-4b4a-be0d-bc32ec38a821-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "a208c9c2-333b-4b4a-be0d-bc32ec38a821" (UID: "a208c9c2-333b-4b4a-be0d-bc32ec38a821"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.606760 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7afa918d-be67-40a6-803c-d3b0ae99d815-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7afa918d-be67-40a6-803c-d3b0ae99d815" (UID: "7afa918d-be67-40a6-803c-d3b0ae99d815"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.607397 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01080b46-74f1-4191-8755-5152a57b3b25-config" (OuterVolumeSpecName: "config") pod "01080b46-74f1-4191-8755-5152a57b3b25" (UID: "01080b46-74f1-4191-8755-5152a57b3b25"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.607526 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/869851b9-7ffb-4af0-b166-1d8aa40a5f80-kube-api-access-mjwtd" (OuterVolumeSpecName: "kube-api-access-mjwtd") pod "869851b9-7ffb-4af0-b166-1d8aa40a5f80" (UID: "869851b9-7ffb-4af0-b166-1d8aa40a5f80"). InnerVolumeSpecName "kube-api-access-mjwtd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.609192 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "869851b9-7ffb-4af0-b166-1d8aa40a5f80" (UID: "869851b9-7ffb-4af0-b166-1d8aa40a5f80"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.609595 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7a88189-c967-4640-879e-27665747f20c-kube-api-access-8nspp" (OuterVolumeSpecName: "kube-api-access-8nspp") pod "a7a88189-c967-4640-879e-27665747f20c" (UID: "a7a88189-c967-4640-879e-27665747f20c"). InnerVolumeSpecName "kube-api-access-8nspp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.609607 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7a88189-c967-4640-879e-27665747f20c-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "a7a88189-c967-4640-879e-27665747f20c" (UID: "a7a88189-c967-4640-879e-27665747f20c"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.609794 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e093be35-bb62-4843-b2e8-094545761610-kube-api-access-pddnv" (OuterVolumeSpecName: "kube-api-access-pddnv") pod "e093be35-bb62-4843-b2e8-094545761610" (UID: "e093be35-bb62-4843-b2e8-094545761610"). InnerVolumeSpecName "kube-api-access-pddnv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.609801 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c491984c-7d4b-44aa-8c1e-d7974424fa47-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "c491984c-7d4b-44aa-8c1e-d7974424fa47" (UID: "c491984c-7d4b-44aa-8c1e-d7974424fa47"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.610081 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f65c0ac1-8bca-454d-a2e6-e35cb418beac-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f65c0ac1-8bca-454d-a2e6-e35cb418beac" (UID: "f65c0ac1-8bca-454d-a2e6-e35cb418beac"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.610165 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-tmp" (OuterVolumeSpecName: "tmp") pod "b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a" (UID: "b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a"). InnerVolumeSpecName "tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.610182 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4-kube-api-access-pgx6b" (OuterVolumeSpecName: "kube-api-access-pgx6b") pod "f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4" (UID: "f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4"). InnerVolumeSpecName "kube-api-access-pgx6b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.610373 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a" (UID: "b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.610575 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/593a3561-7760-45c5-8f91-5aaef7475d0f-kube-api-access-sbc2l" (OuterVolumeSpecName: "kube-api-access-sbc2l") pod "593a3561-7760-45c5-8f91-5aaef7475d0f" (UID: "593a3561-7760-45c5-8f91-5aaef7475d0f"). InnerVolumeSpecName "kube-api-access-sbc2l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.610807 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7e2c886-118e-43bb-bef1-c78134de392b-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "f7e2c886-118e-43bb-bef1-c78134de392b" (UID: "f7e2c886-118e-43bb-bef1-c78134de392b"). InnerVolumeSpecName "tmp-dir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.610911 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "fc8db2c7-859d-47b3-a900-2bd0c0b2973b" (UID: "fc8db2c7-859d-47b3-a900-2bd0c0b2973b"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.610981 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7599e0b6-bddf-4def-b7f2-0b32206e8651-config" (OuterVolumeSpecName: "config") pod "7599e0b6-bddf-4def-b7f2-0b32206e8651" (UID: "7599e0b6-bddf-4def-b7f2-0b32206e8651"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.610979 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7a88189-c967-4640-879e-27665747f20c-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "a7a88189-c967-4640-879e-27665747f20c" (UID: "a7a88189-c967-4640-879e-27665747f20c"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.610999 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d19cb085-0c5b-4810-b654-ce7923221d90" (UID: "d19cb085-0c5b-4810-b654-ce7923221d90"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.611139 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7afa918d-be67-40a6-803c-d3b0ae99d815-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7afa918d-be67-40a6-803c-d3b0ae99d815" (UID: "7afa918d-be67-40a6-803c-d3b0ae99d815"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.611328 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18f80adb-c1c3-49ba-8ee4-932c851d3897-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "18f80adb-c1c3-49ba-8ee4-932c851d3897" (UID: "18f80adb-c1c3-49ba-8ee4-932c851d3897"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.611527 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81e39f7b-62e4-4fc9-992a-6535ce127a02-kube-api-access-pllx6" (OuterVolumeSpecName: "kube-api-access-pllx6") pod "81e39f7b-62e4-4fc9-992a-6535ce127a02" (UID: "81e39f7b-62e4-4fc9-992a-6535ce127a02"). InnerVolumeSpecName "kube-api-access-pllx6". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.611588 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/567683bd-0efc-4f21-b076-e28559628404-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "567683bd-0efc-4f21-b076-e28559628404" (UID: "567683bd-0efc-4f21-b076-e28559628404"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.611731 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0effdbcf-dd7d-404d-9d48-77536d665a5d-kube-api-access-mfzkj" (OuterVolumeSpecName: "kube-api-access-mfzkj") pod "0effdbcf-dd7d-404d-9d48-77536d665a5d" (UID: "0effdbcf-dd7d-404d-9d48-77536d665a5d"). InnerVolumeSpecName "kube-api-access-mfzkj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.611701 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6077b63e-53a2-4f96-9d56-1ce0324e4913-kube-api-access-zth6t" (OuterVolumeSpecName: "kube-api-access-zth6t") pod "6077b63e-53a2-4f96-9d56-1ce0324e4913" (UID: "6077b63e-53a2-4f96-9d56-1ce0324e4913"). InnerVolumeSpecName "kube-api-access-zth6t". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.612254 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/736c54fe-349c-4bb9-870a-d1c1d1c03831-kube-api-access-6dmhf" (OuterVolumeSpecName: "kube-api-access-6dmhf") pod "736c54fe-349c-4bb9-870a-d1c1d1c03831" (UID: "736c54fe-349c-4bb9-870a-d1c1d1c03831"). InnerVolumeSpecName "kube-api-access-6dmhf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.612252 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7a88189-c967-4640-879e-27665747f20c-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "a7a88189-c967-4640-879e-27665747f20c" (UID: "a7a88189-c967-4640-879e-27665747f20c"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.612276 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "af33e427-6803-48c2-a76a-dd9deb7cbf9a" (UID: "af33e427-6803-48c2-a76a-dd9deb7cbf9a"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.612881 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81e39f7b-62e4-4fc9-992a-6535ce127a02-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "81e39f7b-62e4-4fc9-992a-6535ce127a02" (UID: "81e39f7b-62e4-4fc9-992a-6535ce127a02"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.612979 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81e39f7b-62e4-4fc9-992a-6535ce127a02-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "81e39f7b-62e4-4fc9-992a-6535ce127a02" (UID: "81e39f7b-62e4-4fc9-992a-6535ce127a02"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.612999 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-config" (OuterVolumeSpecName: "config") pod "a555ff2e-0be6-46d5-897d-863bb92ae2b3" (UID: "a555ff2e-0be6-46d5-897d-863bb92ae2b3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.613273 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f65c0ac1-8bca-454d-a2e6-e35cb418beac-config" (OuterVolumeSpecName: "config") pod "f65c0ac1-8bca-454d-a2e6-e35cb418beac" (UID: "f65c0ac1-8bca-454d-a2e6-e35cb418beac"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.613282 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2325ffef-9d5b-447f-b00e-3efc429acefe-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "2325ffef-9d5b-447f-b00e-3efc429acefe" (UID: "2325ffef-9d5b-447f-b00e-3efc429acefe"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.613502 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ee8fbd3-1f81-4666-96da-5afc70819f1a-kube-api-access-d4tqq" (OuterVolumeSpecName: "kube-api-access-d4tqq") pod "6ee8fbd3-1f81-4666-96da-5afc70819f1a" (UID: "6ee8fbd3-1f81-4666-96da-5afc70819f1a"). InnerVolumeSpecName "kube-api-access-d4tqq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.613533 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "567683bd-0efc-4f21-b076-e28559628404" (UID: "567683bd-0efc-4f21-b076-e28559628404"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.613697 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "d19cb085-0c5b-4810-b654-ce7923221d90" (UID: "d19cb085-0c5b-4810-b654-ce7923221d90"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.613724 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-whereabouts-flatfile-configmap" (OuterVolumeSpecName: "whereabouts-flatfile-configmap") pod "869851b9-7ffb-4af0-b166-1d8aa40a5f80" (UID: "869851b9-7ffb-4af0-b166-1d8aa40a5f80"). InnerVolumeSpecName "whereabouts-flatfile-configmap". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.613860 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc85e424-18b2-4924-920b-bd291a8c4b01-utilities" (OuterVolumeSpecName: "utilities") pod "cc85e424-18b2-4924-920b-bd291a8c4b01" (UID: "cc85e424-18b2-4924-920b-bd291a8c4b01"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.613948 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6077b63e-53a2-4f96-9d56-1ce0324e4913-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "6077b63e-53a2-4f96-9d56-1ce0324e4913" (UID: "6077b63e-53a2-4f96-9d56-1ce0324e4913"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.613956 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "d19cb085-0c5b-4810-b654-ce7923221d90" (UID: "d19cb085-0c5b-4810-b654-ce7923221d90"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.613997 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc85e424-18b2-4924-920b-bd291a8c4b01-kube-api-access-xfp5s" (OuterVolumeSpecName: "kube-api-access-xfp5s") pod "cc85e424-18b2-4924-920b-bd291a8c4b01" (UID: "cc85e424-18b2-4924-920b-bd291a8c4b01"). InnerVolumeSpecName "kube-api-access-xfp5s". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.614187 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "af33e427-6803-48c2-a76a-dd9deb7cbf9a" (UID: "af33e427-6803-48c2-a76a-dd9deb7cbf9a"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.614220 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce090a97-9ab6-4c40-a719-64ff2acd9778-signing-key" (OuterVolumeSpecName: "signing-key") pod "ce090a97-9ab6-4c40-a719-64ff2acd9778" (UID: "ce090a97-9ab6-4c40-a719-64ff2acd9778"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.614274 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.614506 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a555ff2e-0be6-46d5-897d-863bb92ae2b3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a555ff2e-0be6-46d5-897d-863bb92ae2b3" (UID: "a555ff2e-0be6-46d5-897d-863bb92ae2b3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.614515 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ee8fbd3-1f81-4666-96da-5afc70819f1a-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "6ee8fbd3-1f81-4666-96da-5afc70819f1a" (UID: "6ee8fbd3-1f81-4666-96da-5afc70819f1a"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.630365 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-kube-api-access-hckvg" (OuterVolumeSpecName: "kube-api-access-hckvg") pod "fc8db2c7-859d-47b3-a900-2bd0c0b2973b" (UID: "fc8db2c7-859d-47b3-a900-2bd0c0b2973b"). InnerVolumeSpecName "kube-api-access-hckvg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.630649 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7599e0b6-bddf-4def-b7f2-0b32206e8651-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7599e0b6-bddf-4def-b7f2-0b32206e8651" (UID: "7599e0b6-bddf-4def-b7f2-0b32206e8651"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.630660 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-kube-api-access-tkdh6" (OuterVolumeSpecName: "kube-api-access-tkdh6") pod "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" (UID: "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e"). InnerVolumeSpecName "kube-api-access-tkdh6". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.630898 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4750666-1362-4001-abd0-6f89964cc621-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b4750666-1362-4001-abd0-6f89964cc621" (UID: "b4750666-1362-4001-abd0-6f89964cc621"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.630998 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.631032 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6edfcf45-925b-4eff-b940-95b6fc0b85d4-kube-api-access-8nb9c" (OuterVolumeSpecName: "kube-api-access-8nb9c") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "kube-api-access-8nb9c". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.631175 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2325ffef-9d5b-447f-b00e-3efc429acefe-kube-api-access-zg8nc" (OuterVolumeSpecName: "kube-api-access-zg8nc") pod "2325ffef-9d5b-447f-b00e-3efc429acefe" (UID: "2325ffef-9d5b-447f-b00e-3efc429acefe"). InnerVolumeSpecName "kube-api-access-zg8nc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.631413 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-audit" (OuterVolumeSpecName: "audit") pod "d19cb085-0c5b-4810-b654-ce7923221d90" (UID: "d19cb085-0c5b-4810-b654-ce7923221d90"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.631795 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e1d2a42d-af1d-4054-9618-ab545e0ed8b7" (UID: "e1d2a42d-af1d-4054-9618-ab545e0ed8b7"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.631695 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6077b63e-53a2-4f96-9d56-1ce0324e4913-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "6077b63e-53a2-4f96-9d56-1ce0324e4913" (UID: "6077b63e-53a2-4f96-9d56-1ce0324e4913"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.614050 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/542903c2-fc88-4085-979a-db3766958392-host-run-multus-certs\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj"
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.633360 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-host-slash\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2"
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.634817 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c1239377-fc5d-40f2-b262-0b9c9448a3cf-tmp-dir\") pod \"node-resolver-ggxxl\" (UID: \"c1239377-fc5d-40f2-b262-0b9c9448a3cf\") " pod="openshift-dns/node-resolver-ggxxl"
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.634827 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d565531a-ff86-4608-9d19-767de01ac31b-images" (OuterVolumeSpecName: "images") pod "d565531a-ff86-4608-9d19-767de01ac31b" (UID: "d565531a-ff86-4608-9d19-767de01ac31b"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.634823 5184 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted-pem\" (UniqueName: \"kubernetes.io/empty-dir/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-ca-trust-extracted-pem\") on node \"crc\" DevicePath \"\""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.634282 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/766663a7-2c04-43da-a76f-dfacc5b1583a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ckfz2\" (UID: \"766663a7-2c04-43da-a76f-dfacc5b1583a\") " pod="openshift-multus/multus-additional-cni-plugins-ckfz2"
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.634918 5184 reconciler_common.go:299] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-audit-policies\") on node \"crc\" DevicePath \"\""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.634933 5184 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-config\") on node \"crc\" DevicePath \"\""
Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.634939 5184 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.634983 5184 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d565531a-ff86-4608-9d19-767de01ac31b-proxy-tls\") on node \"crc\" DevicePath \"\""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.635007 5184 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/584e1f4a-8205-47d7-8efb-3afc6017c4c9-utilities\") on node \"crc\" DevicePath \"\""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.635022 5184 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/736c54fe-349c-4bb9-870a-d1c1d1c03831-config\") on node \"crc\" DevicePath \"\""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.635036 5184 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.635048 5184 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-tmp\") on node \"crc\" DevicePath \"\""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.635062 5184 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-config\") on node \"crc\" DevicePath \"\""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.635074 5184 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-service-ca\") on node \"crc\" DevicePath \"\""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.635088 5184 reconciler_common.go:299] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.635101 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-26xrl\" (UniqueName: \"kubernetes.io/projected/a208c9c2-333b-4b4a-be0d-bc32ec38a821-kube-api-access-26xrl\") on node \"crc\" DevicePath \"\""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.635116 5184 reconciler_common.go:299] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/593a3561-7760-45c5-8f91-5aaef7475d0f-certs\") on node \"crc\" DevicePath \"\""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.635128 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-twvbl\" (UniqueName: \"kubernetes.io/projected/b4750666-1362-4001-abd0-6f89964cc621-kube-api-access-twvbl\") on node \"crc\" DevicePath \"\""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.635142 5184 reconciler_common.go:299] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c491984c-7d4b-44aa-8c1e-d7974424fa47-images\") on node \"crc\" DevicePath \"\""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.635138 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" (UID: "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.635154 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4g8ts\" (UniqueName: \"kubernetes.io/projected/92dfbade-90b6-4169-8c07-72cff7f2c82b-kube-api-access-4g8ts\") on node \"crc\" DevicePath \"\""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.635173 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-grwfz\" (UniqueName: \"kubernetes.io/projected/31fa8943-81cc-4750-a0b7-0fa9ab5af883-kube-api-access-grwfz\") on node \"crc\" DevicePath \"\""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.635188 5184 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09cfa50b-4138-4585-a53e-64dd3ab73335-config\") on node \"crc\" DevicePath \"\""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.635200 5184 reconciler_common.go:299] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9f71a554-e414-4bc3-96d2-674060397afe-metrics-tls\") on node \"crc\" DevicePath \"\""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.635215 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tknt7\" (UniqueName: \"kubernetes.io/projected/584e1f4a-8205-47d7-8efb-3afc6017c4c9-kube-api-access-tknt7\") on node \"crc\" DevicePath \"\""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.635238 5184 reconciler_common.go:299] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-env-overrides\") on node \"crc\" DevicePath \"\""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.635253 5184 reconciler_common.go:299] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4-metrics-certs\") on node \"crc\" DevicePath \"\""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.635507 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-kube-api-access-qqbfk" (OuterVolumeSpecName: "kube-api-access-qqbfk") pod "b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a" (UID: "b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a"). InnerVolumeSpecName "kube-api-access-qqbfk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.635560 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/736c54fe-349c-4bb9-870a-d1c1d1c03831-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "736c54fe-349c-4bb9-870a-d1c1d1c03831" (UID: "736c54fe-349c-4bb9-870a-d1c1d1c03831"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.635580 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f559dfa3-3917-43a2-97f6-61ddfda10e93-kube-api-access-hm9x7" (OuterVolumeSpecName: "kube-api-access-hm9x7") pod "f559dfa3-3917-43a2-97f6-61ddfda10e93" (UID: "f559dfa3-3917-43a2-97f6-61ddfda10e93"). InnerVolumeSpecName "kube-api-access-hm9x7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.635737 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7e8f42f-dc0e-424b-bb56-5ec849834888-service-ca" (OuterVolumeSpecName: "service-ca") pod "d7e8f42f-dc0e-424b-bb56-5ec849834888" (UID: "d7e8f42f-dc0e-424b-bb56-5ec849834888"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.635842 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.636053 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce090a97-9ab6-4c40-a719-64ff2acd9778-kube-api-access-xnxbn" (OuterVolumeSpecName: "kube-api-access-xnxbn") pod "ce090a97-9ab6-4c40-a719-64ff2acd9778" (UID: "ce090a97-9ab6-4c40-a719-64ff2acd9778"). InnerVolumeSpecName "kube-api-access-xnxbn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.636060 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f71a554-e414-4bc3-96d2-674060397afe-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9f71a554-e414-4bc3-96d2-674060397afe" (UID: "9f71a554-e414-4bc3-96d2-674060397afe"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.636086 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9z4sw\" (UniqueName: \"kubernetes.io/projected/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-kube-api-access-9z4sw\") on node \"crc\" DevicePath \"\""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.636147 5184 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2325ffef-9d5b-447f-b00e-3efc429acefe-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.636515 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df-metrics-certs podName:024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df nodeName:}" failed. No retries permitted until 2026-03-12 16:52:21.136484897 +0000 UTC m=+83.677796256 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df-metrics-certs") pod "network-metrics-daemon-vxc4c" (UID: "024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.636797 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.637507 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16bdd140-dce1-464c-ab47-dd5798d1d256-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "16bdd140-dce1-464c-ab47-dd5798d1d256" (UID: "16bdd140-dce1-464c-ab47-dd5798d1d256"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.638581 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "d19cb085-0c5b-4810-b654-ce7923221d90" (UID: "d19cb085-0c5b-4810-b654-ce7923221d90"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.638601 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a555ff2e-0be6-46d5-897d-863bb92ae2b3" (UID: "a555ff2e-0be6-46d5-897d-863bb92ae2b3"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.638959 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a555ff2e-0be6-46d5-897d-863bb92ae2b3-tmp" (OuterVolumeSpecName: "tmp") pod "a555ff2e-0be6-46d5-897d-863bb92ae2b3" (UID: "a555ff2e-0be6-46d5-897d-863bb92ae2b3"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.639216 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.639235 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc85e424-18b2-4924-920b-bd291a8c4b01-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc85e424-18b2-4924-920b-bd291a8c4b01" (UID: "cc85e424-18b2-4924-920b-bd291a8c4b01"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.639361 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-config" (OuterVolumeSpecName: "console-config") pod "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" (UID: "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.639597 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d19cb085-0c5b-4810-b654-ce7923221d90-kube-api-access-m5lgh" (OuterVolumeSpecName: "kube-api-access-m5lgh") pod "d19cb085-0c5b-4810-b654-ce7923221d90" (UID: "d19cb085-0c5b-4810-b654-ce7923221d90"). InnerVolumeSpecName "kube-api-access-m5lgh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.639737 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.639952 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-config" (OuterVolumeSpecName: "config") pod "d45be74c-0d98-4d18-90e4-f7ef1b6daaf7" (UID: "d45be74c-0d98-4d18-90e4-f7ef1b6daaf7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.639968 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/593a3561-7760-45c5-8f91-5aaef7475d0f-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "593a3561-7760-45c5-8f91-5aaef7475d0f" (UID: "593a3561-7760-45c5-8f91-5aaef7475d0f"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.640750 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b45c859-3d05-4214-9bd3-2952546f5dea-proxy-tls\") pod \"machine-config-daemon-cp7pt\" (UID: \"7b45c859-3d05-4214-9bd3-2952546f5dea\") " pod="openshift-machine-config-operator/machine-config-daemon-cp7pt"
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.641234 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92dfbade-90b6-4169-8c07-72cff7f2c82b-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "92dfbade-90b6-4169-8c07-72cff7f2c82b" (UID: "92dfbade-90b6-4169-8c07-72cff7f2c82b"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.641408 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-host-kubelet\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2"
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.641269 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-kube-api-access-dztfv" (OuterVolumeSpecName: "kube-api-access-dztfv") pod "d45be74c-0d98-4d18-90e4-f7ef1b6daaf7" (UID: "d45be74c-0d98-4d18-90e4-f7ef1b6daaf7"). InnerVolumeSpecName "kube-api-access-dztfv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.641278 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01080b46-74f1-4191-8755-5152a57b3b25-kube-api-access-w94wk" (OuterVolumeSpecName: "kube-api-access-w94wk") pod "01080b46-74f1-4191-8755-5152a57b3b25" (UID: "01080b46-74f1-4191-8755-5152a57b3b25"). InnerVolumeSpecName "kube-api-access-w94wk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.641345 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f559dfa3-3917-43a2-97f6-61ddfda10e93" (UID: "f559dfa3-3917-43a2-97f6-61ddfda10e93"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.641518 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-run-openvswitch\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2"
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.641938 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/542903c2-fc88-4085-979a-db3766958392-multus-conf-dir\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj"
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.642025 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/301e1965-1754-483d-b6cc-bfae7038bbca-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "301e1965-1754-483d-b6cc-bfae7038bbca" (UID: "301e1965-1754-483d-b6cc-bfae7038bbca"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.642027 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a52afe44-fb37-46ed-a1f8-bf39727a3cbe-kube-api-access-rzt4w" (OuterVolumeSpecName: "kube-api-access-rzt4w") pod "a52afe44-fb37-46ed-a1f8-bf39727a3cbe" (UID: "a52afe44-fb37-46ed-a1f8-bf39727a3cbe"). InnerVolumeSpecName "kube-api-access-rzt4w". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.642109 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7afa918d-be67-40a6-803c-d3b0ae99d815-tmp" (OuterVolumeSpecName: "tmp") pod "7afa918d-be67-40a6-803c-d3b0ae99d815" (UID: "7afa918d-be67-40a6-803c-d3b0ae99d815"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.642136 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/542903c2-fc88-4085-979a-db3766958392-host-run-netns\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj"
Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.642138 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/301e1965-1754-483d-b6cc-bfae7038bbca-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "301e1965-1754-483d-b6cc-bfae7038bbca" (UID: "301e1965-1754-483d-b6cc-bfae7038bbca"). InnerVolumeSpecName "profile-collector-cert".
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.642197 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/542903c2-fc88-4085-979a-db3766958392-etc-kubernetes\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.642250 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7b45c859-3d05-4214-9bd3-2952546f5dea-mcd-auth-proxy-config\") pod \"machine-config-daemon-cp7pt\" (UID: \"7b45c859-3d05-4214-9bd3-2952546f5dea\") " pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.642287 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5f2bfad-70f6-4185-a3d9-81ce12720767-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c5f2bfad-70f6-4185-a3d9-81ce12720767" (UID: "c5f2bfad-70f6-4185-a3d9-81ce12720767"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.642359 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16bdd140-dce1-464c-ab47-dd5798d1d256-kube-api-access-94l9h" (OuterVolumeSpecName: "kube-api-access-94l9h") pod "16bdd140-dce1-464c-ab47-dd5798d1d256" (UID: "16bdd140-dce1-464c-ab47-dd5798d1d256"). InnerVolumeSpecName "kube-api-access-94l9h". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.642458 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/542903c2-fc88-4085-979a-db3766958392-host-var-lib-cni-bin\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.642552 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/542903c2-fc88-4085-979a-db3766958392-host-var-lib-cni-multus\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.642560 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c491984c-7d4b-44aa-8c1e-d7974424fa47-kube-api-access-9vsz9" (OuterVolumeSpecName: "kube-api-access-9vsz9") pod "c491984c-7d4b-44aa-8c1e-d7974424fa47" (UID: "c491984c-7d4b-44aa-8c1e-d7974424fa47"). InnerVolumeSpecName "kube-api-access-9vsz9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.642751 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7df94c10-441d-4386-93a6-6730fb7bcde0-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "7df94c10-441d-4386-93a6-6730fb7bcde0" (UID: "7df94c10-441d-4386-93a6-6730fb7bcde0"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.642794 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d565531a-ff86-4608-9d19-767de01ac31b-kube-api-access-99zj9" (OuterVolumeSpecName: "kube-api-access-99zj9") pod "d565531a-ff86-4608-9d19-767de01ac31b" (UID: "d565531a-ff86-4608-9d19-767de01ac31b"). InnerVolumeSpecName "kube-api-access-99zj9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.642937 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" (UID: "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.643118 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "f559dfa3-3917-43a2-97f6-61ddfda10e93" (UID: "f559dfa3-3917-43a2-97f6-61ddfda10e93"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.643353 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" (UID: "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.643484 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/542903c2-fc88-4085-979a-db3766958392-multus-daemon-config\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.643729 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/766663a7-2c04-43da-a76f-dfacc5b1583a-cni-binary-copy\") pod \"multus-additional-cni-plugins-ckfz2\" (UID: \"766663a7-2c04-43da-a76f-dfacc5b1583a\") " pod="openshift-multus/multus-additional-cni-plugins-ckfz2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.643944 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/542903c2-fc88-4085-979a-db3766958392-cni-binary-copy\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.641783 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" (UID: "dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.643997 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/417740d6-e9c9-4fa8-9811-c6704b5b5692-env-overrides\") pod \"ovnkube-control-plane-57b78d8988-wqfhs\" (UID: \"417740d6-e9c9-4fa8-9811-c6704b5b5692\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-wqfhs" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.644009 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e9b5059-1b3e-4067-a63d-2952cbe863af-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.644026 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "7fcc6409-8a0f-44c3-89e7-5aecd7610f8a" (UID: "7fcc6409-8a0f-44c3-89e7-5aecd7610f8a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.644055 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-systemd-units\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.635179 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-etc-openvswitch\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.644363 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" (UID: "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.644527 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7df94c10-441d-4386-93a6-6730fb7bcde0-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "7df94c10-441d-4386-93a6-6730fb7bcde0" (UID: "7df94c10-441d-4386-93a6-6730fb7bcde0"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.644594 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/417740d6-e9c9-4fa8-9811-c6704b5b5692-ovnkube-config\") pod \"ovnkube-control-plane-57b78d8988-wqfhs\" (UID: \"417740d6-e9c9-4fa8-9811-c6704b5b5692\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-wqfhs" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.644545 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e576e89-2381-4f76-a33a-bcf82fa79b03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://a528e6a13bd818b7d3bb0ef864934913eb0b6b9e7573f8f7840799a03c87c0b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88
602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://19f28bf51e41bc5de4afc2b3209eb8a889c06546b1d5a2e0ceaed8c52ee8867a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19f28bf51e41bc5de4afc2b3209eb8a889c06546b1d5a2e0ceaed8c52ee8867a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:50:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\
\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.644851 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-log-socket\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.644965 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-host-cni-bin\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.645007 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-host-cni-netd\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.645043 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-var-lib-openvswitch\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 
16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.643371 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.645514 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/542903c2-fc88-4085-979a-db3766958392-os-release\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.645605 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-run-systemd\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.644705 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" (UID: "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.645362 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "e1d2a42d-af1d-4054-9618-ab545e0ed8b7" (UID: "e1d2a42d-af1d-4054-9618-ab545e0ed8b7"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.645519 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a52afe44-fb37-46ed-a1f8-bf39727a3cbe-cert" (OuterVolumeSpecName: "cert") pod "a52afe44-fb37-46ed-a1f8-bf39727a3cbe" (UID: "a52afe44-fb37-46ed-a1f8-bf39727a3cbe"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.645628 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e9b5059-1b3e-4067-a63d-2952cbe863af-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.645754 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/428b39f5-eb1c-4f65-b7a4-eeb6e84860cc-host-slash\") pod \"iptables-alerter-5jnd7\" (UID: \"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\") " pod="openshift-network-operator/iptables-alerter-5jnd7" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.648316 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-node-log\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.645867 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-host-run-netns\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.648366 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/766663a7-2c04-43da-a76f-dfacc5b1583a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ckfz2\" (UID: \"766663a7-2c04-43da-a76f-dfacc5b1583a\") " pod="openshift-multus/multus-additional-cni-plugins-ckfz2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.648457 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-host-run-ovn-kubernetes\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.648493 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/149b3c48-e17c-4a66-a835-d86dabf6ff13-kube-api-access-wj4qr" (OuterVolumeSpecName: "kube-api-access-wj4qr") pod "149b3c48-e17c-4a66-a835-d86dabf6ff13" (UID: "149b3c48-e17c-4a66-a835-d86dabf6ff13"). InnerVolumeSpecName "kube-api-access-wj4qr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.645801 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/34177974-8d82-49d2-a763-391d0df3bbd8-host-etc-kube\") pod \"network-operator-7bdcf4f5bd-7fjxv\" (UID: \"34177974-8d82-49d2-a763-391d0df3bbd8\") " pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.648960 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16bdd140-dce1-464c-ab47-dd5798d1d256-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "16bdd140-dce1-464c-ab47-dd5798d1d256" (UID: "16bdd140-dce1-464c-ab47-dd5798d1d256"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.645784 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-env-overrides\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.650113 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a" (UID: "b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.651346 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.651440 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.651455 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.651473 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.651485 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:20Z","lastTransitionTime":"2026-03-12T16:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.651836 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/542903c2-fc88-4085-979a-db3766958392-host-var-lib-kubelet\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.652162 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwc2f\" (UniqueName: \"kubernetes.io/projected/024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df-kube-api-access-bwc2f\") pod \"network-metrics-daemon-vxc4c\" (UID: \"024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df\") " pod="openshift-multus/network-metrics-daemon-vxc4c" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.654038 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ljt6\" (UniqueName: \"kubernetes.io/projected/7b45c859-3d05-4214-9bd3-2952546f5dea-kube-api-access-8ljt6\") pod \"machine-config-daemon-cp7pt\" (UID: \"7b45c859-3d05-4214-9bd3-2952546f5dea\") " pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.654653 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-ovn-node-metrics-cert\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.656484 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/417740d6-e9c9-4fa8-9811-c6704b5b5692-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57b78d8988-wqfhs\" (UID: \"417740d6-e9c9-4fa8-9811-c6704b5b5692\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-wqfhs" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.658431 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-trmc6\" (UniqueName: \"kubernetes.io/projected/766663a7-2c04-43da-a76f-dfacc5b1583a-kube-api-access-trmc6\") pod \"multus-additional-cni-plugins-ckfz2\" (UID: \"766663a7-2c04-43da-a76f-dfacc5b1583a\") " pod="openshift-multus/multus-additional-cni-plugins-ckfz2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.659641 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5f2bfad-70f6-4185-a3d9-81ce12720767-config" (OuterVolumeSpecName: "config") pod "c5f2bfad-70f6-4185-a3d9-81ce12720767" (UID: "c5f2bfad-70f6-4185-a3d9-81ce12720767"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.659901 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7jw8\" (UniqueName: \"kubernetes.io/projected/9c823004-cd7d-4cea-9cdb-b44a806264ab-kube-api-access-w7jw8\") pod \"node-ca-tnk2c\" (UID: \"9c823004-cd7d-4cea-9cdb-b44a806264ab\") " pod="openshift-image-registry/node-ca-tnk2c" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.661532 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f863fff9-286a-45fa-b8f0-8a86994b8440\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7w75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-5bb8f5cd97-xdvz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.662586 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31fa8943-81cc-4750-a0b7-0fa9ab5af883-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "31fa8943-81cc-4750-a0b7-0fa9ab5af883" (UID: "31fa8943-81cc-4750-a0b7-0fa9ab5af883"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.663320 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf2p5\" (UniqueName: \"kubernetes.io/projected/417740d6-e9c9-4fa8-9811-c6704b5b5692-kube-api-access-wf2p5\") pod \"ovnkube-control-plane-57b78d8988-wqfhs\" (UID: \"417740d6-e9c9-4fa8-9811-c6704b5b5692\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-wqfhs" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.663456 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-kube-api-access-ws8zz" (OuterVolumeSpecName: "kube-api-access-ws8zz") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af"). InnerVolumeSpecName "kube-api-access-ws8zz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.663763 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5f2bfad-70f6-4185-a3d9-81ce12720767-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c5f2bfad-70f6-4185-a3d9-81ce12720767" (UID: "c5f2bfad-70f6-4185-a3d9-81ce12720767"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.664005 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5f2bfad-70f6-4185-a3d9-81ce12720767-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "c5f2bfad-70f6-4185-a3d9-81ce12720767" (UID: "c5f2bfad-70f6-4185-a3d9-81ce12720767"). InnerVolumeSpecName "tmp-dir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.665488 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92dfbade-90b6-4169-8c07-72cff7f2c82b-config-volume" (OuterVolumeSpecName: "config-volume") pod "92dfbade-90b6-4169-8c07-72cff7f2c82b" (UID: "92dfbade-90b6-4169-8c07-72cff7f2c82b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.665670 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-dgvkt" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.665942 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz4cp\" (UniqueName: \"kubernetes.io/projected/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-kube-api-access-qz4cp\") pod \"ovnkube-node-6bpj2\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.665948 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-djfvr\" (UniqueName: \"kubernetes.io/projected/542903c2-fc88-4085-979a-db3766958392-kube-api-access-djfvr\") pod \"multus-99gtj\" (UID: \"542903c2-fc88-4085-979a-db3766958392\") " pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.671897 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/149b3c48-e17c-4a66-a835-d86dabf6ff13-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "149b3c48-e17c-4a66-a835-d86dabf6ff13" (UID: "149b3c48-e17c-4a66-a835-d86dabf6ff13"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.673135 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-fhkjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17b87002-b798-480a-8e17-83053d698239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwt8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-fhkjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.673217 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k47hx\" (UniqueName: \"kubernetes.io/projected/c1239377-fc5d-40f2-b262-0b9c9448a3cf-kube-api-access-k47hx\") pod \"node-resolver-ggxxl\" (UID: \"c1239377-fc5d-40f2-b262-0b9c9448a3cf\") " pod="openshift-dns/node-resolver-ggxxl" Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.684750 5184 kuberuntime_manager.go:1358] "Unhandled Error" err=< Mar 12 16:52:20 crc kubenswrapper[5184]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122,Command:[/bin/bash -c set -xe Mar 12 16:52:20 crc kubenswrapper[5184]: if [[ -f "/env/_master" ]]; then Mar 12 16:52:20 crc kubenswrapper[5184]: set -o allexport Mar 12 16:52:20 crc kubenswrapper[5184]: source "/env/_master" Mar 12 16:52:20 crc kubenswrapper[5184]: set +o allexport Mar 12 16:52:20 crc kubenswrapper[5184]: fi Mar 12 16:52:20 crc kubenswrapper[5184]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid 
overlay is not enabled. Mar 12 16:52:20 crc kubenswrapper[5184]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 12 16:52:20 crc kubenswrapper[5184]: ho_enable="--enable-hybrid-overlay" Mar 12 16:52:20 crc kubenswrapper[5184]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 12 16:52:20 crc kubenswrapper[5184]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 12 16:52:20 crc kubenswrapper[5184]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 12 16:52:20 crc kubenswrapper[5184]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 12 16:52:20 crc kubenswrapper[5184]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 12 16:52:20 crc kubenswrapper[5184]: --webhook-host=127.0.0.1 \ Mar 12 16:52:20 crc kubenswrapper[5184]: --webhook-port=9743 \ Mar 12 16:52:20 crc kubenswrapper[5184]: ${ho_enable} \ Mar 12 16:52:20 crc kubenswrapper[5184]: --enable-interconnect \ Mar 12 16:52:20 crc kubenswrapper[5184]: --disable-approver \ Mar 12 16:52:20 crc kubenswrapper[5184]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 12 16:52:20 crc kubenswrapper[5184]: --wait-for-kubernetes-api=200s \ Mar 12 16:52:20 crc kubenswrapper[5184]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 12 16:52:20 crc kubenswrapper[5184]: --loglevel="${LOGLEVEL}" Mar 12 16:52:20 crc kubenswrapper[5184]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m 
DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8nt2j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000500000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-dgvkt_openshift-network-node-identity(fc4541ce-7789-4670-bc75-5c2868e52ce0): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 12 16:52:20 crc kubenswrapper[5184]: > logger="UnhandledError" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.687075 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ckfz2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"766663a7-2c04-43da-a76f-dfacc5b1583a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6efa070ceb93cc5fc2e76eab6d9c96ac3c4f8812085d0b6eb6e3f513b5bac782\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3454e762466e22e2a893650b9781823558bc6fdfda2aa4188aff3cb819014c4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/etc/whereabouts/config\\\",\\\"
name\\\":\\\"whereabouts-flatfile-configmap\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ckfz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.687344 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a" (UID: "71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.689028 5184 kuberuntime_manager.go:1358] "Unhandled Error" err=< Mar 12 16:52:20 crc kubenswrapper[5184]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122,Command:[/bin/bash -c set -xe Mar 12 16:52:20 crc kubenswrapper[5184]: if [[ -f "/env/_master" ]]; then Mar 12 16:52:20 crc kubenswrapper[5184]: set -o allexport Mar 12 16:52:20 crc kubenswrapper[5184]: source "/env/_master" Mar 12 16:52:20 crc kubenswrapper[5184]: set +o allexport Mar 12 16:52:20 crc kubenswrapper[5184]: fi Mar 12 16:52:20 crc kubenswrapper[5184]: Mar 12 16:52:20 crc kubenswrapper[5184]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 12 16:52:20 crc kubenswrapper[5184]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 12 16:52:20 crc kubenswrapper[5184]: --disable-webhook \ Mar 12 16:52:20 crc kubenswrapper[5184]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 12 16:52:20 crc kubenswrapper[5184]: --loglevel="${LOGLEVEL}" Mar 12 16:52:20 crc kubenswrapper[5184]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8nt2j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000500000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-dgvkt_openshift-network-node-identity(fc4541ce-7789-4670-bc75-5c2868e52ce0): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 12 16:52:20 crc kubenswrapper[5184]: > logger="UnhandledError" Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.690346 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-dgvkt" 
podUID="fc4541ce-7789-4670-bc75-5c2868e52ce0" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.694627 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxc4c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwc2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:49b34ce0d25eec7a6077f4bf21bf7d4e64e598d28785a20b9ee3594423b7de14\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwc2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxc4c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.694849 5184 operation_generator.go:781] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/94a6e063-3d1a-4d44-875d-185291448c31-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "94a6e063-3d1a-4d44-875d-185291448c31" (UID: "94a6e063-3d1a-4d44-875d-185291448c31"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.697750 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e9b5059-1b3e-4067-a63d-2952cbe863af-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.707364 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bpj2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.717041 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff799ef2-41aa-4972-ae8f-6e29c01bbd76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://d25459743a8e939a6fc0c89681dd4be8f2dbe697494adb6502228b2569ba616f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://d14cb7a12881751803c43b69b2ec33ce99548de0cb9d754e7de2f8fe301dabb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"nam
e\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://a1d4060b0813ec05d1dff25751605cd6ce575df8a1a0788b3331780009447967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://1b42bc08c18390a5c946037634004c7dcb6cb14b92f56220260d8237aeedd629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"
ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.718619 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-5jnd7" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.720866 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-dgvkt" event={"ID":"fc4541ce-7789-4670-bc75-5c2868e52ce0","Type":"ContainerStarted","Data":"306b5927aec932a0fcf2bf3a034f2a32b7e67852a9bd79e9f8d13c064943de55"} Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.722925 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-99gtj" Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.723970 5184 kuberuntime_manager.go:1358] "Unhandled Error" err=< Mar 12 16:52:20 crc kubenswrapper[5184]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122,Command:[/bin/bash -c set -xe Mar 12 16:52:20 crc kubenswrapper[5184]: if [[ -f "/env/_master" ]]; then Mar 12 16:52:20 crc kubenswrapper[5184]: set -o allexport Mar 12 16:52:20 crc kubenswrapper[5184]: source "/env/_master" Mar 12 16:52:20 crc kubenswrapper[5184]: set +o allexport Mar 12 16:52:20 crc kubenswrapper[5184]: fi Mar 12 16:52:20 crc kubenswrapper[5184]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 12 16:52:20 crc kubenswrapper[5184]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 12 16:52:20 crc kubenswrapper[5184]: ho_enable="--enable-hybrid-overlay" Mar 12 16:52:20 crc kubenswrapper[5184]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 12 16:52:20 crc kubenswrapper[5184]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 12 16:52:20 crc kubenswrapper[5184]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 12 16:52:20 crc kubenswrapper[5184]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 12 16:52:20 crc kubenswrapper[5184]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 12 16:52:20 crc kubenswrapper[5184]: --webhook-host=127.0.0.1 \ Mar 12 16:52:20 crc kubenswrapper[5184]: --webhook-port=9743 \ Mar 12 16:52:20 crc kubenswrapper[5184]: ${ho_enable} \ Mar 12 16:52:20 crc kubenswrapper[5184]: --enable-interconnect \ Mar 12 16:52:20 crc kubenswrapper[5184]: --disable-approver \ Mar 12 16:52:20 crc 
kubenswrapper[5184]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 12 16:52:20 crc kubenswrapper[5184]: --wait-for-kubernetes-api=200s \ Mar 12 16:52:20 crc kubenswrapper[5184]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 12 16:52:20 crc kubenswrapper[5184]: --loglevel="${LOGLEVEL}" Mar 12 16:52:20 crc kubenswrapper[5184]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8nt2j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000500000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,Stdi
nOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-dgvkt_openshift-network-node-identity(fc4541ce-7789-4670-bc75-5c2868e52ce0): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 12 16:52:20 crc kubenswrapper[5184]: > logger="UnhandledError" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.726947 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2beca127-92c3-4737-a680-69e0bf3936a5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://cb2f9a89582795adf9b1f2e114f29fbcca43ccc0ac07b56136f3c09a99af6c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d0
4fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://ca237a33c864a01251750a3a9498ffbf76c02d7c465fec64514b916084eb3a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://2bfb723f8c449cda9730d31e02d633c5bc26368677283970
a7d7977e8b14823c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:01Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://ac72b67adf21d403862dd2fc6e1a23c70ce40f83d6700823e7517d3ac39a3313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac72b67adf21d403862dd2fc6e1a23c70ce40f83d6700823e7517d3ac39a3313\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:50:59Z\\\",\\\"reason\\
\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.728982 5184 kuberuntime_manager.go:1358] "Unhandled Error" err=< Mar 12 16:52:20 crc kubenswrapper[5184]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122,Command:[/bin/bash -c set -xe Mar 12 16:52:20 crc kubenswrapper[5184]: if [[ -f "/env/_master" ]]; then Mar 12 16:52:20 crc kubenswrapper[5184]: set -o allexport Mar 12 16:52:20 crc kubenswrapper[5184]: source "/env/_master" Mar 12 16:52:20 crc kubenswrapper[5184]: set +o allexport Mar 12 16:52:20 crc kubenswrapper[5184]: fi Mar 12 16:52:20 crc kubenswrapper[5184]: Mar 12 16:52:20 crc kubenswrapper[5184]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 12 16:52:20 crc kubenswrapper[5184]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 12 16:52:20 crc kubenswrapper[5184]: --disable-webhook \ Mar 12 16:52:20 crc kubenswrapper[5184]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 12 16:52:20 crc kubenswrapper[5184]: --loglevel="${LOGLEVEL}" Mar 12 16:52:20 crc kubenswrapper[5184]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8nt2j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000500000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-dgvkt_openshift-network-node-identity(fc4541ce-7789-4670-bc75-5c2868e52ce0): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 12 16:52:20 crc kubenswrapper[5184]: > logger="UnhandledError" Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.730345 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to 
\"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-dgvkt" podUID="fc4541ce-7789-4670-bc75-5c2868e52ce0" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.732223 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-ggxxl" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.737211 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w94wk\" (UniqueName: \"kubernetes.io/projected/01080b46-74f1-4191-8755-5152a57b3b25-kube-api-access-w94wk\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.737249 5184 reconciler_common.go:299] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.737268 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-99zj9\" (UniqueName: \"kubernetes.io/projected/d565531a-ff86-4608-9d19-767de01ac31b-kube-api-access-99zj9\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.737288 5184 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4750666-1362-4001-abd0-6f89964cc621-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.737308 5184 reconciler_common.go:299] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c491984c-7d4b-44aa-8c1e-d7974424fa47-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.737324 5184 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f65c0ac1-8bca-454d-a2e6-e35cb418beac-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.737341 5184 reconciler_common.go:299] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6077b63e-53a2-4f96-9d56-1ce0324e4913-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.737357 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9vsz9\" (UniqueName: \"kubernetes.io/projected/c491984c-7d4b-44aa-8c1e-d7974424fa47-kube-api-access-9vsz9\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: W0312 16:52:20.737288 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod428b39f5_eb1c_4f65_b7a4_eeb6e84860cc.slice/crio-9da79ed3da4a1c43dacc4258e27beb6dd24e70d19634256c7e18f11d498cba10 WatchSource:0}: Error finding container 9da79ed3da4a1c43dacc4258e27beb6dd24e70d19634256c7e18f11d498cba10: Status 404 returned error can't find the container with id 9da79ed3da4a1c43dacc4258e27beb6dd24e70d19634256c7e18f11d498cba10 Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.737403 5184 reconciler_common.go:299] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-audit\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.737421 5184 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.737439 5184 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a52afe44-fb37-46ed-a1f8-bf39727a3cbe-cert\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 
16:52:20.737456 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rzt4w\" (UniqueName: \"kubernetes.io/projected/a52afe44-fb37-46ed-a1f8-bf39727a3cbe-kube-api-access-rzt4w\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.737473 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xnxbn\" (UniqueName: \"kubernetes.io/projected/ce090a97-9ab6-4c40-a719-64ff2acd9778-kube-api-access-xnxbn\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.737490 5184 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7afa918d-be67-40a6-803c-d3b0ae99d815-tmp\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.737506 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zg8nc\" (UniqueName: \"kubernetes.io/projected/2325ffef-9d5b-447f-b00e-3efc429acefe-kube-api-access-zg8nc\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.737523 5184 reconciler_common.go:299] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7df94c10-441d-4386-93a6-6730fb7bcde0-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.737543 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dztfv\" (UniqueName: \"kubernetes.io/projected/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-kube-api-access-dztfv\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.737567 5184 reconciler_common.go:299] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5ebfebf6-3ecd-458e-943f-bb25b52e2718-serviceca\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.737591 5184 reconciler_common.go:299] "Volume detached for volume \"proxy-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.737614 5184 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc85e424-18b2-4924-920b-bd291a8c4b01-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.737639 5184 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.737661 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tkdh6\" (UniqueName: \"kubernetes.io/projected/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-kube-api-access-tkdh6\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.737685 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hckvg\" (UniqueName: \"kubernetes.io/projected/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-kube-api-access-hckvg\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.737708 5184 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.737727 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6dmhf\" (UniqueName: \"kubernetes.io/projected/736c54fe-349c-4bb9-870a-d1c1d1c03831-kube-api-access-6dmhf\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.737746 5184 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f65c0ac1-8bca-454d-a2e6-e35cb418beac-config\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.737762 5184 reconciler_common.go:299] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/301e1965-1754-483d-b6cc-bfae7038bbca-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.737778 5184 reconciler_common.go:299] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.737794 5184 reconciler_common.go:299] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ce090a97-9ab6-4c40-a719-64ff2acd9778-signing-key\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.737811 5184 reconciler_common.go:299] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6ee8fbd3-1f81-4666-96da-5afc70819f1a-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.737827 5184 reconciler_common.go:299] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.737844 5184 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.737862 5184 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/92dfbade-90b6-4169-8c07-72cff7f2c82b-tmp-dir\") on node \"crc\" 
DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.737878 5184 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16bdd140-dce1-464c-ab47-dd5798d1d256-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.737897 5184 reconciler_common.go:299] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.737913 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m5lgh\" (UniqueName: \"kubernetes.io/projected/d19cb085-0c5b-4810-b654-ce7923221d90-kube-api-access-m5lgh\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.737930 5184 reconciler_common.go:299] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/593a3561-7760-45c5-8f91-5aaef7475d0f-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.737947 5184 reconciler_common.go:299] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/81e39f7b-62e4-4fc9-992a-6535ce127a02-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.737964 5184 reconciler_common.go:299] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.737981 5184 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5f2bfad-70f6-4185-a3d9-81ce12720767-config\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.737997 5184 reconciler_common.go:299] "Volume detached 
for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a7a88189-c967-4640-879e-27665747f20c-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738014 5184 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31fa8943-81cc-4750-a0b7-0fa9ab5af883-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738030 5184 reconciler_common.go:299] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a7a88189-c967-4640-879e-27665747f20c-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738047 5184 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9e9b5059-1b3e-4067-a63d-2952cbe863af-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738066 5184 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94a6e063-3d1a-4d44-875d-185291448c31-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738083 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mjwtd\" (UniqueName: \"kubernetes.io/projected/869851b9-7ffb-4af0-b166-1d8aa40a5f80-kube-api-access-mjwtd\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738100 5184 reconciler_common.go:299] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18f80adb-c1c3-49ba-8ee4-932c851d3897-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738115 5184 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-config\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738132 5184 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94a6e063-3d1a-4d44-875d-185291448c31-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738148 5184 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738167 5184 reconciler_common.go:299] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ce090a97-9ab6-4c40-a719-64ff2acd9778-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738184 5184 reconciler_common.go:299] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d565531a-ff86-4608-9d19-767de01ac31b-images\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738203 5184 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738220 5184 reconciler_common.go:299] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7df94c10-441d-4386-93a6-6730fb7bcde0-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738238 5184 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/736c54fe-349c-4bb9-870a-d1c1d1c03831-serving-cert\") on 
node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738255 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6g4lr\" (UniqueName: \"kubernetes.io/projected/f7e2c886-118e-43bb-bef1-c78134de392b-kube-api-access-6g4lr\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738272 5184 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738290 5184 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738306 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8nspp\" (UniqueName: \"kubernetes.io/projected/a7a88189-c967-4640-879e-27665747f20c-kube-api-access-8nspp\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738322 5184 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b605f283-6f2e-42da-a838-54421690f7d0-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738338 5184 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31fa8943-81cc-4750-a0b7-0fa9ab5af883-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738354 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zth6t\" (UniqueName: \"kubernetes.io/projected/6077b63e-53a2-4f96-9d56-1ce0324e4913-kube-api-access-zth6t\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738400 5184 
reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sbc2l\" (UniqueName: \"kubernetes.io/projected/593a3561-7760-45c5-8f91-5aaef7475d0f-kube-api-access-sbc2l\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738418 5184 reconciler_common.go:299] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738436 5184 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-config\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738452 5184 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7599e0b6-bddf-4def-b7f2-0b32206e8651-config\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738469 5184 reconciler_common.go:299] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738486 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5lcfw\" (UniqueName: \"kubernetes.io/projected/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-kube-api-access-5lcfw\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738502 5184 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/149b3c48-e17c-4a66-a835-d86dabf6ff13-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738550 5184 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/9e9b5059-1b3e-4067-a63d-2952cbe863af-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738568 5184 reconciler_common.go:299] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738584 5184 reconciler_common.go:299] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/567683bd-0efc-4f21-b076-e28559628404-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738600 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nmmzf\" (UniqueName: \"kubernetes.io/projected/7df94c10-441d-4386-93a6-6730fb7bcde0-kube-api-access-nmmzf\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738618 5184 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f7e2c886-118e-43bb-bef1-c78134de392b-tmp-dir\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738636 5184 reconciler_common.go:299] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a208c9c2-333b-4b4a-be0d-bc32ec38a821-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738655 5184 reconciler_common.go:299] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/301e1965-1754-483d-b6cc-bfae7038bbca-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738673 5184 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a555ff2e-0be6-46d5-897d-863bb92ae2b3-tmp\") on node \"crc\" 
DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738689 5184 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01080b46-74f1-4191-8755-5152a57b3b25-config\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738704 5184 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9f71a554-e414-4bc3-96d2-674060397afe-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738721 5184 reconciler_common.go:299] "Volume detached for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-whereabouts-flatfile-configmap\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738742 5184 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5f2bfad-70f6-4185-a3d9-81ce12720767-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738631 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"766f2ece-d155-473b-bc1e-ceca5d270675\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"},\\\"containerID\\\":\\\"cri-o://408cf1afe10e6c8bd0bdbe1cc632606b92ab152449ba7113c76692e36ac3f8e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-bundle-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\
"cri-o://ba6beb26fb249f80a1c0e7a6faa3577c82429ab6acd0c17cd141a795b06adba4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:01Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://c3a3261bd304f727996d3de0ec8e9372c0f24ee323171fc078f86a529dc3ae51\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mo
untPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://c5c91e816d332f9146ddc05817c56c9d67c53b64a57f352bf2e9af1b2fdb1ba4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5c91e816d332f9146ddc05817c56c9d67c53b64a57f352bf2e9af1b2fdb1ba4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T16:51:55Z\\\",\\\"message\\\":\\\"vvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InOrderInformers\\\\\\\" enabled=true\\\\nW0312 16:51:55.515939 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 16:51:55.516181 1 builder.go:304] check-endpoints version v0.0.0-unknown-c3d9642-c3d9642\\\\nI0312 16:51:55.517475 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361147659/tls.crt::/tmp/serving-cert-1361147659/tls.key\\\\\\\"\\\\nI0312 16:51:55.968978 1 requestheader_controller.go:255] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 16:51:55.972281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 16:51:55.972308 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 16:51:55.972347 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 16:51:55.972358 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" 
limit=200\\\\nI0312 16:51:55.979594 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0312 16:51:55.979642 1 genericapiserver.go:546] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0312 16:51:55.979645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 16:51:55.979674 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 16:51:55.979687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 16:51:55.979702 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 16:51:55.979708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 16:51:55.979715 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0312 16:51:55.981661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T16:51:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://965856e85472800697a7882409776407f3dcefaafd9ffc6d31ca6d51466d15f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:01Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://85ac5f92560a2e60d997c4973bd2fd54060e553b853a3288c29cc31c11cad328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85ac5f92560a2e60d997c4973bd2fd54060e553b853a3288c29cc31c11cad328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:50:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738759 5184 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738916 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-94l9h\" (UniqueName: \"kubernetes.io/projected/16bdd140-dce1-464c-ab47-dd5798d1d256-kube-api-access-94l9h\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738934 5184 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-catalog-content\") on node 
\"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738946 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5f2bfad-70f6-4185-a3d9-81ce12720767-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738960 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qqbfk\" (UniqueName: \"kubernetes.io/projected/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-kube-api-access-qqbfk\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738972 5184 reconciler_common.go:299] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738983 5184 reconciler_common.go:299] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/92dfbade-90b6-4169-8c07-72cff7f2c82b-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.738995 5184 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b605f283-6f2e-42da-a838-54421690f7d0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739006 5184 reconciler_common.go:299] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739019 5184 reconciler_common.go:299] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739031 5184 
reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2325ffef-9d5b-447f-b00e-3efc429acefe-config\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739042 5184 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739053 5184 reconciler_common.go:299] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739064 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8nb9c\" (UniqueName: \"kubernetes.io/projected/6edfcf45-925b-4eff-b940-95b6fc0b85d4-kube-api-access-8nb9c\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739077 5184 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739090 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mfzkj\" (UniqueName: \"kubernetes.io/projected/0effdbcf-dd7d-404d-9d48-77536d665a5d-kube-api-access-mfzkj\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739102 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7afa918d-be67-40a6-803c-d3b0ae99d815-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: W0312 16:52:20.739093 5184 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod542903c2_fc88_4085_979a_db3766958392.slice/crio-be6531304f151cab500745a17c4a40497462de240e264d39f3b2e9904795b7cb WatchSource:0}: Error finding container be6531304f151cab500745a17c4a40497462de240e264d39f3b2e9904795b7cb: Status 404 returned error can't find the container with id be6531304f151cab500745a17c4a40497462de240e264d39f3b2e9904795b7cb Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739113 5184 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739173 5184 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-tmp\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739195 5184 reconciler_common.go:299] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/16bdd140-dce1-464c-ab47-dd5798d1d256-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739213 5184 reconciler_common.go:299] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a7a88189-c967-4640-879e-27665747f20c-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739231 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wj4qr\" (UniqueName: \"kubernetes.io/projected/149b3c48-e17c-4a66-a835-d86dabf6ff13-kube-api-access-wj4qr\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739248 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wbmqg\" (UniqueName: 
\"kubernetes.io/projected/18f80adb-c1c3-49ba-8ee4-932c851d3897-kube-api-access-wbmqg\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739266 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pddnv\" (UniqueName: \"kubernetes.io/projected/e093be35-bb62-4843-b2e8-094545761610-kube-api-access-pddnv\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739287 5184 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7afa918d-be67-40a6-803c-d3b0ae99d815-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739304 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pgx6b\" (UniqueName: \"kubernetes.io/projected/f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4-kube-api-access-pgx6b\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739321 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hm9x7\" (UniqueName: \"kubernetes.io/projected/f559dfa3-3917-43a2-97f6-61ddfda10e93-kube-api-access-hm9x7\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739340 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d4tqq\" (UniqueName: \"kubernetes.io/projected/6ee8fbd3-1f81-4666-96da-5afc70819f1a-kube-api-access-d4tqq\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739358 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xfp5s\" (UniqueName: \"kubernetes.io/projected/cc85e424-18b2-4924-920b-bd291a8c4b01-kube-api-access-xfp5s\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739419 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pllx6\" (UniqueName: 
\"kubernetes.io/projected/81e39f7b-62e4-4fc9-992a-6535ce127a02-kube-api-access-pllx6\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739441 5184 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2325ffef-9d5b-447f-b00e-3efc429acefe-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739459 5184 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-config\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739475 5184 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7599e0b6-bddf-4def-b7f2-0b32206e8651-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739492 5184 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739509 5184 reconciler_common.go:299] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739526 5184 reconciler_common.go:299] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/81e39f7b-62e4-4fc9-992a-6535ce127a02-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739544 5184 reconciler_common.go:299] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 12 
16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739561 5184 reconciler_common.go:299] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739583 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ws8zz\" (UniqueName: \"kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-kube-api-access-ws8zz\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739600 5184 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739619 5184 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739636 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l87hs\" (UniqueName: \"kubernetes.io/projected/5ebfebf6-3ecd-458e-943f-bb25b52e2718-kube-api-access-l87hs\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739656 5184 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc85e424-18b2-4924-920b-bd291a8c4b01-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739697 5184 reconciler_common.go:299] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739715 5184 
reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739733 5184 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9e9b5059-1b3e-4067-a63d-2952cbe863af-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739750 5184 reconciler_common.go:299] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739770 5184 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/92dfbade-90b6-4169-8c07-72cff7f2c82b-config-volume\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739789 5184 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739807 5184 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c5f2bfad-70f6-4185-a3d9-81ce12720767-tmp-dir\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739823 5184 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d7e8f42f-dc0e-424b-bb56-5ec849834888-service-ca\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739930 5184 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a555ff2e-0be6-46d5-897d-863bb92ae2b3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739952 5184 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6077b63e-53a2-4f96-9d56-1ce0324e4913-tmp-dir\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.739998 5184 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.741207 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.741663 5184 kuberuntime_manager.go:1358] "Unhandled Error" err=< Mar 12 16:52:20 crc kubenswrapper[5184]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 12 16:52:20 crc kubenswrapper[5184]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 12 16:52:20 crc kubenswrapper[5184]: 
],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:
,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-djfvr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:fal
se,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-99gtj_openshift-multus(542903c2-fc88-4085-979a-db3766958392): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 12 16:52:20 crc kubenswrapper[5184]: > logger="UnhandledError" Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.742708 5184 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dsgwk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-5jnd7_openshift-network-operator(428b39f5-eb1c-4f65-b7a4-eeb6e84860cc): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.743072 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-99gtj" podUID="542903c2-fc88-4085-979a-db3766958392" Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.744618 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, 
cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-5jnd7" podUID="428b39f5-eb1c-4f65-b7a4-eeb6e84860cc" Mar 12 16:52:20 crc kubenswrapper[5184]: W0312 16:52:20.745630 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1239377_fc5d_40f2_b262_0b9c9448a3cf.slice/crio-2d9096c932689317d31fdd55ac6b1327aabbbb4d15b709b6e6539e26989259ab WatchSource:0}: Error finding container 2d9096c932689317d31fdd55ac6b1327aabbbb4d15b709b6e6539e26989259ab: Status 404 returned error can't find the container with id 2d9096c932689317d31fdd55ac6b1327aabbbb4d15b709b6e6539e26989259ab Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.747195 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-5jnd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dsgwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-5jnd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.748174 5184 kuberuntime_manager.go:1358] "Unhandled Error" err=< Mar 12 16:52:20 crc kubenswrapper[5184]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e,Command:[/bin/bash -c #!/bin/bash Mar 12 16:52:20 crc kubenswrapper[5184]: set -uo pipefail Mar 12 16:52:20 crc kubenswrapper[5184]: Mar 12 16:52:20 crc 
kubenswrapper[5184]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 12 16:52:20 crc kubenswrapper[5184]: Mar 12 16:52:20 crc kubenswrapper[5184]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 12 16:52:20 crc kubenswrapper[5184]: HOSTS_FILE="/etc/hosts" Mar 12 16:52:20 crc kubenswrapper[5184]: TEMP_FILE="/tmp/hosts.tmp" Mar 12 16:52:20 crc kubenswrapper[5184]: Mar 12 16:52:20 crc kubenswrapper[5184]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 12 16:52:20 crc kubenswrapper[5184]: Mar 12 16:52:20 crc kubenswrapper[5184]: # Make a temporary file with the old hosts file's attributes. Mar 12 16:52:20 crc kubenswrapper[5184]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 12 16:52:20 crc kubenswrapper[5184]: echo "Failed to preserve hosts file. Exiting." Mar 12 16:52:20 crc kubenswrapper[5184]: exit 1 Mar 12 16:52:20 crc kubenswrapper[5184]: fi Mar 12 16:52:20 crc kubenswrapper[5184]: Mar 12 16:52:20 crc kubenswrapper[5184]: while true; do Mar 12 16:52:20 crc kubenswrapper[5184]: declare -A svc_ips Mar 12 16:52:20 crc kubenswrapper[5184]: for svc in "${services[@]}"; do Mar 12 16:52:20 crc kubenswrapper[5184]: # Fetch service IP from cluster dns if present. We make several tries Mar 12 16:52:20 crc kubenswrapper[5184]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 12 16:52:20 crc kubenswrapper[5184]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 12 16:52:20 crc kubenswrapper[5184]: # support UDP loadbalancers and require reaching DNS through TCP. 
Mar 12 16:52:20 crc kubenswrapper[5184]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 12 16:52:20 crc kubenswrapper[5184]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 12 16:52:20 crc kubenswrapper[5184]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 12 16:52:20 crc kubenswrapper[5184]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 12 16:52:20 crc kubenswrapper[5184]: for i in ${!cmds[*]} Mar 12 16:52:20 crc kubenswrapper[5184]: do Mar 12 16:52:20 crc kubenswrapper[5184]: ips=($(eval "${cmds[i]}")) Mar 12 16:52:20 crc kubenswrapper[5184]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 12 16:52:20 crc kubenswrapper[5184]: svc_ips["${svc}"]="${ips[@]}" Mar 12 16:52:20 crc kubenswrapper[5184]: break Mar 12 16:52:20 crc kubenswrapper[5184]: fi Mar 12 16:52:20 crc kubenswrapper[5184]: done Mar 12 16:52:20 crc kubenswrapper[5184]: done Mar 12 16:52:20 crc kubenswrapper[5184]: Mar 12 16:52:20 crc kubenswrapper[5184]: # Update /etc/hosts only if we get valid service IPs Mar 12 16:52:20 crc kubenswrapper[5184]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 12 16:52:20 crc kubenswrapper[5184]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 12 16:52:20 crc kubenswrapper[5184]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 12 16:52:20 crc kubenswrapper[5184]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 12 16:52:20 crc kubenswrapper[5184]: if ! 
sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 12 16:52:20 crc kubenswrapper[5184]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 12 16:52:20 crc kubenswrapper[5184]: sleep 60 & wait Mar 12 16:52:20 crc kubenswrapper[5184]: continue Mar 12 16:52:20 crc kubenswrapper[5184]: fi Mar 12 16:52:20 crc kubenswrapper[5184]: Mar 12 16:52:20 crc kubenswrapper[5184]: # Append resolver entries for services Mar 12 16:52:20 crc kubenswrapper[5184]: rc=0 Mar 12 16:52:20 crc kubenswrapper[5184]: for svc in "${!svc_ips[@]}"; do Mar 12 16:52:20 crc kubenswrapper[5184]: for ip in ${svc_ips[${svc}]}; do Mar 12 16:52:20 crc kubenswrapper[5184]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? Mar 12 16:52:20 crc kubenswrapper[5184]: done Mar 12 16:52:20 crc kubenswrapper[5184]: done Mar 12 16:52:20 crc kubenswrapper[5184]: if [[ $rc -ne 0 ]]; then Mar 12 16:52:20 crc kubenswrapper[5184]: sleep 60 & wait Mar 12 16:52:20 crc kubenswrapper[5184]: continue Mar 12 16:52:20 crc kubenswrapper[5184]: fi Mar 12 16:52:20 crc kubenswrapper[5184]: Mar 12 16:52:20 crc kubenswrapper[5184]: Mar 12 16:52:20 crc kubenswrapper[5184]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 12 16:52:20 crc kubenswrapper[5184]: # Replace /etc/hosts with our modified version if needed Mar 12 16:52:20 crc kubenswrapper[5184]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 12 16:52:20 crc kubenswrapper[5184]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 12 16:52:20 crc kubenswrapper[5184]: fi Mar 12 16:52:20 crc kubenswrapper[5184]: sleep 60 & wait Mar 12 16:52:20 crc kubenswrapper[5184]: unset svc_ips Mar 12 16:52:20 crc kubenswrapper[5184]: done Mar 12 16:52:20 crc kubenswrapper[5184]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tmp-dir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k47hx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-ggxxl_openshift-dns(c1239377-fc5d-40f2-b262-0b9c9448a3cf): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 12 16:52:20 crc kubenswrapper[5184]: > logger="UnhandledError" Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.749736 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not 
yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-ggxxl" podUID="c1239377-fc5d-40f2-b262-0b9c9448a3cf" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.751015 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-tnk2c" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.753145 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.753169 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.753178 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.753193 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.753203 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:20Z","lastTransitionTime":"2026-03-12T16:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:20 crc kubenswrapper[5184]: W0312 16:52:20.753693 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b45c859_3d05_4214_9bd3_2952546f5dea.slice/crio-47dec37dc5b71cbd3b6e6e0821bcebfdecc92306484e58dc5478ed32160b4315 WatchSource:0}: Error finding container 47dec37dc5b71cbd3b6e6e0821bcebfdecc92306484e58dc5478ed32160b4315: Status 404 returned error can't find the container with id 47dec37dc5b71cbd3b6e6e0821bcebfdecc92306484e58dc5478ed32160b4315 Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.755457 5184 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.20.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8ljt6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-cp7pt_openshift-machine-config-operator(7b45c859-3d05-4214-9bd3-2952546f5dea): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.756307 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ckfz2" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.756681 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fbdfe828b092b23e6d4480daf3e0216aada6debaf1ef1b314a0a31e73ebf13c4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-5ff7774fd9-nljh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.758022 5184 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt 
--tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8ljt6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-cp7pt_openshift-machine-config-operator(7b45c859-3d05-4214-9bd3-2952546f5dea): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.759193 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" 
podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" Mar 12 16:52:20 crc kubenswrapper[5184]: W0312 16:52:20.764142 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c823004_cd7d_4cea_9cdb_b44a806264ab.slice/crio-cc5cfe2d26f86bdee8456f279320c68d7ef35713fe086101c223e9e832d19f37 WatchSource:0}: Error finding container cc5cfe2d26f86bdee8456f279320c68d7ef35713fe086101c223e9e832d19f37: Status 404 returned error can't find the container with id cc5cfe2d26f86bdee8456f279320c68d7ef35713fe086101c223e9e832d19f37 Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.764213 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-wqfhs" Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.766516 5184 kuberuntime_manager.go:1358] "Unhandled Error" err=< Mar 12 16:52:20 crc kubenswrapper[5184]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 12 16:52:20 crc kubenswrapper[5184]: while [ true ]; Mar 12 16:52:20 crc kubenswrapper[5184]: do Mar 12 16:52:20 crc kubenswrapper[5184]: for f in $(ls /tmp/serviceca); do Mar 12 16:52:20 crc kubenswrapper[5184]: echo $f Mar 12 16:52:20 crc kubenswrapper[5184]: ca_file_path="/tmp/serviceca/${f}" Mar 12 16:52:20 crc kubenswrapper[5184]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 12 16:52:20 crc kubenswrapper[5184]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 12 16:52:20 crc kubenswrapper[5184]: if [ -e "${reg_dir_path}" ]; then Mar 12 16:52:20 crc kubenswrapper[5184]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 12 16:52:20 crc kubenswrapper[5184]: else Mar 12 16:52:20 crc kubenswrapper[5184]: mkdir $reg_dir_path Mar 12 16:52:20 crc kubenswrapper[5184]: cp $ca_file_path 
$reg_dir_path/ca.crt Mar 12 16:52:20 crc kubenswrapper[5184]: fi Mar 12 16:52:20 crc kubenswrapper[5184]: done Mar 12 16:52:20 crc kubenswrapper[5184]: for d in $(ls /etc/docker/certs.d); do Mar 12 16:52:20 crc kubenswrapper[5184]: echo $d Mar 12 16:52:20 crc kubenswrapper[5184]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 12 16:52:20 crc kubenswrapper[5184]: reg_conf_path="/tmp/serviceca/${dp}" Mar 12 16:52:20 crc kubenswrapper[5184]: if [ ! -e "${reg_conf_path}" ]; then Mar 12 16:52:20 crc kubenswrapper[5184]: rm -rf /etc/docker/certs.d/$d Mar 12 16:52:20 crc kubenswrapper[5184]: fi Mar 12 16:52:20 crc kubenswrapper[5184]: done Mar 12 16:52:20 crc kubenswrapper[5184]: sleep 60 & wait ${!} Mar 12 16:52:20 crc kubenswrapper[5184]: done Mar 12 16:52:20 crc kubenswrapper[5184]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w7jw8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,
VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-tnk2c_openshift-image-registry(9c823004-cd7d-4cea-9cdb-b44a806264ab): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 12 16:52:20 crc kubenswrapper[5184]: > logger="UnhandledError" Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.767760 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-tnk2c" podUID="9c823004-cd7d-4cea-9cdb-b44a806264ab" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.769727 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fbdfe828b092b23e6d4480daf3e0216aada6debaf1ef1b314a0a31e73ebf13c4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-5ff7774fd9-nljh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:20 crc kubenswrapper[5184]: W0312 16:52:20.769884 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod766663a7_2c04_43da_a76f_dfacc5b1583a.slice/crio-e2013d2c9bdc8f39c213cf13a5609dbd2a3c8fbfa08241e795885686ca10bf0e WatchSource:0}: Error finding container e2013d2c9bdc8f39c213cf13a5609dbd2a3c8fbfa08241e795885686ca10bf0e: Status 404 returned error can't find the container with id 
e2013d2c9bdc8f39c213cf13a5609dbd2a3c8fbfa08241e795885686ca10bf0e Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.771857 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.774780 5184 kuberuntime_manager.go:1358] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-trmc6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},Restar
tPolicy:nil,} start failed in pod multus-additional-cni-plugins-ckfz2_openshift-multus(766663a7-2c04-43da-a76f-dfacc5b1583a): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 12 16:52:20 crc kubenswrapper[5184]: W0312 16:52:20.775621 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod417740d6_e9c9_4fa8_9811_c6704b5b5692.slice/crio-6d4c192358714a12a140a09e2a1717ce35ef7ba0d88bbbf965f3d237341e2cea WatchSource:0}: Error finding container 6d4c192358714a12a140a09e2a1717ce35ef7ba0d88bbbf965f3d237341e2cea: Status 404 returned error can't find the container with id 6d4c192358714a12a140a09e2a1717ce35ef7ba0d88bbbf965f3d237341e2cea Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.776523 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-ckfz2" podUID="766663a7-2c04-43da-a76f-dfacc5b1583a" Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.777966 5184 kuberuntime_manager.go:1358] "Unhandled Error" err=< Mar 12 16:52:20 crc kubenswrapper[5184]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5,Command:[/bin/bash -c #!/bin/bash Mar 12 16:52:20 crc kubenswrapper[5184]: set -euo pipefail Mar 12 16:52:20 crc kubenswrapper[5184]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 12 16:52:20 crc kubenswrapper[5184]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 12 16:52:20 crc kubenswrapper[5184]: # As the secret mount is optional we must wait for the files to be present. 
Mar 12 16:52:20 crc kubenswrapper[5184]: # The service is created in monitor.yaml and this is created in sdn.yaml. Mar 12 16:52:20 crc kubenswrapper[5184]: TS=$(date +%s) Mar 12 16:52:20 crc kubenswrapper[5184]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 12 16:52:20 crc kubenswrapper[5184]: HAS_LOGGED_INFO=0 Mar 12 16:52:20 crc kubenswrapper[5184]: Mar 12 16:52:20 crc kubenswrapper[5184]: log_missing_certs(){ Mar 12 16:52:20 crc kubenswrapper[5184]: CUR_TS=$(date +%s) Mar 12 16:52:20 crc kubenswrapper[5184]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 12 16:52:20 crc kubenswrapper[5184]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 12 16:52:20 crc kubenswrapper[5184]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 12 16:52:20 crc kubenswrapper[5184]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 12 16:52:20 crc kubenswrapper[5184]: HAS_LOGGED_INFO=1 Mar 12 16:52:20 crc kubenswrapper[5184]: fi Mar 12 16:52:20 crc kubenswrapper[5184]: } Mar 12 16:52:20 crc kubenswrapper[5184]: while [[ ! -f "${TLS_PK}" || ! 
-f "${TLS_CERT}" ]] ; do Mar 12 16:52:20 crc kubenswrapper[5184]: log_missing_certs Mar 12 16:52:20 crc kubenswrapper[5184]: sleep 5 Mar 12 16:52:20 crc kubenswrapper[5184]: done Mar 12 16:52:20 crc kubenswrapper[5184]: Mar 12 16:52:20 crc kubenswrapper[5184]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 12 16:52:20 crc kubenswrapper[5184]: exec /usr/bin/kube-rbac-proxy \ Mar 12 16:52:20 crc kubenswrapper[5184]: --logtostderr \ Mar 12 16:52:20 crc kubenswrapper[5184]: --secure-listen-address=:9108 \ Mar 12 16:52:20 crc kubenswrapper[5184]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 12 16:52:20 crc kubenswrapper[5184]: --upstream=http://127.0.0.1:29108/ \ Mar 12 16:52:20 crc kubenswrapper[5184]: --tls-private-key-file=${TLS_PK} \ Mar 12 16:52:20 crc kubenswrapper[5184]: --tls-cert-file=${TLS_CERT} Mar 12 16:52:20 crc kubenswrapper[5184]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wf2p5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-57b78d8988-wqfhs_openshift-ovn-kubernetes(417740d6-e9c9-4fa8-9811-c6704b5b5692): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 12 16:52:20 crc kubenswrapper[5184]: > logger="UnhandledError" Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.780108 5184 kuberuntime_manager.go:1358] "Unhandled Error" err=< Mar 12 16:52:20 crc kubenswrapper[5184]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122,Command:[/bin/bash -c set -xe Mar 12 16:52:20 crc kubenswrapper[5184]: if [[ -f "/env/_master" ]]; then Mar 12 16:52:20 crc kubenswrapper[5184]: set -o allexport Mar 12 16:52:20 crc kubenswrapper[5184]: source "/env/_master" Mar 12 16:52:20 crc kubenswrapper[5184]: set +o allexport Mar 12 16:52:20 crc kubenswrapper[5184]: fi Mar 12 16:52:20 crc kubenswrapper[5184]: Mar 12 16:52:20 crc kubenswrapper[5184]: ovn_v4_join_subnet_opt= Mar 12 16:52:20 crc kubenswrapper[5184]: if [[ "" != "" ]]; then Mar 12 16:52:20 crc kubenswrapper[5184]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 12 
16:52:20 crc kubenswrapper[5184]: fi Mar 12 16:52:20 crc kubenswrapper[5184]: ovn_v6_join_subnet_opt= Mar 12 16:52:20 crc kubenswrapper[5184]: if [[ "" != "" ]]; then Mar 12 16:52:20 crc kubenswrapper[5184]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 12 16:52:20 crc kubenswrapper[5184]: fi Mar 12 16:52:20 crc kubenswrapper[5184]: Mar 12 16:52:20 crc kubenswrapper[5184]: ovn_v4_transit_switch_subnet_opt= Mar 12 16:52:20 crc kubenswrapper[5184]: if [[ "" != "" ]]; then Mar 12 16:52:20 crc kubenswrapper[5184]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 12 16:52:20 crc kubenswrapper[5184]: fi Mar 12 16:52:20 crc kubenswrapper[5184]: ovn_v6_transit_switch_subnet_opt= Mar 12 16:52:20 crc kubenswrapper[5184]: if [[ "" != "" ]]; then Mar 12 16:52:20 crc kubenswrapper[5184]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 12 16:52:20 crc kubenswrapper[5184]: fi Mar 12 16:52:20 crc kubenswrapper[5184]: Mar 12 16:52:20 crc kubenswrapper[5184]: dns_name_resolver_enabled_flag= Mar 12 16:52:20 crc kubenswrapper[5184]: if [[ "false" == "true" ]]; then Mar 12 16:52:20 crc kubenswrapper[5184]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 12 16:52:20 crc kubenswrapper[5184]: fi Mar 12 16:52:20 crc kubenswrapper[5184]: Mar 12 16:52:20 crc kubenswrapper[5184]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 12 16:52:20 crc kubenswrapper[5184]: Mar 12 16:52:20 crc kubenswrapper[5184]: # This is needed so that converting clusters from GA to TP Mar 12 16:52:20 crc kubenswrapper[5184]: # will rollout control plane pods as well Mar 12 16:52:20 crc kubenswrapper[5184]: network_segmentation_enabled_flag= Mar 12 16:52:20 crc kubenswrapper[5184]: multi_network_enabled_flag= Mar 12 16:52:20 crc kubenswrapper[5184]: if [[ "true" == "true" ]]; then Mar 12 16:52:20 crc kubenswrapper[5184]: multi_network_enabled_flag="--enable-multi-network" Mar 12 16:52:20 crc kubenswrapper[5184]: fi 
Mar 12 16:52:20 crc kubenswrapper[5184]: if [[ "true" == "true" ]]; then Mar 12 16:52:20 crc kubenswrapper[5184]: if [[ "true" != "true" ]]; then Mar 12 16:52:20 crc kubenswrapper[5184]: multi_network_enabled_flag="--enable-multi-network" Mar 12 16:52:20 crc kubenswrapper[5184]: fi Mar 12 16:52:20 crc kubenswrapper[5184]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 12 16:52:20 crc kubenswrapper[5184]: fi Mar 12 16:52:20 crc kubenswrapper[5184]: Mar 12 16:52:20 crc kubenswrapper[5184]: route_advertisements_enable_flag= Mar 12 16:52:20 crc kubenswrapper[5184]: if [[ "false" == "true" ]]; then Mar 12 16:52:20 crc kubenswrapper[5184]: route_advertisements_enable_flag="--enable-route-advertisements" Mar 12 16:52:20 crc kubenswrapper[5184]: fi Mar 12 16:52:20 crc kubenswrapper[5184]: Mar 12 16:52:20 crc kubenswrapper[5184]: preconfigured_udn_addresses_enable_flag= Mar 12 16:52:20 crc kubenswrapper[5184]: if [[ "false" == "true" ]]; then Mar 12 16:52:20 crc kubenswrapper[5184]: preconfigured_udn_addresses_enable_flag="--enable-preconfigured-udn-addresses" Mar 12 16:52:20 crc kubenswrapper[5184]: fi Mar 12 16:52:20 crc kubenswrapper[5184]: Mar 12 16:52:20 crc kubenswrapper[5184]: # Enable multi-network policy if configured (control-plane always full mode) Mar 12 16:52:20 crc kubenswrapper[5184]: multi_network_policy_enabled_flag= Mar 12 16:52:20 crc kubenswrapper[5184]: if [[ "false" == "true" ]]; then Mar 12 16:52:20 crc kubenswrapper[5184]: multi_network_policy_enabled_flag="--enable-multi-networkpolicy" Mar 12 16:52:20 crc kubenswrapper[5184]: fi Mar 12 16:52:20 crc kubenswrapper[5184]: Mar 12 16:52:20 crc kubenswrapper[5184]: # Enable admin network policy if configured (control-plane always full mode) Mar 12 16:52:20 crc kubenswrapper[5184]: admin_network_policy_enabled_flag= Mar 12 16:52:20 crc kubenswrapper[5184]: if [[ "true" == "true" ]]; then Mar 12 16:52:20 crc kubenswrapper[5184]: 
admin_network_policy_enabled_flag="--enable-admin-network-policy" Mar 12 16:52:20 crc kubenswrapper[5184]: fi Mar 12 16:52:20 crc kubenswrapper[5184]: Mar 12 16:52:20 crc kubenswrapper[5184]: if [ "shared" == "shared" ]; then Mar 12 16:52:20 crc kubenswrapper[5184]: gateway_mode_flags="--gateway-mode shared" Mar 12 16:52:20 crc kubenswrapper[5184]: elif [ "shared" == "local" ]; then Mar 12 16:52:20 crc kubenswrapper[5184]: gateway_mode_flags="--gateway-mode local" Mar 12 16:52:20 crc kubenswrapper[5184]: else Mar 12 16:52:20 crc kubenswrapper[5184]: echo "Invalid OVN_GATEWAY_MODE: \"shared\". Must be \"local\" or \"shared\"." Mar 12 16:52:20 crc kubenswrapper[5184]: exit 1 Mar 12 16:52:20 crc kubenswrapper[5184]: fi Mar 12 16:52:20 crc kubenswrapper[5184]: Mar 12 16:52:20 crc kubenswrapper[5184]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 12 16:52:20 crc kubenswrapper[5184]: exec /usr/bin/ovnkube \ Mar 12 16:52:20 crc kubenswrapper[5184]: --enable-interconnect \ Mar 12 16:52:20 crc kubenswrapper[5184]: --init-cluster-manager "${K8S_NODE}" \ Mar 12 16:52:20 crc kubenswrapper[5184]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 12 16:52:20 crc kubenswrapper[5184]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 12 16:52:20 crc kubenswrapper[5184]: --metrics-bind-address "127.0.0.1:29108" \ Mar 12 16:52:20 crc kubenswrapper[5184]: --metrics-enable-pprof \ Mar 12 16:52:20 crc kubenswrapper[5184]: --metrics-enable-config-duration \ Mar 12 16:52:20 crc kubenswrapper[5184]: ${ovn_v4_join_subnet_opt} \ Mar 12 16:52:20 crc kubenswrapper[5184]: ${ovn_v6_join_subnet_opt} \ Mar 12 16:52:20 crc kubenswrapper[5184]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 12 16:52:20 crc kubenswrapper[5184]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 12 16:52:20 crc kubenswrapper[5184]: ${dns_name_resolver_enabled_flag} \ Mar 12 16:52:20 crc kubenswrapper[5184]: ${persistent_ips_enabled_flag} \ Mar 12 16:52:20 crc 
kubenswrapper[5184]: ${multi_network_enabled_flag} \ Mar 12 16:52:20 crc kubenswrapper[5184]: ${network_segmentation_enabled_flag} \ Mar 12 16:52:20 crc kubenswrapper[5184]: ${gateway_mode_flags} \ Mar 12 16:52:20 crc kubenswrapper[5184]: ${route_advertisements_enable_flag} \ Mar 12 16:52:20 crc kubenswrapper[5184]: ${preconfigured_udn_addresses_enable_flag} \ Mar 12 16:52:20 crc kubenswrapper[5184]: --enable-egress-ip=true \ Mar 12 16:52:20 crc kubenswrapper[5184]: --enable-egress-firewall=true \ Mar 12 16:52:20 crc kubenswrapper[5184]: --enable-egress-qos=true \ Mar 12 16:52:20 crc kubenswrapper[5184]: --enable-egress-service=true \ Mar 12 16:52:20 crc kubenswrapper[5184]: --enable-multicast \ Mar 12 16:52:20 crc kubenswrapper[5184]: --enable-multi-external-gateway=true \ Mar 12 16:52:20 crc kubenswrapper[5184]: ${multi_network_policy_enabled_flag} \ Mar 12 16:52:20 crc kubenswrapper[5184]: ${admin_network_policy_enabled_flag} Mar 12 16:52:20 crc kubenswrapper[5184]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wf2p5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-57b78d8988-wqfhs_openshift-ovn-kubernetes(417740d6-e9c9-4fa8-9811-c6704b5b5692): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 12 16:52:20 crc kubenswrapper[5184]: > logger="UnhandledError" Mar 12 16:52:20 crc kubenswrapper[5184]: W0312 16:52:20.780836 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda92c8326_e582_4692_8b35_c5d5dbc1ff6c.slice/crio-8a4ae9c85a6a7f907c79ddfdbd329c4856ddbf46112ed3682519102c877c78d1 WatchSource:0}: Error finding container 8a4ae9c85a6a7f907c79ddfdbd329c4856ddbf46112ed3682519102c877c78d1: Status 404 returned error can't find the container with id 8a4ae9c85a6a7f907c79ddfdbd329c4856ddbf46112ed3682519102c877c78d1 Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.781356 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", 
failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-wqfhs" podUID="417740d6-e9c9-4fa8-9811-c6704b5b5692" Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.783708 5184 kuberuntime_manager.go:1358] "Unhandled Error" err=< Mar 12 16:52:20 crc kubenswrapper[5184]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 12 16:52:20 crc kubenswrapper[5184]: apiVersion: v1 Mar 12 16:52:20 crc kubenswrapper[5184]: clusters: Mar 12 16:52:20 crc kubenswrapper[5184]: - cluster: Mar 12 16:52:20 crc kubenswrapper[5184]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 12 16:52:20 crc kubenswrapper[5184]: server: https://api-int.crc.testing:6443 Mar 12 16:52:20 crc kubenswrapper[5184]: name: default-cluster Mar 12 16:52:20 crc kubenswrapper[5184]: contexts: Mar 12 16:52:20 crc kubenswrapper[5184]: - context: Mar 12 16:52:20 crc kubenswrapper[5184]: cluster: default-cluster Mar 12 16:52:20 crc kubenswrapper[5184]: namespace: default Mar 12 16:52:20 crc kubenswrapper[5184]: user: default-auth Mar 12 16:52:20 crc kubenswrapper[5184]: name: default-context Mar 12 16:52:20 crc kubenswrapper[5184]: current-context: default-context Mar 12 16:52:20 crc kubenswrapper[5184]: kind: Config Mar 12 16:52:20 crc kubenswrapper[5184]: preferences: {} Mar 12 16:52:20 crc kubenswrapper[5184]: users: Mar 12 16:52:20 crc kubenswrapper[5184]: - name: default-auth Mar 12 16:52:20 crc kubenswrapper[5184]: user: Mar 12 16:52:20 crc kubenswrapper[5184]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 12 16:52:20 crc kubenswrapper[5184]: client-key: 
/etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 12 16:52:20 crc kubenswrapper[5184]: EOF Mar 12 16:52:20 crc kubenswrapper[5184]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qz4cp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-6bpj2_openshift-ovn-kubernetes(a92c8326-e582-4692-8b35-c5d5dbc1ff6c): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 12 16:52:20 crc kubenswrapper[5184]: > logger="UnhandledError" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.783979 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34177974-8d82-49d2-a763-391d0df3bbd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-7bdcf4f5bd-7fjxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.785802 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" podUID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.795410 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/multus-99gtj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"542903c2-fc88-4085-979a-db3766958392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djfvr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-99gtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.803847 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-dns/node-resolver-ggxxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1239377-fc5d-40f2-b262-0b9c9448a3cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k47hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ggxxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.810236 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnk2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c823004-cd7d-4cea-9cdb-b44a806264ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7jw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.819578 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-wqfhs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"417740d6-e9c9-4fa8-9811-c6704b5b5692\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wf2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wf2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-57b78d8988-wqfhs\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.827103 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-dgvkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc4541ce-7789-4670-bc75-5c2868e52ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-dgvkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.834060 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b45c859-3d05-4214-9bd3-2952546f5dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ljt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ljt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cp7pt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.847740 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f335ad31-84ab-4bea-b0f2-75eca434a55d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"300m\\\",\\\"memory\\\":\\\"600Mi\\\"},\\\"containerID\\\":\\\"cri-o://7901646b9904fb6a100644a0aacd978a71373a764eea536a29abd51530037c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"300m\\\",\\\"memory\\\":\\\"600Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:02Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"ui
d\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"40m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://850d2c82f02982ba13abc9b9365f5be589329d37001cd14054004a85c6d2e96a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"40m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:03Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://ad74805d602978f7f836e80d83b2ef81f7c4cbbc65155a2268057928abb2f906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"imageID\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:03Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://7696ac1bb100db5ea88c53ca29f38064d11d2600b968872de07c222ad6411720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:03Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://e7c8113246b3ae53a45615d0812c94a8ef10af18f75a421afdf0
afa3ecb09223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:02Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://372fea65170de54e0c9231f60e0ff1ead89a30cc6fc9d1ab1eec694591f285d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372fea65170de54e0c9231f60e
0ff1ead89a30cc6fc9d1ab1eec694591f285d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:50:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd-auto-backup\\\",\\\"name\\\":\\\"etcd-auto-backup-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://b51a266617c16a330f9314d4f763ccae3a4c157aecabbeec95199db504e6d95e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b51a266617c16a330f9314d4f763ccae3a4c157aecabbeec95199db504e6d95e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://b7d9173a97b6d597333bec66e74940ae9e9effd207401a61fc5c529983637156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d9173a97b6d597333bec66e74940ae9e9effd207401a61fc5c529983637156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:51:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:51:01Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.855588 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.855641 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.855650 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.855662 5184 kubelet_node_status.go:736] 
"Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.855673 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:20Z","lastTransitionTime":"2026-03-12T16:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.872163 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e576e89-2381-4f76-a33a-bcf82fa79b03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://a528e6a13bd818b7d3bb0ef864934913eb0b6b9e7573f8f7840799a03c87c0b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\
\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://19f28bf51e41bc5de4afc2b3209eb8a889c06546b1d5a2e0ceaed8c52ee8867a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19f28bf51e41bc5de4afc2b3209eb8a889c06546b1d5a2e0ceaed8c52ee8867a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:50:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\
\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.915314 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f863fff9-286a-45fa-b8f0-8a86994b8440\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7w75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-5bb8f5cd97-xdvz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.949765 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.954933 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-fhkjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17b87002-b798-480a-8e17-83053d698239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwt8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-fhkjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.957166 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.957207 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.957216 5184 
kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.957229 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.957238 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:20Z","lastTransitionTime":"2026-03-12T16:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 16:52:20 crc kubenswrapper[5184]: W0312 16:52:20.960641 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34177974_8d82_49d2_a763_391d0df3bbd8.slice/crio-107e1a90c47bc404bc9f48d692462a93aa56522508fb9595cd6d5864f3b1f0d3 WatchSource:0}: Error finding container 107e1a90c47bc404bc9f48d692462a93aa56522508fb9595cd6d5864f3b1f0d3: Status 404 returned error can't find the container with id 107e1a90c47bc404bc9f48d692462a93aa56522508fb9595cd6d5864f3b1f0d3 Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.962401 5184 kuberuntime_manager.go:1358] "Unhandled Error" err=< Mar 12 16:52:20 crc kubenswrapper[5184]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b,Command:[/bin/bash -c #!/bin/bash Mar 12 16:52:20 crc kubenswrapper[5184]: set -o allexport Mar 12 16:52:20 crc kubenswrapper[5184]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 12 16:52:20 crc kubenswrapper[5184]: source /etc/kubernetes/apiserver-url.env Mar 12 16:52:20 crc kubenswrapper[5184]: else Mar 12 16:52:20 crc kubenswrapper[5184]: echo 
"Error: /etc/kubernetes/apiserver-url.env is missing" Mar 12 16:52:20 crc kubenswrapper[5184]: exit 1 Mar 12 16:52:20 crc kubenswrapper[5184]: fi Mar 12 16:52:20 crc kubenswrapper[5184]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 12 16:52:20 crc kubenswrapper[5184]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.20.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:951276a60f15185a05902cf1ec49b6db3e4f049ec638828b336aed496f8dfc45,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b5000f8f055fd8f734ef74afbd9bd5333a38345cbc4959ddaad728b8394bccd4,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6efa070ceb93cc5fc2e76eab6d9c96ac3c4f8812085d0b6eb6e3f513b5bac782,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3454e762466e22e2a893650b9781823558bc6fdfda2aa4188aff3cb819014c4,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be136d591a0ee
b3f7bedf04aabb5481a23b6645316d5cef3cd5be1787344c2b5,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:49b34ce0d25eec7a6077f4bf21bf7d4e64e598d28785a20b9ee3594423b7de14,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:91997a073272252cac9cd31915ec74217637c55d1abc725107c6eb677ddddc9b,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6a974f04d4aefdb39bf2d4649b24e7e0e87685afa3d07ca46234f1a0c5688e4b,ValueFrom:nil,},EnvVar{Name:NETWORKING
_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fbdfe828b092b23e6d4480daf3e0216aada6debaf1ef1b314a0a31e73ebf13c4,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m7xz2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-7bdcf4f5bd-7fjxv_openshift-network-operator(34177974-8d82-49d2-a763-391d0df3bbd8): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 12 16:52:20 crc kubenswrapper[5184]: > logger="UnhandledError" Mar 12 16:52:20 crc kubenswrapper[5184]: E0312 16:52:20.963562 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" 
pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" podUID="34177974-8d82-49d2-a763-391d0df3bbd8" Mar 12 16:52:20 crc kubenswrapper[5184]: I0312 16:52:20.995564 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ckfz2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"766663a7-2c04-43da-a76f-dfacc5b1583a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6efa070ceb93cc5fc2e76eab6d9c96ac3c4f8812085d0b6eb6e3f513b5bac782\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3454e762466e22e2a893650b9781823558bc6fdfda2aa4188aff3cb819014c4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/etc/whereabouts/config\\\",\\\"
name\\\":\\\"whereabouts-flatfile-configmap\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ckfz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.035776 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxc4c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwc2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:49b34ce0d25eec7a6077f4bf21bf7d4e64e598d28785a20b9ee3594423b7de14\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwc2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxc4c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.042222 5184 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.042349 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 12 16:52:21 crc kubenswrapper[5184]: E0312 16:52:21.042428 5184 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 16:52:21 crc kubenswrapper[5184]: E0312 16:52:21.042458 5184 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.042432 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7w75\" (UniqueName: \"kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75\") pod \"network-check-source-5bb8f5cd97-xdvz5\" (UID: \"f863fff9-286a-45fa-b8f0-8a86994b8440\") " pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Mar 12 16:52:21 crc kubenswrapper[5184]: E0312 16:52:21.042503 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. 
No retries permitted until 2026-03-12 16:52:22.042485041 +0000 UTC m=+84.583796390 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 16:52:21 crc kubenswrapper[5184]: E0312 16:52:21.042537 5184 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 16:52:21 crc kubenswrapper[5184]: E0312 16:52:21.042559 5184 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 16:52:21 crc kubenswrapper[5184]: E0312 16:52:21.042577 5184 projected.go:194] Error preparing data for projected volume kube-api-access-l7w75 for pod openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 16:52:21 crc kubenswrapper[5184]: E0312 16:52:21.042589 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2026-03-12 16:52:22.042540473 +0000 UTC m=+84.583851822 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 16:52:21 crc kubenswrapper[5184]: E0312 16:52:21.042640 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75 podName:f863fff9-286a-45fa-b8f0-8a86994b8440 nodeName:}" failed. No retries permitted until 2026-03-12 16:52:22.042628526 +0000 UTC m=+84.583939885 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-l7w75" (UniqueName: "kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75") pod "network-check-source-5bb8f5cd97-xdvz5" (UID: "f863fff9-286a-45fa-b8f0-8a86994b8440") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.042753 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwt8b\" (UniqueName: \"kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b\") pod \"network-check-target-fhkjl\" (UID: \"17b87002-b798-480a-8e17-83053d698239\") " pod="openshift-network-diagnostics/network-check-target-fhkjl" Mar 12 16:52:21 crc kubenswrapper[5184]: E0312 16:52:21.042858 5184 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 16:52:21 crc kubenswrapper[5184]: E0312 16:52:21.042869 5184 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" 
not registered Mar 12 16:52:21 crc kubenswrapper[5184]: E0312 16:52:21.042877 5184 projected.go:194] Error preparing data for projected volume kube-api-access-gwt8b for pod openshift-network-diagnostics/network-check-target-fhkjl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 16:52:21 crc kubenswrapper[5184]: E0312 16:52:21.042906 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b podName:17b87002-b798-480a-8e17-83053d698239 nodeName:}" failed. No retries permitted until 2026-03-12 16:52:22.042898684 +0000 UTC m=+84.584210023 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-gwt8b" (UniqueName: "kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b") pod "network-check-target-fhkjl" (UID: "17b87002-b798-480a-8e17-83053d698239") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.059280 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.059323 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.059335 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.059350 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.059359 5184 setters.go:618] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:21Z","lastTransitionTime":"2026-03-12T16:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.087281 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bpj2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.116455 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff799ef2-41aa-4972-ae8f-6e29c01bbd76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://d25459743a8e939a6fc0c89681dd4be8f2dbe697494adb6502228b2569ba616f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":
{\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://d14cb7a12881751803c43b69b2ec33ce99548de0cb9d754e7de2f8fe301dabb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://a1d4060b0813ec05d1dff25751605cd6ce575df8a1a0788b3331780009447967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc
7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://1b42bc08c18390a5c946037634004c7dcb6cb14b92f56220260d8237aeedd629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPa
th\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.143901 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:21 crc kubenswrapper[5184]: E0312 16:52:21.144128 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:22.144086283 +0000 UTC m=+84.685397662 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.144431 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df-metrics-certs\") pod \"network-metrics-daemon-vxc4c\" (UID: \"024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df\") " pod="openshift-multus/network-metrics-daemon-vxc4c" Mar 12 16:52:21 crc kubenswrapper[5184]: E0312 16:52:21.144616 5184 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 16:52:21 crc kubenswrapper[5184]: E0312 16:52:21.144719 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df-metrics-certs podName:024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df nodeName:}" failed. No retries permitted until 2026-03-12 16:52:22.144694412 +0000 UTC m=+84.686005801 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df-metrics-certs") pod "network-metrics-daemon-vxc4c" (UID: "024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.159070 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2beca127-92c3-4737-a680-69e0bf3936a5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://cb2f9a89582795adf9b1f2e114f29fbcca43ccc0ac07b56136f3c09a99af6c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\
\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://ca237a33c864a01251750a3a9498ffbf76c02d7c465fec64514b916084eb3a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://2bfb723f8c449cda9730d31e02d633c5bc26368677283970a7d7977e8b14823c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:01Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://ac72b67adf21d403862dd2fc6e1a23c70ce40f83d6700823e7517d3ac39a3313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac72b67adf21d403862dd2fc6e1a23c70ce40f83d6700823e7517d3ac39a3313\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:50:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"phase\\\":\\\"Running\\\",\
\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.161641 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.161706 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.161724 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.161751 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.161769 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:21Z","lastTransitionTime":"2026-03-12T16:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.201668 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"766f2ece-d155-473b-bc1e-ceca5d270675\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"},\\\"containerID\\\":\\\"cri-o://408cf1afe10e6c8bd0bdbe1cc632606b92ab152449ba7113c76692e36ac3f8e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-bundle-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://ba6beb26fb249f80a1c0e7a6faa3577c82429ab6acd0c17cd141a795b06adba4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/open
shift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:01Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://c3a3261bd304f727996d3de0ec8e9372c0f24ee323171fc078f86a529dc3ae51\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://c5c91e816d
332f9146ddc05817c56c9d67c53b64a57f352bf2e9af1b2fdb1ba4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5c91e816d332f9146ddc05817c56c9d67c53b64a57f352bf2e9af1b2fdb1ba4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T16:51:55Z\\\",\\\"message\\\":\\\"vvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InOrderInformers\\\\\\\" enabled=true\\\\nW0312 16:51:55.515939 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 16:51:55.516181 1 builder.go:304] check-endpoints version v0.0.0-unknown-c3d9642-c3d9642\\\\nI0312 16:51:55.517475 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361147659/tls.crt::/tmp/serving-cert-1361147659/tls.key\\\\\\\"\\\\nI0312 16:51:55.968978 1 requestheader_controller.go:255] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 16:51:55.972281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 16:51:55.972308 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 16:51:55.972347 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 16:51:55.972358 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 16:51:55.979594 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0312 16:51:55.979642 1 genericapiserver.go:546] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0312 16:51:55.979645 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 16:51:55.979674 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 16:51:55.979687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 16:51:55.979702 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 16:51:55.979708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 16:51:55.979715 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0312 16:51:55.981661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T16:51:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://965856e85472800697a7882409776407f3dcefaafd9ffc6d31ca6d51466d15f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:01Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://85ac5f92560a2e60d997c4973bd2fd54060e553b853a3288c29cc31c11cad328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85ac5f92560a2e60d997c4973bd2fd54060e553b853a3288c29cc31c11cad328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:50:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.237627 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-5jnd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dsgwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-5jnd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.263942 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.264195 5184 kubelet_node_status.go:736] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.264287 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.264350 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.264424 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:21Z","lastTransitionTime":"2026-03-12T16:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.367471 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.367535 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.367554 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.367578 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.367596 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:21Z","lastTransitionTime":"2026-03-12T16:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration 
file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.469580 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.469645 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.469663 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.469689 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.469707 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:21Z","lastTransitionTime":"2026-03-12T16:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.571915 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.571959 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.571971 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.571987 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.571999 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:21Z","lastTransitionTime":"2026-03-12T16:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.673972 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.674050 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.674081 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.674131 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.674156 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:21Z","lastTransitionTime":"2026-03-12T16:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.724143 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" event={"ID":"34177974-8d82-49d2-a763-391d0df3bbd8","Type":"ContainerStarted","Data":"107e1a90c47bc404bc9f48d692462a93aa56522508fb9595cd6d5864f3b1f0d3"} Mar 12 16:52:21 crc kubenswrapper[5184]: E0312 16:52:21.726424 5184 kuberuntime_manager.go:1358] "Unhandled Error" err=< Mar 12 16:52:21 crc kubenswrapper[5184]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b,Command:[/bin/bash -c #!/bin/bash Mar 12 16:52:21 crc kubenswrapper[5184]: set -o allexport Mar 12 16:52:21 crc kubenswrapper[5184]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 12 16:52:21 crc kubenswrapper[5184]: source /etc/kubernetes/apiserver-url.env Mar 12 16:52:21 crc kubenswrapper[5184]: else Mar 12 16:52:21 crc kubenswrapper[5184]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 12 16:52:21 crc kubenswrapper[5184]: exit 1 Mar 12 16:52:21 crc kubenswrapper[5184]: fi Mar 12 16:52:21 crc kubenswrapper[5184]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 12 16:52:21 crc kubenswrapper[5184]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.20.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:951276a60f15185a05902cf1ec49b6db3e4f049ec638828b336aed496f8dfc45,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b5000f8f055fd8f734ef74afbd9bd5333a38345cbc4959ddaad728b8394bccd4,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6efa070ceb93cc5fc2e76eab6d9c96ac3c4f8812085d0b6eb6e3f513b5bac782,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3454e762466e22e2a893650b9781823558bc6fdfda2aa4188aff3cb819014c4,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be136d591a0eeb3f7bedf04aabb5481a23b6645316d5cef3cd5be1787344c2b5,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:49b34ce0d25eec7a6077f4bf21bf7d4e64e598d28785a20b9ee3594423b7de14,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:91997a073272252cac9cd31915ec74217637c55d1abc725107c6eb677ddddc9b,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6a974f04d4aefdb39bf2d4649b24e7e0e87685afa3d07ca46234f1a0c5688e4b,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fbdfe828b092b23e6d4480daf3e0216aada6debaf1ef1b314a0a31e73ebf13c4,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m7xz2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-7bdcf4f5bd-7fjxv_openshift-network-operator(34177974-8d82-49d2-a763-391d0df3bbd8): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 12 16:52:21 crc kubenswrapper[5184]: > logger="UnhandledError" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.726481 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" event={"ID":"a92c8326-e582-4692-8b35-c5d5dbc1ff6c","Type":"ContainerStarted","Data":"8a4ae9c85a6a7f907c79ddfdbd329c4856ddbf46112ed3682519102c877c78d1"} Mar 12 16:52:21 crc kubenswrapper[5184]: E0312 16:52:21.727817 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" 
pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" podUID="34177974-8d82-49d2-a763-391d0df3bbd8" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.728303 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-wqfhs" event={"ID":"417740d6-e9c9-4fa8-9811-c6704b5b5692","Type":"ContainerStarted","Data":"6d4c192358714a12a140a09e2a1717ce35ef7ba0d88bbbf965f3d237341e2cea"} Mar 12 16:52:21 crc kubenswrapper[5184]: E0312 16:52:21.729271 5184 kuberuntime_manager.go:1358] "Unhandled Error" err=< Mar 12 16:52:21 crc kubenswrapper[5184]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 12 16:52:21 crc kubenswrapper[5184]: apiVersion: v1 Mar 12 16:52:21 crc kubenswrapper[5184]: clusters: Mar 12 16:52:21 crc kubenswrapper[5184]: - cluster: Mar 12 16:52:21 crc kubenswrapper[5184]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 12 16:52:21 crc kubenswrapper[5184]: server: https://api-int.crc.testing:6443 Mar 12 16:52:21 crc kubenswrapper[5184]: name: default-cluster Mar 12 16:52:21 crc kubenswrapper[5184]: contexts: Mar 12 16:52:21 crc kubenswrapper[5184]: - context: Mar 12 16:52:21 crc kubenswrapper[5184]: cluster: default-cluster Mar 12 16:52:21 crc kubenswrapper[5184]: namespace: default Mar 12 16:52:21 crc kubenswrapper[5184]: user: default-auth Mar 12 16:52:21 crc kubenswrapper[5184]: name: default-context Mar 12 16:52:21 crc kubenswrapper[5184]: current-context: default-context Mar 12 16:52:21 crc kubenswrapper[5184]: kind: Config Mar 12 16:52:21 crc kubenswrapper[5184]: preferences: {} Mar 12 16:52:21 crc kubenswrapper[5184]: users: Mar 12 16:52:21 crc kubenswrapper[5184]: - name: default-auth Mar 12 16:52:21 crc kubenswrapper[5184]: user: Mar 12 16:52:21 crc kubenswrapper[5184]: 
client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 12 16:52:21 crc kubenswrapper[5184]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 12 16:52:21 crc kubenswrapper[5184]: EOF Mar 12 16:52:21 crc kubenswrapper[5184]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qz4cp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-6bpj2_openshift-ovn-kubernetes(a92c8326-e582-4692-8b35-c5d5dbc1ff6c): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 12 16:52:21 crc kubenswrapper[5184]: > logger="UnhandledError" Mar 12 16:52:21 crc kubenswrapper[5184]: E0312 16:52:21.730139 5184 kuberuntime_manager.go:1358] "Unhandled Error" err=< Mar 12 16:52:21 crc kubenswrapper[5184]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5,Command:[/bin/bash -c #!/bin/bash Mar 12 16:52:21 crc kubenswrapper[5184]: set -euo pipefail Mar 12 16:52:21 crc kubenswrapper[5184]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 12 16:52:21 crc kubenswrapper[5184]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 
12 16:52:21 crc kubenswrapper[5184]: # As the secret mount is optional we must wait for the files to be present. Mar 12 16:52:21 crc kubenswrapper[5184]: # The service is created in monitor.yaml and this is created in sdn.yaml. Mar 12 16:52:21 crc kubenswrapper[5184]: TS=$(date +%s) Mar 12 16:52:21 crc kubenswrapper[5184]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 12 16:52:21 crc kubenswrapper[5184]: HAS_LOGGED_INFO=0 Mar 12 16:52:21 crc kubenswrapper[5184]: Mar 12 16:52:21 crc kubenswrapper[5184]: log_missing_certs(){ Mar 12 16:52:21 crc kubenswrapper[5184]: CUR_TS=$(date +%s) Mar 12 16:52:21 crc kubenswrapper[5184]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 12 16:52:21 crc kubenswrapper[5184]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 12 16:52:21 crc kubenswrapper[5184]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 12 16:52:21 crc kubenswrapper[5184]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 12 16:52:21 crc kubenswrapper[5184]: HAS_LOGGED_INFO=1 Mar 12 16:52:21 crc kubenswrapper[5184]: fi Mar 12 16:52:21 crc kubenswrapper[5184]: } Mar 12 16:52:21 crc kubenswrapper[5184]: while [[ ! -f "${TLS_PK}" || ! 
-f "${TLS_CERT}" ]] ; do Mar 12 16:52:21 crc kubenswrapper[5184]: log_missing_certs Mar 12 16:52:21 crc kubenswrapper[5184]: sleep 5 Mar 12 16:52:21 crc kubenswrapper[5184]: done Mar 12 16:52:21 crc kubenswrapper[5184]: Mar 12 16:52:21 crc kubenswrapper[5184]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 12 16:52:21 crc kubenswrapper[5184]: exec /usr/bin/kube-rbac-proxy \ Mar 12 16:52:21 crc kubenswrapper[5184]: --logtostderr \ Mar 12 16:52:21 crc kubenswrapper[5184]: --secure-listen-address=:9108 \ Mar 12 16:52:21 crc kubenswrapper[5184]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 12 16:52:21 crc kubenswrapper[5184]: --upstream=http://127.0.0.1:29108/ \ Mar 12 16:52:21 crc kubenswrapper[5184]: --tls-private-key-file=${TLS_PK} \ Mar 12 16:52:21 crc kubenswrapper[5184]: --tls-cert-file=${TLS_CERT} Mar 12 16:52:21 crc kubenswrapper[5184]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wf2p5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-57b78d8988-wqfhs_openshift-ovn-kubernetes(417740d6-e9c9-4fa8-9811-c6704b5b5692): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 12 16:52:21 crc kubenswrapper[5184]: > logger="UnhandledError" Mar 12 16:52:21 crc kubenswrapper[5184]: E0312 16:52:21.730462 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" podUID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.730628 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" event={"ID":"7b45c859-3d05-4214-9bd3-2952546f5dea","Type":"ContainerStarted","Data":"47dec37dc5b71cbd3b6e6e0821bcebfdecc92306484e58dc5478ed32160b4315"} Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.732694 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5jnd7" 
event={"ID":"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc","Type":"ContainerStarted","Data":"9da79ed3da4a1c43dacc4258e27beb6dd24e70d19634256c7e18f11d498cba10"} Mar 12 16:52:21 crc kubenswrapper[5184]: E0312 16:52:21.733566 5184 kuberuntime_manager.go:1358] "Unhandled Error" err=< Mar 12 16:52:21 crc kubenswrapper[5184]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122,Command:[/bin/bash -c set -xe Mar 12 16:52:21 crc kubenswrapper[5184]: if [[ -f "/env/_master" ]]; then Mar 12 16:52:21 crc kubenswrapper[5184]: set -o allexport Mar 12 16:52:21 crc kubenswrapper[5184]: source "/env/_master" Mar 12 16:52:21 crc kubenswrapper[5184]: set +o allexport Mar 12 16:52:21 crc kubenswrapper[5184]: fi Mar 12 16:52:21 crc kubenswrapper[5184]: Mar 12 16:52:21 crc kubenswrapper[5184]: ovn_v4_join_subnet_opt= Mar 12 16:52:21 crc kubenswrapper[5184]: if [[ "" != "" ]]; then Mar 12 16:52:21 crc kubenswrapper[5184]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 12 16:52:21 crc kubenswrapper[5184]: fi Mar 12 16:52:21 crc kubenswrapper[5184]: ovn_v6_join_subnet_opt= Mar 12 16:52:21 crc kubenswrapper[5184]: if [[ "" != "" ]]; then Mar 12 16:52:21 crc kubenswrapper[5184]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 12 16:52:21 crc kubenswrapper[5184]: fi Mar 12 16:52:21 crc kubenswrapper[5184]: Mar 12 16:52:21 crc kubenswrapper[5184]: ovn_v4_transit_switch_subnet_opt= Mar 12 16:52:21 crc kubenswrapper[5184]: if [[ "" != "" ]]; then Mar 12 16:52:21 crc kubenswrapper[5184]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 12 16:52:21 crc kubenswrapper[5184]: fi Mar 12 16:52:21 crc kubenswrapper[5184]: ovn_v6_transit_switch_subnet_opt= Mar 12 16:52:21 crc kubenswrapper[5184]: if [[ "" != "" ]]; then Mar 12 16:52:21 crc kubenswrapper[5184]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 
12 16:52:21 crc kubenswrapper[5184]: fi Mar 12 16:52:21 crc kubenswrapper[5184]: Mar 12 16:52:21 crc kubenswrapper[5184]: dns_name_resolver_enabled_flag= Mar 12 16:52:21 crc kubenswrapper[5184]: if [[ "false" == "true" ]]; then Mar 12 16:52:21 crc kubenswrapper[5184]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 12 16:52:21 crc kubenswrapper[5184]: fi Mar 12 16:52:21 crc kubenswrapper[5184]: Mar 12 16:52:21 crc kubenswrapper[5184]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 12 16:52:21 crc kubenswrapper[5184]: Mar 12 16:52:21 crc kubenswrapper[5184]: # This is needed so that converting clusters from GA to TP Mar 12 16:52:21 crc kubenswrapper[5184]: # will rollout control plane pods as well Mar 12 16:52:21 crc kubenswrapper[5184]: network_segmentation_enabled_flag= Mar 12 16:52:21 crc kubenswrapper[5184]: multi_network_enabled_flag= Mar 12 16:52:21 crc kubenswrapper[5184]: if [[ "true" == "true" ]]; then Mar 12 16:52:21 crc kubenswrapper[5184]: multi_network_enabled_flag="--enable-multi-network" Mar 12 16:52:21 crc kubenswrapper[5184]: fi Mar 12 16:52:21 crc kubenswrapper[5184]: if [[ "true" == "true" ]]; then Mar 12 16:52:21 crc kubenswrapper[5184]: if [[ "true" != "true" ]]; then Mar 12 16:52:21 crc kubenswrapper[5184]: multi_network_enabled_flag="--enable-multi-network" Mar 12 16:52:21 crc kubenswrapper[5184]: fi Mar 12 16:52:21 crc kubenswrapper[5184]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 12 16:52:21 crc kubenswrapper[5184]: fi Mar 12 16:52:21 crc kubenswrapper[5184]: Mar 12 16:52:21 crc kubenswrapper[5184]: route_advertisements_enable_flag= Mar 12 16:52:21 crc kubenswrapper[5184]: if [[ "false" == "true" ]]; then Mar 12 16:52:21 crc kubenswrapper[5184]: route_advertisements_enable_flag="--enable-route-advertisements" Mar 12 16:52:21 crc kubenswrapper[5184]: fi Mar 12 16:52:21 crc kubenswrapper[5184]: Mar 12 16:52:21 crc kubenswrapper[5184]: preconfigured_udn_addresses_enable_flag= Mar 12 
16:52:21 crc kubenswrapper[5184]: if [[ "false" == "true" ]]; then Mar 12 16:52:21 crc kubenswrapper[5184]: preconfigured_udn_addresses_enable_flag="--enable-preconfigured-udn-addresses" Mar 12 16:52:21 crc kubenswrapper[5184]: fi Mar 12 16:52:21 crc kubenswrapper[5184]: Mar 12 16:52:21 crc kubenswrapper[5184]: # Enable multi-network policy if configured (control-plane always full mode) Mar 12 16:52:21 crc kubenswrapper[5184]: multi_network_policy_enabled_flag= Mar 12 16:52:21 crc kubenswrapper[5184]: if [[ "false" == "true" ]]; then Mar 12 16:52:21 crc kubenswrapper[5184]: multi_network_policy_enabled_flag="--enable-multi-networkpolicy" Mar 12 16:52:21 crc kubenswrapper[5184]: fi Mar 12 16:52:21 crc kubenswrapper[5184]: Mar 12 16:52:21 crc kubenswrapper[5184]: # Enable admin network policy if configured (control-plane always full mode) Mar 12 16:52:21 crc kubenswrapper[5184]: admin_network_policy_enabled_flag= Mar 12 16:52:21 crc kubenswrapper[5184]: if [[ "true" == "true" ]]; then Mar 12 16:52:21 crc kubenswrapper[5184]: admin_network_policy_enabled_flag="--enable-admin-network-policy" Mar 12 16:52:21 crc kubenswrapper[5184]: fi Mar 12 16:52:21 crc kubenswrapper[5184]: Mar 12 16:52:21 crc kubenswrapper[5184]: if [ "shared" == "shared" ]; then Mar 12 16:52:21 crc kubenswrapper[5184]: gateway_mode_flags="--gateway-mode shared" Mar 12 16:52:21 crc kubenswrapper[5184]: elif [ "shared" == "local" ]; then Mar 12 16:52:21 crc kubenswrapper[5184]: gateway_mode_flags="--gateway-mode local" Mar 12 16:52:21 crc kubenswrapper[5184]: else Mar 12 16:52:21 crc kubenswrapper[5184]: echo "Invalid OVN_GATEWAY_MODE: \"shared\". Must be \"local\" or \"shared\"." 
Mar 12 16:52:21 crc kubenswrapper[5184]: exit 1 Mar 12 16:52:21 crc kubenswrapper[5184]: fi Mar 12 16:52:21 crc kubenswrapper[5184]: Mar 12 16:52:21 crc kubenswrapper[5184]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 12 16:52:21 crc kubenswrapper[5184]: exec /usr/bin/ovnkube \ Mar 12 16:52:21 crc kubenswrapper[5184]: --enable-interconnect \ Mar 12 16:52:21 crc kubenswrapper[5184]: --init-cluster-manager "${K8S_NODE}" \ Mar 12 16:52:21 crc kubenswrapper[5184]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 12 16:52:21 crc kubenswrapper[5184]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 12 16:52:21 crc kubenswrapper[5184]: --metrics-bind-address "127.0.0.1:29108" \ Mar 12 16:52:21 crc kubenswrapper[5184]: --metrics-enable-pprof \ Mar 12 16:52:21 crc kubenswrapper[5184]: --metrics-enable-config-duration \ Mar 12 16:52:21 crc kubenswrapper[5184]: ${ovn_v4_join_subnet_opt} \ Mar 12 16:52:21 crc kubenswrapper[5184]: ${ovn_v6_join_subnet_opt} \ Mar 12 16:52:21 crc kubenswrapper[5184]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 12 16:52:21 crc kubenswrapper[5184]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 12 16:52:21 crc kubenswrapper[5184]: ${dns_name_resolver_enabled_flag} \ Mar 12 16:52:21 crc kubenswrapper[5184]: ${persistent_ips_enabled_flag} \ Mar 12 16:52:21 crc kubenswrapper[5184]: ${multi_network_enabled_flag} \ Mar 12 16:52:21 crc kubenswrapper[5184]: ${network_segmentation_enabled_flag} \ Mar 12 16:52:21 crc kubenswrapper[5184]: ${gateway_mode_flags} \ Mar 12 16:52:21 crc kubenswrapper[5184]: ${route_advertisements_enable_flag} \ Mar 12 16:52:21 crc kubenswrapper[5184]: ${preconfigured_udn_addresses_enable_flag} \ Mar 12 16:52:21 crc kubenswrapper[5184]: --enable-egress-ip=true \ Mar 12 16:52:21 crc kubenswrapper[5184]: --enable-egress-firewall=true \ Mar 12 16:52:21 crc kubenswrapper[5184]: --enable-egress-qos=true \ Mar 12 16:52:21 crc kubenswrapper[5184]: --enable-egress-service=true \ 
Mar 12 16:52:21 crc kubenswrapper[5184]: --enable-multicast \ Mar 12 16:52:21 crc kubenswrapper[5184]: --enable-multi-external-gateway=true \ Mar 12 16:52:21 crc kubenswrapper[5184]: ${multi_network_policy_enabled_flag} \ Mar 12 16:52:21 crc kubenswrapper[5184]: ${admin_network_policy_enabled_flag} Mar 12 16:52:21 crc kubenswrapper[5184]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wf2p5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
ovnkube-control-plane-57b78d8988-wqfhs_openshift-ovn-kubernetes(417740d6-e9c9-4fa8-9811-c6704b5b5692): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 12 16:52:21 crc kubenswrapper[5184]: > logger="UnhandledError" Mar 12 16:52:21 crc kubenswrapper[5184]: E0312 16:52:21.735200 5184 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dsgwk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-5jnd7_openshift-network-operator(428b39f5-eb1c-4f65-b7a4-eeb6e84860cc): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 12 16:52:21 crc kubenswrapper[5184]: E0312 16:52:21.735509 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-wqfhs" podUID="417740d6-e9c9-4fa8-9811-c6704b5b5692" Mar 12 16:52:21 crc kubenswrapper[5184]: E0312 
16:52:21.735208 5184 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.20.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8ljt6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-cp7pt_openshift-machine-config-operator(7b45c859-3d05-4214-9bd3-2952546f5dea): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.736093 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tnk2c" event={"ID":"9c823004-cd7d-4cea-9cdb-b44a806264ab","Type":"ContainerStarted","Data":"cc5cfe2d26f86bdee8456f279320c68d7ef35713fe086101c223e9e832d19f37"} Mar 12 16:52:21 crc kubenswrapper[5184]: E0312 16:52:21.736412 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-5jnd7" podUID="428b39f5-eb1c-4f65-b7a4-eeb6e84860cc" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.738005 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ggxxl" 
event={"ID":"c1239377-fc5d-40f2-b262-0b9c9448a3cf","Type":"ContainerStarted","Data":"2d9096c932689317d31fdd55ac6b1327aabbbb4d15b709b6e6539e26989259ab"} Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.740720 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-99gtj" event={"ID":"542903c2-fc88-4085-979a-db3766958392","Type":"ContainerStarted","Data":"be6531304f151cab500745a17c4a40497462de240e264d39f3b2e9904795b7cb"} Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.741656 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f863fff9-286a-45fa-b8f0-8a86994b8440\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7w75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-5bb8f5cd97-xdvz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:21 crc kubenswrapper[5184]: E0312 16:52:21.741700 5184 kuberuntime_manager.go:1358] "Unhandled Error" err=< Mar 12 16:52:21 crc kubenswrapper[5184]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 12 16:52:21 crc kubenswrapper[5184]: while [ true ]; Mar 12 16:52:21 crc kubenswrapper[5184]: do Mar 12 16:52:21 crc kubenswrapper[5184]: for f in $(ls /tmp/serviceca); do Mar 12 16:52:21 crc kubenswrapper[5184]: echo $f Mar 12 16:52:21 crc kubenswrapper[5184]: ca_file_path="/tmp/serviceca/${f}" Mar 12 16:52:21 crc kubenswrapper[5184]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 12 16:52:21 crc kubenswrapper[5184]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 12 16:52:21 crc kubenswrapper[5184]: if [ -e "${reg_dir_path}" ]; then Mar 12 16:52:21 crc kubenswrapper[5184]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 12 16:52:21 crc kubenswrapper[5184]: else Mar 12 16:52:21 crc kubenswrapper[5184]: mkdir $reg_dir_path Mar 12 16:52:21 crc 
kubenswrapper[5184]: cp $ca_file_path $reg_dir_path/ca.crt Mar 12 16:52:21 crc kubenswrapper[5184]: fi Mar 12 16:52:21 crc kubenswrapper[5184]: done Mar 12 16:52:21 crc kubenswrapper[5184]: for d in $(ls /etc/docker/certs.d); do Mar 12 16:52:21 crc kubenswrapper[5184]: echo $d Mar 12 16:52:21 crc kubenswrapper[5184]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 12 16:52:21 crc kubenswrapper[5184]: reg_conf_path="/tmp/serviceca/${dp}" Mar 12 16:52:21 crc kubenswrapper[5184]: if [ ! -e "${reg_conf_path}" ]; then Mar 12 16:52:21 crc kubenswrapper[5184]: rm -rf /etc/docker/certs.d/$d Mar 12 16:52:21 crc kubenswrapper[5184]: fi Mar 12 16:52:21 crc kubenswrapper[5184]: done Mar 12 16:52:21 crc kubenswrapper[5184]: sleep 60 & wait ${!} Mar 12 16:52:21 crc kubenswrapper[5184]: done Mar 12 16:52:21 crc kubenswrapper[5184]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w7jw8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-tnk2c_openshift-image-registry(9c823004-cd7d-4cea-9cdb-b44a806264ab): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 12 16:52:21 crc kubenswrapper[5184]: > logger="UnhandledError" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.742078 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ckfz2" event={"ID":"766663a7-2c04-43da-a76f-dfacc5b1583a","Type":"ContainerStarted","Data":"e2013d2c9bdc8f39c213cf13a5609dbd2a3c8fbfa08241e795885686ca10bf0e"} Mar 12 16:52:21 crc kubenswrapper[5184]: E0312 16:52:21.742910 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" 
pod="openshift-image-registry/node-ca-tnk2c" podUID="9c823004-cd7d-4cea-9cdb-b44a806264ab" Mar 12 16:52:21 crc kubenswrapper[5184]: E0312 16:52:21.743046 5184 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8ljt6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-cp7pt_openshift-machine-config-operator(7b45c859-3d05-4214-9bd3-2952546f5dea): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 12 16:52:21 crc kubenswrapper[5184]: E0312 16:52:21.743611 5184 kuberuntime_manager.go:1358] "Unhandled Error" err=< Mar 12 16:52:21 crc kubenswrapper[5184]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e,Command:[/bin/bash -c #!/bin/bash Mar 12 16:52:21 crc kubenswrapper[5184]: set -uo pipefail Mar 12 16:52:21 crc kubenswrapper[5184]: Mar 12 16:52:21 crc kubenswrapper[5184]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 12 16:52:21 crc kubenswrapper[5184]: Mar 12 16:52:21 crc kubenswrapper[5184]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 12 16:52:21 crc kubenswrapper[5184]: HOSTS_FILE="/etc/hosts" Mar 12 16:52:21 crc kubenswrapper[5184]: TEMP_FILE="/tmp/hosts.tmp" Mar 12 16:52:21 crc 
kubenswrapper[5184]: Mar 12 16:52:21 crc kubenswrapper[5184]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 12 16:52:21 crc kubenswrapper[5184]: Mar 12 16:52:21 crc kubenswrapper[5184]: # Make a temporary file with the old hosts file's attributes. Mar 12 16:52:21 crc kubenswrapper[5184]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 12 16:52:21 crc kubenswrapper[5184]: echo "Failed to preserve hosts file. Exiting." Mar 12 16:52:21 crc kubenswrapper[5184]: exit 1 Mar 12 16:52:21 crc kubenswrapper[5184]: fi Mar 12 16:52:21 crc kubenswrapper[5184]: Mar 12 16:52:21 crc kubenswrapper[5184]: while true; do Mar 12 16:52:21 crc kubenswrapper[5184]: declare -A svc_ips Mar 12 16:52:21 crc kubenswrapper[5184]: for svc in "${services[@]}"; do Mar 12 16:52:21 crc kubenswrapper[5184]: # Fetch service IP from cluster dns if present. We make several tries Mar 12 16:52:21 crc kubenswrapper[5184]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 12 16:52:21 crc kubenswrapper[5184]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 12 16:52:21 crc kubenswrapper[5184]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 12 16:52:21 crc kubenswrapper[5184]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 12 16:52:21 crc kubenswrapper[5184]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 12 16:52:21 crc kubenswrapper[5184]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 12 16:52:21 crc kubenswrapper[5184]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 12 16:52:21 crc kubenswrapper[5184]: for i in ${!cmds[*]} Mar 12 16:52:21 crc kubenswrapper[5184]: do Mar 12 16:52:21 crc kubenswrapper[5184]: ips=($(eval "${cmds[i]}")) Mar 12 16:52:21 crc kubenswrapper[5184]: if [[ "$?" 
-eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 12 16:52:21 crc kubenswrapper[5184]: svc_ips["${svc}"]="${ips[@]}" Mar 12 16:52:21 crc kubenswrapper[5184]: break Mar 12 16:52:21 crc kubenswrapper[5184]: fi Mar 12 16:52:21 crc kubenswrapper[5184]: done Mar 12 16:52:21 crc kubenswrapper[5184]: done Mar 12 16:52:21 crc kubenswrapper[5184]: Mar 12 16:52:21 crc kubenswrapper[5184]: # Update /etc/hosts only if we get valid service IPs Mar 12 16:52:21 crc kubenswrapper[5184]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 12 16:52:21 crc kubenswrapper[5184]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 12 16:52:21 crc kubenswrapper[5184]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 12 16:52:21 crc kubenswrapper[5184]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 12 16:52:21 crc kubenswrapper[5184]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 12 16:52:21 crc kubenswrapper[5184]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 12 16:52:21 crc kubenswrapper[5184]: sleep 60 & wait Mar 12 16:52:21 crc kubenswrapper[5184]: continue Mar 12 16:52:21 crc kubenswrapper[5184]: fi Mar 12 16:52:21 crc kubenswrapper[5184]: Mar 12 16:52:21 crc kubenswrapper[5184]: # Append resolver entries for services Mar 12 16:52:21 crc kubenswrapper[5184]: rc=0 Mar 12 16:52:21 crc kubenswrapper[5184]: for svc in "${!svc_ips[@]}"; do Mar 12 16:52:21 crc kubenswrapper[5184]: for ip in ${svc_ips[${svc}]}; do Mar 12 16:52:21 crc kubenswrapper[5184]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? 
Mar 12 16:52:21 crc kubenswrapper[5184]: done Mar 12 16:52:21 crc kubenswrapper[5184]: done Mar 12 16:52:21 crc kubenswrapper[5184]: if [[ $rc -ne 0 ]]; then Mar 12 16:52:21 crc kubenswrapper[5184]: sleep 60 & wait Mar 12 16:52:21 crc kubenswrapper[5184]: continue Mar 12 16:52:21 crc kubenswrapper[5184]: fi Mar 12 16:52:21 crc kubenswrapper[5184]: Mar 12 16:52:21 crc kubenswrapper[5184]: Mar 12 16:52:21 crc kubenswrapper[5184]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 12 16:52:21 crc kubenswrapper[5184]: # Replace /etc/hosts with our modified version if needed Mar 12 16:52:21 crc kubenswrapper[5184]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 12 16:52:21 crc kubenswrapper[5184]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 12 16:52:21 crc kubenswrapper[5184]: fi Mar 12 16:52:21 crc kubenswrapper[5184]: sleep 60 & wait Mar 12 16:52:21 crc kubenswrapper[5184]: unset svc_ips Mar 12 16:52:21 crc kubenswrapper[5184]: done Mar 12 16:52:21 crc kubenswrapper[5184]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tmp-dir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k47hx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-ggxxl_openshift-dns(c1239377-fc5d-40f2-b262-0b9c9448a3cf): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 12 16:52:21 crc kubenswrapper[5184]: > logger="UnhandledError" Mar 12 16:52:21 crc kubenswrapper[5184]: E0312 16:52:21.744212 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" Mar 12 16:52:21 crc kubenswrapper[5184]: E0312 16:52:21.744803 
5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-ggxxl" podUID="c1239377-fc5d-40f2-b262-0b9c9448a3cf" Mar 12 16:52:21 crc kubenswrapper[5184]: E0312 16:52:21.744917 5184 kuberuntime_manager.go:1358] "Unhandled Error" err=< Mar 12 16:52:21 crc kubenswrapper[5184]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 12 16:52:21 crc kubenswrapper[5184]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 12 16:52:21 crc kubenswrapper[5184]: ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{
Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-djfvr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-99gtj_openshift-multus(542903c2-fc88-4085-979a-db3766958392): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 12 16:52:21 crc kubenswrapper[5184]: > logger="UnhandledError" Mar 12 16:52:21 crc kubenswrapper[5184]: E0312 16:52:21.745860 5184 kuberuntime_manager.go:1358] "Unhandled Error" err="init container 
&Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-trmc6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-ckfz2_openshift-multus(766663a7-2c04-43da-a76f-dfacc5b1583a): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 12 16:52:21 crc kubenswrapper[5184]: E0312 16:52:21.746340 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-99gtj" podUID="542903c2-fc88-4085-979a-db3766958392" Mar 12 16:52:21 crc kubenswrapper[5184]: E0312 16:52:21.747489 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-ckfz2" podUID="766663a7-2c04-43da-a76f-dfacc5b1583a" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.752758 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-fhkjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17b87002-b798-480a-8e17-83053d698239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwt8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-fhkjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.763774 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ckfz2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"766663a7-2c04-43da-a76f-dfacc5b1583a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6efa070ceb93cc5fc2e76eab6d9c96ac3c4f8812085d0b6eb6e3f513b5bac782\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3454e762466e22e2a893650b9781823558bc6fdfda2aa4188aff3cb819014c4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/etc/whereabouts/config\\\",\\\"
name\\\":\\\"whereabouts-flatfile-configmap\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ckfz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.773351 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxc4c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwc2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:49b34ce0d25eec7a6077f4bf21bf7d4e64e598d28785a20b9ee3594423b7de14\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwc2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxc4c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.776781 5184 kubelet_node_status.go:736] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.776844 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.776870 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.776903 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.776929 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:21Z","lastTransitionTime":"2026-03-12T16:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.802340 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bpj2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.813326 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff799ef2-41aa-4972-ae8f-6e29c01bbd76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://d25459743a8e939a6fc0c89681dd4be8f2dbe697494adb6502228b2569ba616f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":
{\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://d14cb7a12881751803c43b69b2ec33ce99548de0cb9d754e7de2f8fe301dabb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://a1d4060b0813ec05d1dff25751605cd6ce575df8a1a0788b3331780009447967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc
7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://1b42bc08c18390a5c946037634004c7dcb6cb14b92f56220260d8237aeedd629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPa
th\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.824986 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2beca127-92c3-4737-a680-69e0bf3936a5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://cb2f9a89582795adf9b1f2e114f29fbcca43ccc0ac07b56136f3c09a99af6c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\
\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://ca237a33c864a01251750a3a9498ffbf76c02d7c465fec64514b916084eb3a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://2bfb723f8c449cda9730d31e02d633c5bc26368677283970a7d7977e8b14823c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:01Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\"
:0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://ac72b67adf21d403862dd2fc6e1a23c70ce40f83d6700823e7517d3ac39a3313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac72b67adf21d403862dd2fc6e1a23c70ce40f83d6700823e7517d3ac39a3313\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:50:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.837345 5184 
status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"766f2ece-d155-473b-bc1e-ceca5d270675\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"},\\\"containerID\\\":\\\"cri-o://408cf1afe10e6c8bd0bdbe1cc632606b92ab152449ba7113c76692e36ac3f8e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-bundle-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://ba6beb26fb249f80a1c0e7a6faa3577c82429ab6acd0c17cd141a795b06adba4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:01Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://c3a3261bd304f727996d3de0ec8e9372c0f24ee323171fc078f86a529dc3ae51\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7
e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://c5c91e816d332f9146ddc05817c56c9d67c53b64a57f352bf2e9af1b2fdb1ba4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5c91e816d332f9146ddc05817c56c9d67c53b64a57f352bf2e9af1b2fdb1ba4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T16:51:55Z\\\",\\\"message\\\":\\\"vvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InOrderInformers\\\\\\\" enabled=true\\\\nW0312 16:51:55.515939 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 16:51:55.516181 1 builder.go:304] check-endpoints version v0.0.0-unknown-c3d9642-c3d9642\\\\nI0312 16:51:55.517475 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361147659/tls.crt::/tmp/serving-cert-1361147659/tls.key\\\\\\\"\\\\nI0312 16:51:55.968978 1 requestheader_controller.go:255] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 16:51:55.972281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 16:51:55.972308 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 16:51:55.972347 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 16:51:55.972358 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 16:51:55.979594 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0312 16:51:55.979642 1 genericapiserver.go:546] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0312 16:51:55.979645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 16:51:55.979674 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 16:51:55.979687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 16:51:55.979702 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 16:51:55.979708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 16:51:55.979715 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0312 16:51:55.981661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T16:51:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://965856e85472800697a7882409776407f3dcefaafd9ffc6d31ca6d51466d15f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:01Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\
":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://85ac5f92560a2e60d997c4973bd2fd54060e553b853a3288c29cc31c11cad328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85ac5f92560a2e60d997c4973bd2fd54060e553b853a3288c29cc31c11cad328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:50:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.846434 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-5jnd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dsgwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-5jnd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.854728 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fbdfe828b092b23e6d4480daf3e0216aada6debaf1ef1b314a0a31e73ebf13c4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-5ff7774fd9-nljh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.864076 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34177974-8d82-49d2-a763-391d0df3bbd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-7bdcf4f5bd-7fjxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.875141 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/multus-99gtj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"542903c2-fc88-4085-979a-db3766958392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djfvr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-99gtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.878798 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.878861 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.878886 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.878916 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.878941 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:21Z","lastTransitionTime":"2026-03-12T16:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.883023 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-dns/node-resolver-ggxxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1239377-fc5d-40f2-b262-0b9c9448a3cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k47hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ggxxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.889085 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnk2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c823004-cd7d-4cea-9cdb-b44a806264ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7jw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.896135 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-wqfhs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"417740d6-e9c9-4fa8-9811-c6704b5b5692\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wf2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wf2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-57b78d8988-wqfhs\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.905600 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-dgvkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc4541ce-7789-4670-bc75-5c2868e52ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-dgvkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.915472 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b45c859-3d05-4214-9bd3-2952546f5dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ljt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ljt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cp7pt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.964498 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f335ad31-84ab-4bea-b0f2-75eca434a55d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"300m\\\",\\\"memory\\\":\\\"600Mi\\\"},\\\"containerID\\\":\\\"cri-o://7901646b9904fb6a100644a0aacd978a71373a764eea536a29abd51530037c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"300m\\\",\\\"memory\\\":\\\"600Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:02Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"ui
d\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"40m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://850d2c82f02982ba13abc9b9365f5be589329d37001cd14054004a85c6d2e96a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"40m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:03Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://ad74805d602978f7f836e80d83b2ef81f7c4cbbc65155a2268057928abb2f906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"imageID\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:03Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://7696ac1bb100db5ea88c53ca29f38064d11d2600b968872de07c222ad6411720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:03Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://e7c8113246b3ae53a45615d0812c94a8ef10af18f75a421afdf0
afa3ecb09223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:02Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://372fea65170de54e0c9231f60e0ff1ead89a30cc6fc9d1ab1eec694591f285d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372fea65170de54e0c9231f60e
0ff1ead89a30cc6fc9d1ab1eec694591f285d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:50:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd-auto-backup\\\",\\\"name\\\":\\\"etcd-auto-backup-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://b51a266617c16a330f9314d4f763ccae3a4c157aecabbeec95199db504e6d95e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b51a266617c16a330f9314d4f763ccae3a4c157aecabbeec95199db504e6d95e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://b7d9173a97b6d597333bec66e74940ae9e9effd207401a61fc5c529983637156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d9173a97b6d597333bec66e74940ae9e9effd207401a61fc5c529983637156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:51:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:51:01Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.981346 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.981409 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.981423 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.981438 5184 kubelet_node_status.go:736] 
"Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.981449 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:21Z","lastTransitionTime":"2026-03-12T16:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 16:52:21 crc kubenswrapper[5184]: I0312 16:52:21.995782 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e576e89-2381-4f76-a33a-bcf82fa79b03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://a528e6a13bd818b7d3bb0ef864934913eb0b6b9e7573f8f7840799a03c87c0b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\
\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://19f28bf51e41bc5de4afc2b3209eb8a889c06546b1d5a2e0ceaed8c52ee8867a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19f28bf51e41bc5de4afc2b3209eb8a889c06546b1d5a2e0ceaed8c52ee8867a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:50:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\
\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.037509 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-dgvkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc4541ce-7789-4670-bc75-5c2868e52ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-dgvkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.056267 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7w75\" (UniqueName: \"kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75\") pod \"network-check-source-5bb8f5cd97-xdvz5\" (UID: \"f863fff9-286a-45fa-b8f0-8a86994b8440\") " pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.056472 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwt8b\" (UniqueName: \"kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b\") pod \"network-check-target-fhkjl\" (UID: \"17b87002-b798-480a-8e17-83053d698239\") " pod="openshift-network-diagnostics/network-check-target-fhkjl" 
Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.056527 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.056575 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 12 16:52:22 crc kubenswrapper[5184]: E0312 16:52:22.056594 5184 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 16:52:22 crc kubenswrapper[5184]: E0312 16:52:22.056659 5184 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 16:52:22 crc kubenswrapper[5184]: E0312 16:52:22.056686 5184 projected.go:194] Error preparing data for projected volume kube-api-access-l7w75 for pod openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 16:52:22 crc kubenswrapper[5184]: E0312 16:52:22.056686 5184 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 
16:52:22 crc kubenswrapper[5184]: E0312 16:52:22.056807 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75 podName:f863fff9-286a-45fa-b8f0-8a86994b8440 nodeName:}" failed. No retries permitted until 2026-03-12 16:52:24.056764825 +0000 UTC m=+86.598076204 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-l7w75" (UniqueName: "kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75") pod "network-check-source-5bb8f5cd97-xdvz5" (UID: "f863fff9-286a-45fa-b8f0-8a86994b8440") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 16:52:22 crc kubenswrapper[5184]: E0312 16:52:22.056856 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2026-03-12 16:52:24.056832437 +0000 UTC m=+86.598143816 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 16:52:22 crc kubenswrapper[5184]: E0312 16:52:22.056854 5184 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 16:52:22 crc kubenswrapper[5184]: E0312 16:52:22.056903 5184 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 16:52:22 crc kubenswrapper[5184]: E0312 16:52:22.056927 5184 projected.go:194] Error preparing data for projected volume kube-api-access-gwt8b for pod openshift-network-diagnostics/network-check-target-fhkjl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 16:52:22 crc kubenswrapper[5184]: E0312 16:52:22.056802 5184 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 16:52:22 crc kubenswrapper[5184]: E0312 16:52:22.056990 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b podName:17b87002-b798-480a-8e17-83053d698239 nodeName:}" failed. No retries permitted until 2026-03-12 16:52:24.056973111 +0000 UTC m=+86.598284490 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-gwt8b" (UniqueName: "kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b") pod "network-check-target-fhkjl" (UID: "17b87002-b798-480a-8e17-83053d698239") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 16:52:22 crc kubenswrapper[5184]: E0312 16:52:22.057049 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2026-03-12 16:52:24.057024263 +0000 UTC m=+86.598335642 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.077502 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b45c859-3d05-4214-9bd3-2952546f5dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ljt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ljt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cp7pt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.083997 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.084047 5184 kubelet_node_status.go:736] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.084061 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.084080 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.084091 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:22Z","lastTransitionTime":"2026-03-12T16:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.132863 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f335ad31-84ab-4bea-b0f2-75eca434a55d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"300m\\\",\\\"memory\\\":\\\"600Mi\\\"},\\\"containerID\\\":\\\"cri-o://7901646b9904fb6a100644a0aacd978a71373a764eea536a29abd51530037c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"300m\\\",\\\"memory\\\":\\\"600Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:02Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",
\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"40m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://850d2c82f02982ba13abc9b9365f5be589329d37001cd14054004a85c6d2e96a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"40m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:03Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://ad74805d602978f7f836e80d83b2ef81f7c4cbbc65155a2268057928abb2f906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:03Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://7696ac1bb100db5ea88c53ca29f38064d11d2600b968872de07c222ad6411720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:03Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://e7c8113246b3ae53a45615d0812c94a8ef10af18f75a421afdf0afa3ecb09223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a68
2480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:02Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://372fea65170de54e0c9231f60e0ff1ead89a30cc6fc9d1ab1eec694591f285d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372fea65170de54e0c9231f60e0ff1ead89a30cc6fc9d1ab1eec694591f285d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:50:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}
,\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd-auto-backup\\\",\\\"name\\\":\\\"etcd-auto-backup-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://b51a266617c16a330f9314d4f763ccae3a4c157aecabbeec95199db504e6d95e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b51a266617c16a330f9314d4f763ccae3a4c157aecabbeec95199db504e6d95e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://b7d9173a97b6d597333bec66e74940ae9e9effd207401a61fc5c529983637156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-
o://b7d9173a97b6d597333bec66e74940ae9e9effd207401a61fc5c529983637156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:51:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:51:01Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.155717 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e576e89-2381-4f76-a33a-bcf82fa79b03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://a528e6a13bd818b7d3bb0ef864934913eb0b6b9e7573f8f7840799a03c87c0b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://19f28bf51e41bc5de4afc2b3209eb8a889c06546b1d5a2e0ceaed8c52ee8867a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19f28bf51e41bc5de4afc2b3209eb8a889c06546b1d5a2e0ceaed8c52ee8867a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:50:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.156990 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:22 crc kubenswrapper[5184]: E0312 16:52:22.157188 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:24.157148049 +0000 UTC m=+86.698459418 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.157272 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df-metrics-certs\") pod \"network-metrics-daemon-vxc4c\" (UID: \"024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df\") " pod="openshift-multus/network-metrics-daemon-vxc4c" Mar 12 16:52:22 crc kubenswrapper[5184]: E0312 16:52:22.157480 5184 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 16:52:22 crc kubenswrapper[5184]: E0312 16:52:22.157578 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df-metrics-certs podName:024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df nodeName:}" failed. 
No retries permitted until 2026-03-12 16:52:24.157553681 +0000 UTC m=+86.698865060 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df-metrics-certs") pod "network-metrics-daemon-vxc4c" (UID: "024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.186485 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.186545 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.186564 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.186590 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.186607 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:22Z","lastTransitionTime":"2026-03-12T16:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.198048 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f863fff9-286a-45fa-b8f0-8a86994b8440\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7w75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-5bb8f5cd97-xdvz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.237235 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-fhkjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17b87002-b798-480a-8e17-83053d698239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwt8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-fhkjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.282052 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ckfz2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"766663a7-2c04-43da-a76f-dfacc5b1583a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6efa070ceb93cc5fc2e76eab6d9c96ac3c4f8812085d0b6eb6e3f513b5bac782\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3454e762466e22e2a893650b9781823558bc6fdfda2aa4188aff3cb819014c4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/etc/whereabouts/config\\\",\\\"name\\\":\\\"whereabouts-flatfile-configmap\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ckfz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.289205 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.289285 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.289312 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:22 
crc kubenswrapper[5184]: I0312 16:52:22.289341 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.289364 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:22Z","lastTransitionTime":"2026-03-12T16:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.317436 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxc4c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwc2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:49b34ce0d25eec7a6077f4bf21bf7d4e64e598d28785a20b9ee3594423b7de14\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwc2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxc4c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.361911 5184 status_manager.go:919] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bpj2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.391630 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.391673 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.391685 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.391703 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.391717 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:22Z","lastTransitionTime":"2026-03-12T16:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.397470 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff799ef2-41aa-4972-ae8f-6e29c01bbd76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://d25459743a8e939a6fc0c89681dd4be8f2dbe697494adb6502228b2569ba616f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\
":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://d14cb7a12881751803c43b69b2ec33ce99548de0cb9d754e7de2f8fe301dabb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://a1d4060b0813ec05d1dff25751605cd6ce575df8a1a0788b3331780009447967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde72610
9a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://1b42bc08c18390a5c946037634004c7dcb6cb14b92f56220260d8237aeedd629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.399641 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxc4c" Mar 12 16:52:22 crc kubenswrapper[5184]: E0312 16:52:22.399769 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxc4c" podUID="024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.400202 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Mar 12 16:52:22 crc kubenswrapper[5184]: E0312 16:52:22.400314 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" podUID="f863fff9-286a-45fa-b8f0-8a86994b8440" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.400455 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl" Mar 12 16:52:22 crc kubenswrapper[5184]: E0312 16:52:22.400550 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhkjl" podUID="17b87002-b798-480a-8e17-83053d698239" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.400794 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 12 16:52:22 crc kubenswrapper[5184]: E0312 16:52:22.401039 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" podUID="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.403556 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01080b46-74f1-4191-8755-5152a57b3b25" path="/var/lib/kubelet/pods/01080b46-74f1-4191-8755-5152a57b3b25/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.404765 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09cfa50b-4138-4585-a53e-64dd3ab73335" path="/var/lib/kubelet/pods/09cfa50b-4138-4585-a53e-64dd3ab73335/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.407452 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dd0fbac-8c0d-4228-8faa-abbeedabf7db" path="/var/lib/kubelet/pods/0dd0fbac-8c0d-4228-8faa-abbeedabf7db/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.409718 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0effdbcf-dd7d-404d-9d48-77536d665a5d" path="/var/lib/kubelet/pods/0effdbcf-dd7d-404d-9d48-77536d665a5d/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.413214 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="149b3c48-e17c-4a66-a835-d86dabf6ff13" path="/var/lib/kubelet/pods/149b3c48-e17c-4a66-a835-d86dabf6ff13/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.417029 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16bdd140-dce1-464c-ab47-dd5798d1d256" path="/var/lib/kubelet/pods/16bdd140-dce1-464c-ab47-dd5798d1d256/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.418553 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18f80adb-c1c3-49ba-8ee4-932c851d3897" path="/var/lib/kubelet/pods/18f80adb-c1c3-49ba-8ee4-932c851d3897/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.420900 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" path="/var/lib/kubelet/pods/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.421534 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2325ffef-9d5b-447f-b00e-3efc429acefe" path="/var/lib/kubelet/pods/2325ffef-9d5b-447f-b00e-3efc429acefe/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.422916 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="301e1965-1754-483d-b6cc-bfae7038bbca" path="/var/lib/kubelet/pods/301e1965-1754-483d-b6cc-bfae7038bbca/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.424335 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31fa8943-81cc-4750-a0b7-0fa9ab5af883" path="/var/lib/kubelet/pods/31fa8943-81cc-4750-a0b7-0fa9ab5af883/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.426242 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42a11a02-47e1-488f-b270-2679d3298b0e" path="/var/lib/kubelet/pods/42a11a02-47e1-488f-b270-2679d3298b0e/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.426819 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="567683bd-0efc-4f21-b076-e28559628404" path="/var/lib/kubelet/pods/567683bd-0efc-4f21-b076-e28559628404/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.429202 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="584e1f4a-8205-47d7-8efb-3afc6017c4c9" path="/var/lib/kubelet/pods/584e1f4a-8205-47d7-8efb-3afc6017c4c9/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.429718 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="593a3561-7760-45c5-8f91-5aaef7475d0f" path="/var/lib/kubelet/pods/593a3561-7760-45c5-8f91-5aaef7475d0f/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.430767 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="5ebfebf6-3ecd-458e-943f-bb25b52e2718" path="/var/lib/kubelet/pods/5ebfebf6-3ecd-458e-943f-bb25b52e2718/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.432447 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6077b63e-53a2-4f96-9d56-1ce0324e4913" path="/var/lib/kubelet/pods/6077b63e-53a2-4f96-9d56-1ce0324e4913/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.435690 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" path="/var/lib/kubelet/pods/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.435686 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2beca127-92c3-4737-a680-69e0bf3936a5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://cb2f9a89582795adf9b1f2e114f29fbcca43ccc0ac07b56136f3c09a99af6c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602
559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://ca237a33c864a01251750a3a9498ffbf76c02d7c465fec64514b916084eb3a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\
\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://2bfb723f8c449cda9730d31e02d633c5bc26368677283970a7d7977e8b14823c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:01Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://ac72b67adf21d403862dd2fc6e1a23c70ce40f83d6700823e7517d3ac39a3313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac72b67adf2
1d403862dd2fc6e1a23c70ce40f83d6700823e7517d3ac39a3313\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:50:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.438583 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6edfcf45-925b-4eff-b940-95b6fc0b85d4" path="/var/lib/kubelet/pods/6edfcf45-925b-4eff-b940-95b6fc0b85d4/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.440665 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ee8fbd3-1f81-4666-96da-5afc70819f1a" path="/var/lib/kubelet/pods/6ee8fbd3-1f81-4666-96da-5afc70819f1a/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.442618 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a" path="/var/lib/kubelet/pods/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.447249 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="736c54fe-349c-4bb9-870a-d1c1d1c03831" path="/var/lib/kubelet/pods/736c54fe-349c-4bb9-870a-d1c1d1c03831/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.450001 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7599e0b6-bddf-4def-b7f2-0b32206e8651" 
path="/var/lib/kubelet/pods/7599e0b6-bddf-4def-b7f2-0b32206e8651/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.452461 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7afa918d-be67-40a6-803c-d3b0ae99d815" path="/var/lib/kubelet/pods/7afa918d-be67-40a6-803c-d3b0ae99d815/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.455104 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7df94c10-441d-4386-93a6-6730fb7bcde0" path="/var/lib/kubelet/pods/7df94c10-441d-4386-93a6-6730fb7bcde0/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.456318 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fcc6409-8a0f-44c3-89e7-5aecd7610f8a" path="/var/lib/kubelet/pods/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.458107 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81e39f7b-62e4-4fc9-992a-6535ce127a02" path="/var/lib/kubelet/pods/81e39f7b-62e4-4fc9-992a-6535ce127a02/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.459204 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="869851b9-7ffb-4af0-b166-1d8aa40a5f80" path="/var/lib/kubelet/pods/869851b9-7ffb-4af0-b166-1d8aa40a5f80/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.461676 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff" path="/var/lib/kubelet/pods/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.462615 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92dfbade-90b6-4169-8c07-72cff7f2c82b" path="/var/lib/kubelet/pods/92dfbade-90b6-4169-8c07-72cff7f2c82b/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.464454 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94a6e063-3d1a-4d44-875d-185291448c31" 
path="/var/lib/kubelet/pods/94a6e063-3d1a-4d44-875d-185291448c31/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.465658 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f71a554-e414-4bc3-96d2-674060397afe" path="/var/lib/kubelet/pods/9f71a554-e414-4bc3-96d2-674060397afe/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.469054 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a208c9c2-333b-4b4a-be0d-bc32ec38a821" path="/var/lib/kubelet/pods/a208c9c2-333b-4b4a-be0d-bc32ec38a821/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.471206 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a52afe44-fb37-46ed-a1f8-bf39727a3cbe" path="/var/lib/kubelet/pods/a52afe44-fb37-46ed-a1f8-bf39727a3cbe/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.472355 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a555ff2e-0be6-46d5-897d-863bb92ae2b3" path="/var/lib/kubelet/pods/a555ff2e-0be6-46d5-897d-863bb92ae2b3/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.473275 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7a88189-c967-4640-879e-27665747f20c" path="/var/lib/kubelet/pods/a7a88189-c967-4640-879e-27665747f20c/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.474422 5184 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="af33e427-6803-48c2-a76a-dd9deb7cbf9a" path="/var/lib/kubelet/pods/af33e427-6803-48c2-a76a-dd9deb7cbf9a/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.474523 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af33e427-6803-48c2-a76a-dd9deb7cbf9a" path="/var/lib/kubelet/pods/af33e427-6803-48c2-a76a-dd9deb7cbf9a/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.478765 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="af41de71-79cf-4590-bbe9-9e8b848862cb" path="/var/lib/kubelet/pods/af41de71-79cf-4590-bbe9-9e8b848862cb/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.480675 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a" path="/var/lib/kubelet/pods/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.481511 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"766f2ece-d155-473b-bc1e-ceca5d270675\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"},\\\"containerID\\\":\\\"cri-o://408cf1afe10e6c8bd0bdbe1cc632606b92ab152449ba7113c76692e36ac3f8e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-bundle-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://ba6beb26fb249f80a1c0e7a6faa3577c82429ab6acd0c17cd141a795b06adba4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/open
shift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:01Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://c3a3261bd304f727996d3de0ec8e9372c0f24ee323171fc078f86a529dc3ae51\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://c5c91e816d
332f9146ddc05817c56c9d67c53b64a57f352bf2e9af1b2fdb1ba4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5c91e816d332f9146ddc05817c56c9d67c53b64a57f352bf2e9af1b2fdb1ba4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T16:51:55Z\\\",\\\"message\\\":\\\"vvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InOrderInformers\\\\\\\" enabled=true\\\\nW0312 16:51:55.515939 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 16:51:55.516181 1 builder.go:304] check-endpoints version v0.0.0-unknown-c3d9642-c3d9642\\\\nI0312 16:51:55.517475 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361147659/tls.crt::/tmp/serving-cert-1361147659/tls.key\\\\\\\"\\\\nI0312 16:51:55.968978 1 requestheader_controller.go:255] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 16:51:55.972281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 16:51:55.972308 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 16:51:55.972347 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 16:51:55.972358 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 16:51:55.979594 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0312 16:51:55.979642 1 genericapiserver.go:546] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0312 16:51:55.979645 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 16:51:55.979674 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 16:51:55.979687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 16:51:55.979702 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 16:51:55.979708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 16:51:55.979715 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0312 16:51:55.981661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T16:51:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://965856e85472800697a7882409776407f3dcefaafd9ffc6d31ca6d51466d15f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:01Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://85ac5f92560a2e60d997c4973bd2fd54060e553b853a3288c29cc31c11cad328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85ac5f92560a2e60d997c4973bd2fd54060e553b853a3288c29cc31c11cad328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:50:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.482405 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4750666-1362-4001-abd0-6f89964cc621" path="/var/lib/kubelet/pods/b4750666-1362-4001-abd0-6f89964cc621/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.483506 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b605f283-6f2e-42da-a838-54421690f7d0" path="/var/lib/kubelet/pods/b605f283-6f2e-42da-a838-54421690f7d0/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.484335 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c491984c-7d4b-44aa-8c1e-d7974424fa47" path="/var/lib/kubelet/pods/c491984c-7d4b-44aa-8c1e-d7974424fa47/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.485521 5184 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5f2bfad-70f6-4185-a3d9-81ce12720767" path="/var/lib/kubelet/pods/c5f2bfad-70f6-4185-a3d9-81ce12720767/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.486617 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc85e424-18b2-4924-920b-bd291a8c4b01" path="/var/lib/kubelet/pods/cc85e424-18b2-4924-920b-bd291a8c4b01/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.487082 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce090a97-9ab6-4c40-a719-64ff2acd9778" path="/var/lib/kubelet/pods/ce090a97-9ab6-4c40-a719-64ff2acd9778/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.488073 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d19cb085-0c5b-4810-b654-ce7923221d90" path="/var/lib/kubelet/pods/d19cb085-0c5b-4810-b654-ce7923221d90/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.489971 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d45be74c-0d98-4d18-90e4-f7ef1b6daaf7" path="/var/lib/kubelet/pods/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.491271 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d565531a-ff86-4608-9d19-767de01ac31b" path="/var/lib/kubelet/pods/d565531a-ff86-4608-9d19-767de01ac31b/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.492396 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7e8f42f-dc0e-424b-bb56-5ec849834888" path="/var/lib/kubelet/pods/d7e8f42f-dc0e-424b-bb56-5ec849834888/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.493345 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" path="/var/lib/kubelet/pods/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.493707 5184 
kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.493752 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.493763 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.493780 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.493789 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:22Z","lastTransitionTime":"2026-03-12T16:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.494653 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e093be35-bb62-4843-b2e8-094545761610" path="/var/lib/kubelet/pods/e093be35-bb62-4843-b2e8-094545761610/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.495500 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1d2a42d-af1d-4054-9618-ab545e0ed8b7" path="/var/lib/kubelet/pods/e1d2a42d-af1d-4054-9618-ab545e0ed8b7/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.496777 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f559dfa3-3917-43a2-97f6-61ddfda10e93" path="/var/lib/kubelet/pods/f559dfa3-3917-43a2-97f6-61ddfda10e93/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.498855 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f65c0ac1-8bca-454d-a2e6-e35cb418beac" path="/var/lib/kubelet/pods/f65c0ac1-8bca-454d-a2e6-e35cb418beac/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.499911 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4" path="/var/lib/kubelet/pods/f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.507137 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7e2c886-118e-43bb-bef1-c78134de392b" path="/var/lib/kubelet/pods/f7e2c886-118e-43bb-bef1-c78134de392b/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.510278 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc8db2c7-859d-47b3-a900-2bd0c0b2973b" path="/var/lib/kubelet/pods/fc8db2c7-859d-47b3-a900-2bd0c0b2973b/volumes" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.517799 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-5jnd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dsgwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-5jnd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.554216 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fbdfe828b092b23e6d4480daf3e0216aada6debaf1ef1b314a0a31e73ebf13c4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-5ff7774fd9-nljh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.596573 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.596646 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.596668 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.596696 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.596716 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:22Z","lastTransitionTime":"2026-03-12T16:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.600456 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34177974-8d82-49d2-a763-391d0df3bbd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-7bdcf4f5bd-7fjxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.638323 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/multus-99gtj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"542903c2-fc88-4085-979a-db3766958392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djfvr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-99gtj\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.675902 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-dns/node-resolver-ggxxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1239377-fc5d-40f2-b262-0b9c9448a3cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k47hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ggxxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.699141 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.699204 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.699222 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.699245 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.699264 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:22Z","lastTransitionTime":"2026-03-12T16:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.716257 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnk2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c823004-cd7d-4cea-9cdb-b44a806264ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7jw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.756621 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-wqfhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"417740d6-e9c9-4fa8-9811-c6704b5b5692\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wf2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wf2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-57b78d8988-wqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.801202 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.801257 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.801276 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.801300 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.801317 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:22Z","lastTransitionTime":"2026-03-12T16:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.903847 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.903902 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.903922 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.903956 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:22 crc kubenswrapper[5184]: I0312 16:52:22.903996 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:22Z","lastTransitionTime":"2026-03-12T16:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.006334 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.006499 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.006566 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.006589 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.006646 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:23Z","lastTransitionTime":"2026-03-12T16:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.108744 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.108806 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.108823 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.108846 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.108863 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:23Z","lastTransitionTime":"2026-03-12T16:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.211642 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.212171 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.212205 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.212239 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.212263 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:23Z","lastTransitionTime":"2026-03-12T16:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.320252 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.320323 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.320352 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.320414 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.320439 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:23Z","lastTransitionTime":"2026-03-12T16:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.422442 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.422533 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.422567 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.422596 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.422617 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:23Z","lastTransitionTime":"2026-03-12T16:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.524791 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.524896 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.524935 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.524956 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.524973 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:23Z","lastTransitionTime":"2026-03-12T16:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.627601 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.627697 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.627725 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.627752 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.627772 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:23Z","lastTransitionTime":"2026-03-12T16:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.730807 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.730871 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.730891 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.730917 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.730935 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:23Z","lastTransitionTime":"2026-03-12T16:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.833561 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.833616 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.833631 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.833646 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.833656 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:23Z","lastTransitionTime":"2026-03-12T16:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.935834 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.935901 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.935920 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.935945 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:23 crc kubenswrapper[5184]: I0312 16:52:23.935963 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:23Z","lastTransitionTime":"2026-03-12T16:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.038709 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.038975 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.039110 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.039263 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.039424 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:24Z","lastTransitionTime":"2026-03-12T16:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.079741 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwt8b\" (UniqueName: \"kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b\") pod \"network-check-target-fhkjl\" (UID: \"17b87002-b798-480a-8e17-83053d698239\") " pod="openshift-network-diagnostics/network-check-target-fhkjl" Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.079797 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.079829 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.079860 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7w75\" (UniqueName: \"kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75\") pod \"network-check-source-5bb8f5cd97-xdvz5\" (UID: \"f863fff9-286a-45fa-b8f0-8a86994b8440\") " pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Mar 12 16:52:24 crc kubenswrapper[5184]: E0312 16:52:24.079951 5184 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 16:52:24 crc kubenswrapper[5184]: E0312 16:52:24.079965 5184 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 16:52:24 crc kubenswrapper[5184]: E0312 16:52:24.079981 5184 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 16:52:24 crc kubenswrapper[5184]: E0312 16:52:24.079992 5184 projected.go:194] Error preparing data for projected volume kube-api-access-l7w75 for pod openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 16:52:24 crc kubenswrapper[5184]: E0312 16:52:24.080017 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2026-03-12 16:52:28.080001226 +0000 UTC m=+90.621312565 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 16:52:24 crc kubenswrapper[5184]: E0312 16:52:24.080032 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75 podName:f863fff9-286a-45fa-b8f0-8a86994b8440 nodeName:}" failed. 
No retries permitted until 2026-03-12 16:52:28.080026977 +0000 UTC m=+90.621338316 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-l7w75" (UniqueName: "kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75") pod "network-check-source-5bb8f5cd97-xdvz5" (UID: "f863fff9-286a-45fa-b8f0-8a86994b8440") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 16:52:24 crc kubenswrapper[5184]: E0312 16:52:24.080053 5184 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 16:52:24 crc kubenswrapper[5184]: E0312 16:52:24.080184 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2026-03-12 16:52:28.080153191 +0000 UTC m=+90.621464580 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 12 16:52:24 crc kubenswrapper[5184]: E0312 16:52:24.080902 5184 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 12 16:52:24 crc kubenswrapper[5184]: E0312 16:52:24.081068 5184 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 12 16:52:24 crc kubenswrapper[5184]: E0312 16:52:24.081185 5184 projected.go:194] Error preparing data for projected volume kube-api-access-gwt8b for pod openshift-network-diagnostics/network-check-target-fhkjl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 12 16:52:24 crc kubenswrapper[5184]: E0312 16:52:24.081405 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b podName:17b87002-b798-480a-8e17-83053d698239 nodeName:}" failed. No retries permitted until 2026-03-12 16:52:28.081349278 +0000 UTC m=+90.622660647 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-gwt8b" (UniqueName: "kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b") pod "network-check-target-fhkjl" (UID: "17b87002-b798-480a-8e17-83053d698239") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.141678 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.141954 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.142078 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.142226 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.142358 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:24Z","lastTransitionTime":"2026-03-12T16:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.180998 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.181090 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df-metrics-certs\") pod \"network-metrics-daemon-vxc4c\" (UID: \"024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df\") " pod="openshift-multus/network-metrics-daemon-vxc4c"
Mar 12 16:52:24 crc kubenswrapper[5184]: E0312 16:52:24.181218 5184 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 12 16:52:24 crc kubenswrapper[5184]: E0312 16:52:24.181234 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:28.181182805 +0000 UTC m=+90.722494184 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:24 crc kubenswrapper[5184]: E0312 16:52:24.181290 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df-metrics-certs podName:024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df nodeName:}" failed. No retries permitted until 2026-03-12 16:52:28.181271697 +0000 UTC m=+90.722583106 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df-metrics-certs") pod "network-metrics-daemon-vxc4c" (UID: "024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.244828 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.244888 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.244909 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.244935 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.244953 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:24Z","lastTransitionTime":"2026-03-12T16:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.347659 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.348006 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.348209 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.348426 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.348599 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:24Z","lastTransitionTime":"2026-03-12T16:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.399702 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxc4c"
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.399702 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl"
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.399858 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5"
Mar 12 16:52:24 crc kubenswrapper[5184]: E0312 16:52:24.399887 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxc4c" podUID="024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df"
Mar 12 16:52:24 crc kubenswrapper[5184]: E0312 16:52:24.399977 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" podUID="f863fff9-286a-45fa-b8f0-8a86994b8440"
Mar 12 16:52:24 crc kubenswrapper[5184]: E0312 16:52:24.400179 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhkjl" podUID="17b87002-b798-480a-8e17-83053d698239"
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.400208 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6"
Mar 12 16:52:24 crc kubenswrapper[5184]: E0312 16:52:24.400445 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" podUID="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141"
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.451157 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.451498 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.451665 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.451812 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.451929 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:24Z","lastTransitionTime":"2026-03-12T16:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.554334 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.554641 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.554819 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.554979 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.555104 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:24Z","lastTransitionTime":"2026-03-12T16:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.657251 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.657566 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.657718 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.657861 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.657998 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:24Z","lastTransitionTime":"2026-03-12T16:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.760636 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.760681 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.760695 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.760710 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.760719 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:24Z","lastTransitionTime":"2026-03-12T16:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.863016 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.863108 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.863136 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.863211 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.863338 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:24Z","lastTransitionTime":"2026-03-12T16:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.966018 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.966087 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.966103 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.966122 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:24 crc kubenswrapper[5184]: I0312 16:52:24.966136 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:24Z","lastTransitionTime":"2026-03-12T16:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.069058 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.069104 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.069113 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.069144 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.069155 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:25Z","lastTransitionTime":"2026-03-12T16:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.171783 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.171845 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.171858 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.171878 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.171890 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:25Z","lastTransitionTime":"2026-03-12T16:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.275272 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.275332 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.275350 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.275399 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.275418 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:25Z","lastTransitionTime":"2026-03-12T16:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.377533 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.377593 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.377612 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.377636 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.377653 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:25Z","lastTransitionTime":"2026-03-12T16:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.480009 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.480131 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.480159 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.480184 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.480201 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:25Z","lastTransitionTime":"2026-03-12T16:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.582847 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.582913 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.582932 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.582959 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.582999 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:25Z","lastTransitionTime":"2026-03-12T16:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.686031 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.686099 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.686123 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.686148 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.686167 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:25Z","lastTransitionTime":"2026-03-12T16:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.788689 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.788738 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.788748 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.788765 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.788776 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:25Z","lastTransitionTime":"2026-03-12T16:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.891824 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.891886 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.891913 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.891936 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.891955 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:25Z","lastTransitionTime":"2026-03-12T16:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.994170 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.994238 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.994260 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.994290 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:25 crc kubenswrapper[5184]: I0312 16:52:25.994312 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:25Z","lastTransitionTime":"2026-03-12T16:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.097890 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.097997 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.098015 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.098077 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.098101 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:26Z","lastTransitionTime":"2026-03-12T16:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.200471 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.200516 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.200526 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.200541 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.200550 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:26Z","lastTransitionTime":"2026-03-12T16:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.302887 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.302953 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.302973 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.302997 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.303015 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:26Z","lastTransitionTime":"2026-03-12T16:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.399788 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxc4c"
Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.399826 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl"
Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.399998 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5"
Mar 12 16:52:26 crc kubenswrapper[5184]: E0312 16:52:26.400000 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxc4c" podUID="024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df"
Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.400014 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6"
Mar 12 16:52:26 crc kubenswrapper[5184]: E0312 16:52:26.400257 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhkjl" podUID="17b87002-b798-480a-8e17-83053d698239"
Mar 12 16:52:26 crc kubenswrapper[5184]: E0312 16:52:26.400426 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" podUID="f863fff9-286a-45fa-b8f0-8a86994b8440"
Mar 12 16:52:26 crc kubenswrapper[5184]: E0312 16:52:26.400553 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" podUID="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141"
Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.405203 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.405233 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.405245 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.405261 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.405273 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:26Z","lastTransitionTime":"2026-03-12T16:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.507941 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.508035 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.508064 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.508098 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.508122 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:26Z","lastTransitionTime":"2026-03-12T16:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.541261 5184 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.610326 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.610433 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.610460 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.610490 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.610512 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:26Z","lastTransitionTime":"2026-03-12T16:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.720368 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.720477 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.720504 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.720536 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.720559 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:26Z","lastTransitionTime":"2026-03-12T16:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.822622 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.822667 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.822677 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.822691 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.822701 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:26Z","lastTransitionTime":"2026-03-12T16:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.925520 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.925570 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.925584 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.925601 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:26 crc kubenswrapper[5184]: I0312 16:52:26.925616 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:26Z","lastTransitionTime":"2026-03-12T16:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.027555 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.027621 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.027633 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.027653 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.027668 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:27Z","lastTransitionTime":"2026-03-12T16:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.129794 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.129849 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.129863 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.129883 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.129897 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:27Z","lastTransitionTime":"2026-03-12T16:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.232913 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.233000 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.233029 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.233061 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.233087 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:27Z","lastTransitionTime":"2026-03-12T16:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.335691 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.335737 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.335745 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.335758 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.335768 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:27Z","lastTransitionTime":"2026-03-12T16:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.437982 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.438040 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.438052 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.438068 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.438082 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:27Z","lastTransitionTime":"2026-03-12T16:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.540299 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.540343 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.540357 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.540389 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.540403 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:27Z","lastTransitionTime":"2026-03-12T16:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.643030 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.643104 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.643130 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.643159 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.643183 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:27Z","lastTransitionTime":"2026-03-12T16:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.745555 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.745594 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.745606 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.745621 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.745633 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:27Z","lastTransitionTime":"2026-03-12T16:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.847989 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.849166 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.849414 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.849653 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.849864 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:27Z","lastTransitionTime":"2026-03-12T16:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.952244 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.952310 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.952329 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.952355 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:27 crc kubenswrapper[5184]: I0312 16:52:27.952422 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:27Z","lastTransitionTime":"2026-03-12T16:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.055215 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.055648 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.055788 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.055947 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.056107 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:28Z","lastTransitionTime":"2026-03-12T16:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.128225 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.128299 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7w75\" (UniqueName: \"kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75\") pod \"network-check-source-5bb8f5cd97-xdvz5\" (UID: \"f863fff9-286a-45fa-b8f0-8a86994b8440\") " pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Mar 12 16:52:28 crc kubenswrapper[5184]: E0312 16:52:28.128402 5184 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 16:52:28 crc kubenswrapper[5184]: E0312 16:52:28.128476 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2026-03-12 16:52:36.128457451 +0000 UTC m=+98.669768790 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.128511 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwt8b\" (UniqueName: \"kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b\") pod \"network-check-target-fhkjl\" (UID: \"17b87002-b798-480a-8e17-83053d698239\") " pod="openshift-network-diagnostics/network-check-target-fhkjl" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.128546 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 12 16:52:28 crc kubenswrapper[5184]: E0312 16:52:28.128601 5184 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 16:52:28 crc kubenswrapper[5184]: E0312 16:52:28.128621 5184 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 16:52:28 crc kubenswrapper[5184]: E0312 16:52:28.128638 5184 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 16:52:28 crc kubenswrapper[5184]: E0312 16:52:28.128656 5184 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2026-03-12 16:52:36.128646707 +0000 UTC m=+98.669958046 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 16:52:28 crc kubenswrapper[5184]: E0312 16:52:28.128660 5184 projected.go:194] Error preparing data for projected volume kube-api-access-l7w75 for pod openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 16:52:28 crc kubenswrapper[5184]: E0312 16:52:28.128655 5184 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 16:52:28 crc kubenswrapper[5184]: E0312 16:52:28.128738 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75 podName:f863fff9-286a-45fa-b8f0-8a86994b8440 nodeName:}" failed. No retries permitted until 2026-03-12 16:52:36.128712419 +0000 UTC m=+98.670023798 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-l7w75" (UniqueName: "kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75") pod "network-check-source-5bb8f5cd97-xdvz5" (UID: "f863fff9-286a-45fa-b8f0-8a86994b8440") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 16:52:28 crc kubenswrapper[5184]: E0312 16:52:28.128800 5184 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 16:52:28 crc kubenswrapper[5184]: E0312 16:52:28.128837 5184 projected.go:194] Error preparing data for projected volume kube-api-access-gwt8b for pod openshift-network-diagnostics/network-check-target-fhkjl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 16:52:28 crc kubenswrapper[5184]: E0312 16:52:28.128959 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b podName:17b87002-b798-480a-8e17-83053d698239 nodeName:}" failed. No retries permitted until 2026-03-12 16:52:36.128926695 +0000 UTC m=+98.670238064 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-gwt8b" (UniqueName: "kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b") pod "network-check-target-fhkjl" (UID: "17b87002-b798-480a-8e17-83053d698239") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.158021 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.158066 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.158078 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.158096 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.158110 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:28Z","lastTransitionTime":"2026-03-12T16:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.229211 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:28 crc kubenswrapper[5184]: E0312 16:52:28.229302 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:36.229279078 +0000 UTC m=+98.770590427 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.229344 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df-metrics-certs\") pod \"network-metrics-daemon-vxc4c\" (UID: \"024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df\") " pod="openshift-multus/network-metrics-daemon-vxc4c" Mar 12 16:52:28 crc kubenswrapper[5184]: E0312 16:52:28.229487 5184 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 16:52:28 crc kubenswrapper[5184]: E0312 16:52:28.229540 5184 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df-metrics-certs podName:024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df nodeName:}" failed. No retries permitted until 2026-03-12 16:52:36.229528646 +0000 UTC m=+98.770839985 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df-metrics-certs") pod "network-metrics-daemon-vxc4c" (UID: "024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.259981 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.260050 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.260072 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.260103 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.260126 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:28Z","lastTransitionTime":"2026-03-12T16:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.261131 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.261192 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.261216 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.261244 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.261265 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:28Z","lastTransitionTime":"2026-03-12T16:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:28 crc kubenswrapper[5184]: E0312 16:52:28.274408 5184 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400460Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861260Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a39150f0d7\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08
951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4e
fb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141
094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e8
3fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3ecf36dd-ef58-4e82-ba73-5f8a9b3572a1\\\",\\\"systemUUID\\\":\\\"50e372b3-53c9-4d5a-992b-af3198b0aed7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.278078 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.278445 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.278502 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.278536 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.278557 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:28Z","lastTransitionTime":"2026-03-12T16:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:28 crc kubenswrapper[5184]: E0312 16:52:28.290862 5184 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400460Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861260Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a39150f0d7\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08
951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4e
fb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141
094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e8
3fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3ecf36dd-ef58-4e82-ba73-5f8a9b3572a1\\\",\\\"systemUUID\\\":\\\"50e372b3-53c9-4d5a-992b-af3198b0aed7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.294022 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.294100 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.294128 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.294152 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.294170 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:28Z","lastTransitionTime":"2026-03-12T16:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:28 crc kubenswrapper[5184]: E0312 16:52:28.304482 5184 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400460Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861260Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a39150f0d7\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08
951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4e
fb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141
094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e8
3fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3ecf36dd-ef58-4e82-ba73-5f8a9b3572a1\\\",\\\"systemUUID\\\":\\\"50e372b3-53c9-4d5a-992b-af3198b0aed7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.308155 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.308210 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.308230 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.308252 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.308270 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:28Z","lastTransitionTime":"2026-03-12T16:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:28 crc kubenswrapper[5184]: E0312 16:52:28.320701 5184 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400460Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861260Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a39150f0d7\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08
951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4e
fb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141
094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e8
3fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3ecf36dd-ef58-4e82-ba73-5f8a9b3572a1\\\",\\\"systemUUID\\\":\\\"50e372b3-53c9-4d5a-992b-af3198b0aed7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.324751 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.324842 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.324865 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.324889 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.324907 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:28Z","lastTransitionTime":"2026-03-12T16:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:28 crc kubenswrapper[5184]: E0312 16:52:28.342229 5184 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400460Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861260Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a39150f0d7\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08
951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4e
fb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141
094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e8
3fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3ecf36dd-ef58-4e82-ba73-5f8a9b3572a1\\\",\\\"systemUUID\\\":\\\"50e372b3-53c9-4d5a-992b-af3198b0aed7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:28 crc kubenswrapper[5184]: E0312 16:52:28.342572 5184 kubelet_node_status.go:584] "Unable to update node status" err="update node status exceeds retry count" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.363016 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.363080 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.363099 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.363120 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.363137 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:28Z","lastTransitionTime":"2026-03-12T16:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.399085 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.399092 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxc4c" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.399314 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl" Mar 12 16:52:28 crc kubenswrapper[5184]: E0312 16:52:28.399314 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" podUID="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141" Mar 12 16:52:28 crc kubenswrapper[5184]: E0312 16:52:28.399527 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhkjl" podUID="17b87002-b798-480a-8e17-83053d698239" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.399626 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Mar 12 16:52:28 crc kubenswrapper[5184]: E0312 16:52:28.399782 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vxc4c" podUID="024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df" Mar 12 16:52:28 crc kubenswrapper[5184]: E0312 16:52:28.399931 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" podUID="f863fff9-286a-45fa-b8f0-8a86994b8440" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.415858 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ckfz2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"766663a7-2c04-43da-a76f-dfacc5b1583a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6efa070ceb93cc5fc2e76eab6d9c96ac3c4f8812085d0b6eb6e3f513b5bac782\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3454e762466e22e2a893650b9781823558bc6fdfda2aa4188aff3cb819014c4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/etc/whereabouts/config\\\",\\\"name\\\":\\\"whereabouts-flatfile-configmap\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ckfz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.424849 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxc4c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwc2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:49b34ce0d25eec7a6077f4bf21bf7d4e64e598d28785a20b9ee3594423b7de14\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwc2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxc4c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.440257 5184 status_manager.go:919] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bpj2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.454473 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff799ef2-41aa-4972-ae8f-6e29c01bbd76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://d25459743a8e939a6fc0c89681dd4be8f2dbe697494adb6502228b2569ba616f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\
\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://d14cb7a12881751803c43b69b2ec33ce99548de0cb9d754e7de2f8fe301dabb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://a1d4060b0813ec05d1dff25751605cd6ce575df8a1a0788b3331780009447967\\\",\\\"image\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://1b42bc08c18390a5c946037634004c7dcb6cb14b92f56220260d8237aeedd629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"l
inux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.464898 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.464947 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.464964 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.464986 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.465003 5184 setters.go:618] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:28Z","lastTransitionTime":"2026-03-12T16:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.470799 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2beca127-92c3-4737-a680-69e0bf3936a5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://cb2f9a89582795adf9b1f2e114f29fbcca43ccc0ac07b56136f3c09a99af6c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-schedu
ler\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://ca237a33c864a01251750a3a9498ffbf76c02d7c465fec64514b916084eb3a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://2bfb723f8c449cda9730d31e02d633c5bc26368677283970a7d7977e8b14823c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345
491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:01Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://ac72b67adf21d403862dd2fc6e1a23c70ce40f83d6700823e7517d3ac39a3313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac72b67adf21d403862dd2fc6e1a23c70ce40f83d6700823e7517d3ac39a3313\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:50:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"
gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.492750 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"766f2ece-d155-473b-bc1e-ceca5d270675\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"},\\\"containerID\\\":\\\"cri-o://408cf1afe10e6c8bd0bdbe1cc632606b92ab152449ba7113c76692e36ac3f8e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-bundle-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://ba6beb26fb249f80a1c0e7a6faa3577c82429ab6acd0c17cd141a795b06adba4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/open
shift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:01Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://c3a3261bd304f727996d3de0ec8e9372c0f24ee323171fc078f86a529dc3ae51\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://c5c91e816d
332f9146ddc05817c56c9d67c53b64a57f352bf2e9af1b2fdb1ba4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5c91e816d332f9146ddc05817c56c9d67c53b64a57f352bf2e9af1b2fdb1ba4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T16:51:55Z\\\",\\\"message\\\":\\\"vvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InOrderInformers\\\\\\\" enabled=true\\\\nW0312 16:51:55.515939 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 16:51:55.516181 1 builder.go:304] check-endpoints version v0.0.0-unknown-c3d9642-c3d9642\\\\nI0312 16:51:55.517475 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361147659/tls.crt::/tmp/serving-cert-1361147659/tls.key\\\\\\\"\\\\nI0312 16:51:55.968978 1 requestheader_controller.go:255] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 16:51:55.972281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 16:51:55.972308 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 16:51:55.972347 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 16:51:55.972358 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 16:51:55.979594 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0312 16:51:55.979642 1 genericapiserver.go:546] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0312 16:51:55.979645 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 16:51:55.979674 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 16:51:55.979687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 16:51:55.979702 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 16:51:55.979708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 16:51:55.979715 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0312 16:51:55.981661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T16:51:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://965856e85472800697a7882409776407f3dcefaafd9ffc6d31ca6d51466d15f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:01Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://85ac5f92560a2e60d997c4973bd2fd54060e553b853a3288c29cc31c11cad328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85ac5f92560a2e60d997c4973bd2fd54060e553b853a3288c29cc31c11cad328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:50:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.507164 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-5jnd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dsgwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-5jnd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.519588 5184 status_manager.go:919] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fbdfe828b092b23e6d4480daf3e0216aada6debaf1ef1b314a0a31e73ebf13c4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-5ff7774fd9-nljh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.530464 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34177974-8d82-49d2-a763-391d0df3bbd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-7bdcf4f5bd-7fjxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.545041 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/multus-99gtj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"542903c2-fc88-4085-979a-db3766958392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djfvr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-99gtj\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.558489 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-dns/node-resolver-ggxxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1239377-fc5d-40f2-b262-0b9c9448a3cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k47hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ggxxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.567543 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.567596 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.567607 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.567622 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.567631 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:28Z","lastTransitionTime":"2026-03-12T16:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.567872 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnk2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c823004-cd7d-4cea-9cdb-b44a806264ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7jw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.579058 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-wqfhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"417740d6-e9c9-4fa8-9811-c6704b5b5692\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wf2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wf2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-57b78d8988-wqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.588775 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-dgvkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc4541ce-7789-4670-bc75-5c2868e52ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-dgvkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.599067 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b45c859-3d05-4214-9bd3-2952546f5dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ljt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ljt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cp7pt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.626254 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f335ad31-84ab-4bea-b0f2-75eca434a55d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"300m\\\",\\\"memory\\\":\\\"600Mi\\\"},\\\"containerID\\\":\\\"cri-o://7901646b9904fb6a100644a0aacd978a71373a764eea536a29abd51530037c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"300m\\\",\\\"memory\\\":\\\"600Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:02Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",
\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"40m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://850d2c82f02982ba13abc9b9365f5be589329d37001cd14054004a85c6d2e96a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"40m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:03Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://ad74805d602978f7f836e80d83b2ef81f7c4cbbc65155a2268057928abb2f906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:03Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://7696ac1bb100db5ea88c53ca29f38064d11d2600b968872de07c222ad6411720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:03Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://e7c8113246b3ae53a45615d0812c94a8ef10af18f75a421afdf0afa3ecb09223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a68
2480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:02Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://372fea65170de54e0c9231f60e0ff1ead89a30cc6fc9d1ab1eec694591f285d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372fea65170de54e0c9231f60e0ff1ead89a30cc6fc9d1ab1eec694591f285d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:50:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}
,\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd-auto-backup\\\",\\\"name\\\":\\\"etcd-auto-backup-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://b51a266617c16a330f9314d4f763ccae3a4c157aecabbeec95199db504e6d95e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b51a266617c16a330f9314d4f763ccae3a4c157aecabbeec95199db504e6d95e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://b7d9173a97b6d597333bec66e74940ae9e9effd207401a61fc5c529983637156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-
o://b7d9173a97b6d597333bec66e74940ae9e9effd207401a61fc5c529983637156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:51:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:51:01Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.636784 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e576e89-2381-4f76-a33a-bcf82fa79b03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://a528e6a13bd818b7d3bb0ef864934913eb0b6b9e7573f8f7840799a03c87c0b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://19f28bf51e41bc5de4afc2b3209eb8a889c06546b1d5a2e0ceaed8c52ee8867a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19f28bf51e41bc5de4afc2b3209eb8a889c06546b1d5a2e0ceaed8c52ee8867a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:50:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.651571 5184 status_manager.go:919] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f863fff9-286a-45fa-b8f0-8a86994b8440\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7w75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-5bb8f5cd97-xdvz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.666792 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-fhkjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17b87002-b798-480a-8e17-83053d698239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwt8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-fhkjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.669052 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.669108 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.669127 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.669151 5184 kubelet_node_status.go:736] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.669172 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:28Z","lastTransitionTime":"2026-03-12T16:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.771467 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.771564 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.771620 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.771646 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.771663 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:28Z","lastTransitionTime":"2026-03-12T16:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.873903 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.873969 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.873990 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.874062 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.874100 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:28Z","lastTransitionTime":"2026-03-12T16:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.976869 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.976929 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.977827 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.977897 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:28 crc kubenswrapper[5184]: I0312 16:52:28.977932 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:28Z","lastTransitionTime":"2026-03-12T16:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.080241 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.080328 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.080352 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.080415 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.080485 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:29Z","lastTransitionTime":"2026-03-12T16:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.143064 5184 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.182516 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.182596 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.182624 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.182655 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.182679 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:29Z","lastTransitionTime":"2026-03-12T16:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.285683 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.285735 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.285752 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.285776 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.285794 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:29Z","lastTransitionTime":"2026-03-12T16:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.388904 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.389013 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.389037 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.389067 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.389088 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:29Z","lastTransitionTime":"2026-03-12T16:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.491849 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.491904 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.491931 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.491946 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.491954 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:29Z","lastTransitionTime":"2026-03-12T16:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.594284 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.594367 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.594432 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.594466 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.594490 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:29Z","lastTransitionTime":"2026-03-12T16:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.696669 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.696748 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.696774 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.696824 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.696851 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:29Z","lastTransitionTime":"2026-03-12T16:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.799370 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.799474 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.799503 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.799565 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.799592 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:29Z","lastTransitionTime":"2026-03-12T16:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.902582 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.902633 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.902647 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.902665 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:29 crc kubenswrapper[5184]: I0312 16:52:29.902683 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:29Z","lastTransitionTime":"2026-03-12T16:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.005279 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.005355 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.005403 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.005434 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.005490 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:30Z","lastTransitionTime":"2026-03-12T16:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.108160 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.108212 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.108224 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.108241 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.108254 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:30Z","lastTransitionTime":"2026-03-12T16:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.211072 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.211180 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.211201 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.211225 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.211242 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:30Z","lastTransitionTime":"2026-03-12T16:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.313291 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.313333 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.313344 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.313356 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.313366 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:30Z","lastTransitionTime":"2026-03-12T16:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.398926 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Mar 12 16:52:30 crc kubenswrapper[5184]: E0312 16:52:30.399145 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" podUID="f863fff9-286a-45fa-b8f0-8a86994b8440" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.399169 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxc4c" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.399207 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 12 16:52:30 crc kubenswrapper[5184]: E0312 16:52:30.399310 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxc4c" podUID="024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df" Mar 12 16:52:30 crc kubenswrapper[5184]: E0312 16:52:30.399442 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" podUID="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.399497 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl" Mar 12 16:52:30 crc kubenswrapper[5184]: E0312 16:52:30.399581 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhkjl" podUID="17b87002-b798-480a-8e17-83053d698239" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.415757 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.415834 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.415854 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.415880 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.415901 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:30Z","lastTransitionTime":"2026-03-12T16:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.517877 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.517945 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.517958 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.517975 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.517986 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:30Z","lastTransitionTime":"2026-03-12T16:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.620902 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.620973 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.620993 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.621029 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.621050 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:30Z","lastTransitionTime":"2026-03-12T16:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.723108 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.723154 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.723163 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.723179 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.723188 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:30Z","lastTransitionTime":"2026-03-12T16:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.825249 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.825315 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.825332 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.825356 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.825409 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:30Z","lastTransitionTime":"2026-03-12T16:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.928294 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.928366 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.928413 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.928439 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:30 crc kubenswrapper[5184]: I0312 16:52:30.928458 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:30Z","lastTransitionTime":"2026-03-12T16:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.031118 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.031196 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.031225 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.031256 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.031280 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:31Z","lastTransitionTime":"2026-03-12T16:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.133324 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.133397 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.133413 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.133432 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.133447 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:31Z","lastTransitionTime":"2026-03-12T16:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.235608 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.235677 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.235696 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.235721 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.235739 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:31Z","lastTransitionTime":"2026-03-12T16:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.339157 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.339212 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.339225 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.339241 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.339255 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:31Z","lastTransitionTime":"2026-03-12T16:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.441228 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.441302 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.441322 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.441347 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.441404 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:31Z","lastTransitionTime":"2026-03-12T16:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.522687 5184 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.544360 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.544450 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.544468 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.544491 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.544510 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:31Z","lastTransitionTime":"2026-03-12T16:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.646853 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.646937 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.646955 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.646982 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.647006 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:31Z","lastTransitionTime":"2026-03-12T16:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.749158 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.749209 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.749222 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.749239 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.749253 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:31Z","lastTransitionTime":"2026-03-12T16:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.851702 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.851759 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.851772 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.851790 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.851802 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:31Z","lastTransitionTime":"2026-03-12T16:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.954570 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.954621 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.954634 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.954654 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:31 crc kubenswrapper[5184]: I0312 16:52:31.954666 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:31Z","lastTransitionTime":"2026-03-12T16:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:32 crc kubenswrapper[5184]: I0312 16:52:32.057068 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:32 crc kubenswrapper[5184]: I0312 16:52:32.057156 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:32 crc kubenswrapper[5184]: I0312 16:52:32.057178 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:32 crc kubenswrapper[5184]: I0312 16:52:32.057216 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:32 crc kubenswrapper[5184]: I0312 16:52:32.057251 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:32Z","lastTransitionTime":"2026-03-12T16:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:32 crc kubenswrapper[5184]: I0312 16:52:32.160218 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:32 crc kubenswrapper[5184]: I0312 16:52:32.160261 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:32 crc kubenswrapper[5184]: I0312 16:52:32.160272 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:32 crc kubenswrapper[5184]: I0312 16:52:32.160286 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:32 crc kubenswrapper[5184]: I0312 16:52:32.160296 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:32Z","lastTransitionTime":"2026-03-12T16:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:32 crc kubenswrapper[5184]: I0312 16:52:32.263484 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:32 crc kubenswrapper[5184]: I0312 16:52:32.263588 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:32 crc kubenswrapper[5184]: I0312 16:52:32.263608 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:32 crc kubenswrapper[5184]: I0312 16:52:32.263633 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:32 crc kubenswrapper[5184]: I0312 16:52:32.263651 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:32Z","lastTransitionTime":"2026-03-12T16:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:32 crc kubenswrapper[5184]: I0312 16:52:32.366559 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:32 crc kubenswrapper[5184]: I0312 16:52:32.366691 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:32 crc kubenswrapper[5184]: I0312 16:52:32.366767 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:32 crc kubenswrapper[5184]: I0312 16:52:32.366797 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:32 crc kubenswrapper[5184]: I0312 16:52:32.366814 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:32Z","lastTransitionTime":"2026-03-12T16:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 16:52:32 crc kubenswrapper[5184]: I0312 16:52:32.399740 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 12 16:52:32 crc kubenswrapper[5184]: I0312 16:52:32.399746 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Mar 12 16:52:32 crc kubenswrapper[5184]: I0312 16:52:32.399977 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl" Mar 12 16:52:32 crc kubenswrapper[5184]: E0312 16:52:32.399976 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" podUID="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141" Mar 12 16:52:32 crc kubenswrapper[5184]: E0312 16:52:32.400177 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" podUID="f863fff9-286a-45fa-b8f0-8a86994b8440" Mar 12 16:52:32 crc kubenswrapper[5184]: E0312 16:52:32.400268 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhkjl" podUID="17b87002-b798-480a-8e17-83053d698239" Mar 12 16:52:32 crc kubenswrapper[5184]: I0312 16:52:32.400349 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxc4c" Mar 12 16:52:32 crc kubenswrapper[5184]: E0312 16:52:32.400527 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxc4c" podUID="024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df" Mar 12 16:52:33 crc kubenswrapper[5184]: I0312 16:52:33.211294 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:33 crc kubenswrapper[5184]: I0312 16:52:33.211367 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:33 crc kubenswrapper[5184]: I0312 16:52:33.211427 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:33 crc kubenswrapper[5184]: I0312 16:52:33.211462 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:33 crc kubenswrapper[5184]: I0312 16:52:33.211486 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:33Z","lastTransitionTime":"2026-03-12T16:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:33 crc kubenswrapper[5184]: I0312 16:52:33.313065 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:33 crc kubenswrapper[5184]: I0312 16:52:33.313159 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:33 crc kubenswrapper[5184]: I0312 16:52:33.313172 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:33 crc kubenswrapper[5184]: I0312 16:52:33.313190 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:33 crc kubenswrapper[5184]: I0312 16:52:33.313203 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:33Z","lastTransitionTime":"2026-03-12T16:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:33 crc kubenswrapper[5184]: I0312 16:52:33.415029 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:33 crc kubenswrapper[5184]: I0312 16:52:33.415114 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:33 crc kubenswrapper[5184]: I0312 16:52:33.415128 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:33 crc kubenswrapper[5184]: I0312 16:52:33.415146 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:33 crc kubenswrapper[5184]: I0312 16:52:33.415158 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:33Z","lastTransitionTime":"2026-03-12T16:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:33 crc kubenswrapper[5184]: I0312 16:52:33.520335 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:33 crc kubenswrapper[5184]: I0312 16:52:33.520400 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:33 crc kubenswrapper[5184]: I0312 16:52:33.520421 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:33 crc kubenswrapper[5184]: I0312 16:52:33.520439 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:33 crc kubenswrapper[5184]: I0312 16:52:33.520454 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:33Z","lastTransitionTime":"2026-03-12T16:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:33 crc kubenswrapper[5184]: I0312 16:52:33.622338 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:33 crc kubenswrapper[5184]: I0312 16:52:33.622794 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:33 crc kubenswrapper[5184]: I0312 16:52:33.622808 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:33 crc kubenswrapper[5184]: I0312 16:52:33.622825 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:33 crc kubenswrapper[5184]: I0312 16:52:33.622837 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:33Z","lastTransitionTime":"2026-03-12T16:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:33 crc kubenswrapper[5184]: I0312 16:52:33.725086 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:33 crc kubenswrapper[5184]: I0312 16:52:33.725129 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:33 crc kubenswrapper[5184]: I0312 16:52:33.725139 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:33 crc kubenswrapper[5184]: I0312 16:52:33.725153 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:33 crc kubenswrapper[5184]: I0312 16:52:33.725164 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:33Z","lastTransitionTime":"2026-03-12T16:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:33 crc kubenswrapper[5184]: I0312 16:52:33.827037 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:33 crc kubenswrapper[5184]: I0312 16:52:33.827092 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:33 crc kubenswrapper[5184]: I0312 16:52:33.827104 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:33 crc kubenswrapper[5184]: I0312 16:52:33.827121 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:33 crc kubenswrapper[5184]: I0312 16:52:33.827134 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:33Z","lastTransitionTime":"2026-03-12T16:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:33 crc kubenswrapper[5184]: I0312 16:52:33.929735 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:33 crc kubenswrapper[5184]: I0312 16:52:33.929786 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:33 crc kubenswrapper[5184]: I0312 16:52:33.929802 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:33 crc kubenswrapper[5184]: I0312 16:52:33.929818 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:33 crc kubenswrapper[5184]: I0312 16:52:33.929831 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:33Z","lastTransitionTime":"2026-03-12T16:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.032562 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.032611 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.032623 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.032642 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.032654 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:34Z","lastTransitionTime":"2026-03-12T16:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.135258 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.135298 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.135307 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.135319 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.135329 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:34Z","lastTransitionTime":"2026-03-12T16:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.216777 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ggxxl" event={"ID":"c1239377-fc5d-40f2-b262-0b9c9448a3cf","Type":"ContainerStarted","Data":"6fa3c07e34957365215009c4295fb02cb341addedfb591a62a97ffaa6d550076"}
Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.218668 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-wqfhs" event={"ID":"417740d6-e9c9-4fa8-9811-c6704b5b5692","Type":"ContainerStarted","Data":"26e6ed010b3a45f9e22b53a634f57881a78d7fcc479f407d448349cc1784ad10"}
Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.218725 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-wqfhs" event={"ID":"417740d6-e9c9-4fa8-9811-c6704b5b5692","Type":"ContainerStarted","Data":"0decdc2957500fde0ebec04f2af81d8693c23a402cc1e67269ddab6d50d45e90"}
Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.232526 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-dgvkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc4541ce-7789-4670-bc75-5c2868e52ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-dgvkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.241439 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.241612 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.241925 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.241989 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.242008 5184 setters.go:618] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:34Z","lastTransitionTime":"2026-03-12T16:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.243019 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b45c859-3d05-4214-9bd3-2952546f5dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ljt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ljt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-cp7pt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.260689 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f335ad31-84ab-4bea-b0f2-75eca434a55d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"300m\\\",\\\"memory\\\":\\\"600Mi\\\"},\\\"containerID\\\":\\\"cri-o://7901646b9904fb6a100644a0aacd978a71373a764eea536a29abd51530037c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"300m\\\",\\\"memory\\\":\\\"600Mi\\\"}},\\\"restartCo
unt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:02Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"40m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://850d2c82f02982ba13abc9b9365f5be589329d37001cd14054004a85c6d2e96a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"40m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:03Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://ad74805d602978f7f836e80d83b2ef81f7c4cbbc65155a2
268057928abb2f906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:03Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://7696ac1bb100db5ea88c53ca29f38064d11d2600b968872de07c222ad6411720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:03Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"na
me\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://e7c8113246b3ae53a45615d0812c94a8ef10af18f75a421afdf0afa3ecb09223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:02Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://372fea65170de54e0c9231f60e0ff1ead89a30cc6fc9d1ab1eec694591f285d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":
{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372fea65170de54e0c9231f60e0ff1ead89a30cc6fc9d1ab1eec694591f285d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:50:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd-auto-backup\\\",\\\"name\\\":\\\"etcd-auto-backup-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://b51a266617c16a330f9314d4f763ccae3a4c157aecabbeec95199db504e6d95e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b51a266617c16a330f9314d4f763ccae3a4c157aecabbeec95199db504e6d95e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://b7d9173a97b6d597333bec66e74940ae9e9effd207401a61fc5c529983637156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b62674
47cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d9173a97b6d597333bec66e74940ae9e9effd207401a61fc5c529983637156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:51:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:51:01Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.268492 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e576e89-2381-4f76-a33a-bcf82fa79b03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://a528e6a13bd818b7d3bb0ef864934913eb0b6b9e7573f8f7840799a03c87c0b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://19f28bf51e41bc5de4afc2b3209eb8a889c06546b1d5a2e0ceaed8c52ee8867a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19f28bf51e41bc5de4afc2b3209eb8a889c06546b1d5a2e0ceaed8c52ee8867a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:50:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.277006 5184 status_manager.go:919] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f863fff9-286a-45fa-b8f0-8a86994b8440\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7w75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-5bb8f5cd97-xdvz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.287662 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-fhkjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17b87002-b798-480a-8e17-83053d698239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwt8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-fhkjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.300354 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ckfz2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"766663a7-2c04-43da-a76f-dfacc5b1583a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6efa070ceb93cc5fc2e76eab6d9c96ac3c4f8812085d0b6eb6e3f513b5bac782\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3454e762466e22e2a893650b9781823558bc6fdfda2aa4188aff3cb819014c4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/etc/whereabouts/config\\\",\\\"name\\\":\\\"whereabouts-flatfile-configmap\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ckfz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.308520 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxc4c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwc2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:49b34ce0d25eec7a6077f4bf21bf7d4e64e598d28785a20b9ee3594423b7de14\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwc2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxc4c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.326999 5184 status_manager.go:919] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bpj2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.339170 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff799ef2-41aa-4972-ae8f-6e29c01bbd76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://d25459743a8e939a6fc0c89681dd4be8f2dbe697494adb6502228b2569ba616f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\
\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://d14cb7a12881751803c43b69b2ec33ce99548de0cb9d754e7de2f8fe301dabb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://a1d4060b0813ec05d1dff25751605cd6ce575df8a1a0788b3331780009447967\\\",\\\"image\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://1b42bc08c18390a5c946037634004c7dcb6cb14b92f56220260d8237aeedd629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"l
inux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.345945 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.346045 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.346058 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.346073 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.346101 5184 setters.go:618] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:34Z","lastTransitionTime":"2026-03-12T16:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.352620 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2beca127-92c3-4737-a680-69e0bf3936a5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://cb2f9a89582795adf9b1f2e114f29fbcca43ccc0ac07b56136f3c09a99af6c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-schedu
ler\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://ca237a33c864a01251750a3a9498ffbf76c02d7c465fec64514b916084eb3a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://2bfb723f8c449cda9730d31e02d633c5bc26368677283970a7d7977e8b14823c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345
491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:01Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://ac72b67adf21d403862dd2fc6e1a23c70ce40f83d6700823e7517d3ac39a3313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac72b67adf21d403862dd2fc6e1a23c70ce40f83d6700823e7517d3ac39a3313\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:50:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"
gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.370821 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"766f2ece-d155-473b-bc1e-ceca5d270675\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"},\\\"containerID\\\":\\\"cri-o://408cf1afe10e6c8bd0bdbe1cc632606b92ab152449ba7113c76692e36ac3f8e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-bundle-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://ba6beb26fb249f80a1c0e7a6faa3577c82429ab6acd0c17cd141a795b06adba4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/open
shift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:01Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://c3a3261bd304f727996d3de0ec8e9372c0f24ee323171fc078f86a529dc3ae51\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://c5c91e816d
332f9146ddc05817c56c9d67c53b64a57f352bf2e9af1b2fdb1ba4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5c91e816d332f9146ddc05817c56c9d67c53b64a57f352bf2e9af1b2fdb1ba4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T16:51:55Z\\\",\\\"message\\\":\\\"vvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InOrderInformers\\\\\\\" enabled=true\\\\nW0312 16:51:55.515939 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 16:51:55.516181 1 builder.go:304] check-endpoints version v0.0.0-unknown-c3d9642-c3d9642\\\\nI0312 16:51:55.517475 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361147659/tls.crt::/tmp/serving-cert-1361147659/tls.key\\\\\\\"\\\\nI0312 16:51:55.968978 1 requestheader_controller.go:255] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 16:51:55.972281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 16:51:55.972308 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 16:51:55.972347 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 16:51:55.972358 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 16:51:55.979594 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0312 16:51:55.979642 1 genericapiserver.go:546] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0312 16:51:55.979645 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 16:51:55.979674 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 16:51:55.979687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 16:51:55.979702 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 16:51:55.979708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 16:51:55.979715 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0312 16:51:55.981661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T16:51:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://965856e85472800697a7882409776407f3dcefaafd9ffc6d31ca6d51466d15f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:01Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://85ac5f92560a2e60d997c4973bd2fd54060e553b853a3288c29cc31c11cad328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85ac5f92560a2e60d997c4973bd2fd54060e553b853a3288c29cc31c11cad328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:50:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.382247 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-5jnd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dsgwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-5jnd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.396632 5184 status_manager.go:919] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fbdfe828b092b23e6d4480daf3e0216aada6debaf1ef1b314a0a31e73ebf13c4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-5ff7774fd9-nljh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.399009 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.399058 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.399638 5184 scope.go:117] "RemoveContainer" containerID="c5c91e816d332f9146ddc05817c56c9d67c53b64a57f352bf2e9af1b2fdb1ba4" Mar 12 16:52:34 crc kubenswrapper[5184]: E0312 16:52:34.399852 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.400009 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxc4c" Mar 12 16:52:34 crc kubenswrapper[5184]: E0312 16:52:34.400095 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxc4c" podUID="024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.400132 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl" Mar 12 16:52:34 crc kubenswrapper[5184]: E0312 16:52:34.400193 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-fhkjl" podUID="17b87002-b798-480a-8e17-83053d698239" Mar 12 16:52:34 crc kubenswrapper[5184]: E0312 16:52:34.400343 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" podUID="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141" Mar 12 16:52:34 crc kubenswrapper[5184]: E0312 16:52:34.400636 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" podUID="f863fff9-286a-45fa-b8f0-8a86994b8440" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.412472 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34177974-8d82-49d2-a763-391d0df3bbd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-7bdcf4f5bd-7fjxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.426299 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/multus-99gtj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"542903c2-fc88-4085-979a-db3766958392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djfvr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-99gtj\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.444833 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-dns/node-resolver-ggxxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1239377-fc5d-40f2-b262-0b9c9448a3cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"21Mi\\\"},\\\"containerID\\\":\\\"cri-o://6fa3c07e34957365215009c4295fb02cb341addedfb591a62a97ffaa6d550076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"21Mi\\\"}},\\\"restartCo
unt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:52:33Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k47hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ggxxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.451563 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.451599 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.451613 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.451634 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.451649 5184 setters.go:618] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:34Z","lastTransitionTime":"2026-03-12T16:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.457584 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnk2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c823004-cd7d-4cea-9cdb-b44a806264ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7jw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.469772 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-wqfhs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"417740d6-e9c9-4fa8-9811-c6704b5b5692\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wf2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wf2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-57b78d8988-wqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.481301 5184 status_manager.go:919] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b45c859-3d05-4214-9bd3-2952546f5dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ljt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ljt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cp7pt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.508966 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f335ad31-84ab-4bea-b0f2-75eca434a55d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"300m\\\",\\\"memory\\\":\\\"600Mi\\\"},\\\"containerID\\\":\\\"cri-o://7901646b9904fb6a100644a0aacd978a71373a764eea536a29abd51530037c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"300m\\\",\\\"memory\\\":\\\"600Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:02Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",
\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"40m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://850d2c82f02982ba13abc9b9365f5be589329d37001cd14054004a85c6d2e96a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"40m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:03Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://ad74805d602978f7f836e80d83b2ef81f7c4cbbc65155a2268057928abb2f906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:03Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://7696ac1bb100db5ea88c53ca29f38064d11d2600b968872de07c222ad6411720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:03Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://e7c8113246b3ae53a45615d0812c94a8ef10af18f75a421afdf0afa3ecb09223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a68
2480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:02Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://372fea65170de54e0c9231f60e0ff1ead89a30cc6fc9d1ab1eec694591f285d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372fea65170de54e0c9231f60e0ff1ead89a30cc6fc9d1ab1eec694591f285d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:50:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}
,\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd-auto-backup\\\",\\\"name\\\":\\\"etcd-auto-backup-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://b51a266617c16a330f9314d4f763ccae3a4c157aecabbeec95199db504e6d95e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b51a266617c16a330f9314d4f763ccae3a4c157aecabbeec95199db504e6d95e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://b7d9173a97b6d597333bec66e74940ae9e9effd207401a61fc5c529983637156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-
o://b7d9173a97b6d597333bec66e74940ae9e9effd207401a61fc5c529983637156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:51:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:51:01Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.516440 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e576e89-2381-4f76-a33a-bcf82fa79b03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://a528e6a13bd818b7d3bb0ef864934913eb0b6b9e7573f8f7840799a03c87c0b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://19f28bf51e41bc5de4afc2b3209eb8a889c06546b1d5a2e0ceaed8c52ee8867a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19f28bf51e41bc5de4afc2b3209eb8a889c06546b1d5a2e0ceaed8c52ee8867a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:50:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.527476 5184 status_manager.go:919] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f863fff9-286a-45fa-b8f0-8a86994b8440\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7w75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-5bb8f5cd97-xdvz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.537306 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-fhkjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17b87002-b798-480a-8e17-83053d698239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwt8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-fhkjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.548782 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ckfz2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"766663a7-2c04-43da-a76f-dfacc5b1583a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6efa070ceb93cc5fc2e76eab6d9c96ac3c4f8812085d0b6eb6e3f513b5bac782\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3454e762466e22e2a893650b9781823558bc6fdfda2aa4188aff3cb819014c4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/etc/whereabouts/config\\\",\\\"name\\\":\\\"whereabouts-flatfile-configmap\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ckfz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.553080 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.553121 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.553135 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:34 
crc kubenswrapper[5184]: I0312 16:52:34.553156 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.553169 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:34Z","lastTransitionTime":"2026-03-12T16:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.556268 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxc4c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwc2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:49b34ce0d25eec7a6077f4bf21bf7d4e64e598d28785a20b9ee3594423b7de14\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwc2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxc4c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.570149 5184 status_manager.go:919] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bpj2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.578546 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff799ef2-41aa-4972-ae8f-6e29c01bbd76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://d25459743a8e939a6fc0c89681dd4be8f2dbe697494adb6502228b2569ba616f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\
\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://d14cb7a12881751803c43b69b2ec33ce99548de0cb9d754e7de2f8fe301dabb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://a1d4060b0813ec05d1dff25751605cd6ce575df8a1a0788b3331780009447967\\\",\\\"image\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://1b42bc08c18390a5c946037634004c7dcb6cb14b92f56220260d8237aeedd629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"l
inux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.611956 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2beca127-92c3-4737-a680-69e0bf3936a5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://cb2f9a89582795adf9b1f2e114f29fbcca43ccc0ac07b56136f3c09a99af6c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\
\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://ca237a33c864a01251750a3a9498ffbf76c02d7c465fec64514b916084eb3a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://2bfb723f8c449cda9730d31e02d633c5bc26368677283970a7d7977e8b14823c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:01Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\"
:0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://ac72b67adf21d403862dd2fc6e1a23c70ce40f83d6700823e7517d3ac39a3313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac72b67adf21d403862dd2fc6e1a23c70ce40f83d6700823e7517d3ac39a3313\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:50:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.644541 5184 
status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"766f2ece-d155-473b-bc1e-ceca5d270675\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"},\\\"containerID\\\":\\\"cri-o://408cf1afe10e6c8bd0bdbe1cc632606b92ab152449ba7113c76692e36ac3f8e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-bundle-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://ba6beb26fb249f80a1c0e7a6faa3577c82429ab6acd0c17cd141a795b06adba4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:01Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://c3a3261bd304f727996d3de0ec8e9372c0f24ee323171fc078f86a529dc3ae51\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7
e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://c5c91e816d332f9146ddc05817c56c9d67c53b64a57f352bf2e9af1b2fdb1ba4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5c91e816d332f9146ddc05817c56c9d67c53b64a57f352bf2e9af1b2fdb1ba4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T16:51:55Z\\\",\\\"message\\\":\\\"vvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InOrderInformers\\\\\\\" enabled=true\\\\nW0312 16:51:55.515939 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 16:51:55.516181 1 builder.go:304] check-endpoints version v0.0.0-unknown-c3d9642-c3d9642\\\\nI0312 16:51:55.517475 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361147659/tls.crt::/tmp/serving-cert-1361147659/tls.key\\\\\\\"\\\\nI0312 16:51:55.968978 1 requestheader_controller.go:255] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 16:51:55.972281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 16:51:55.972308 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 16:51:55.972347 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 16:51:55.972358 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 16:51:55.979594 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0312 16:51:55.979642 1 genericapiserver.go:546] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0312 16:51:55.979645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 16:51:55.979674 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 16:51:55.979687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 16:51:55.979702 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 16:51:55.979708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 16:51:55.979715 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0312 16:51:55.981661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T16:51:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://965856e85472800697a7882409776407f3dcefaafd9ffc6d31ca6d51466d15f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:01Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\
":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://85ac5f92560a2e60d997c4973bd2fd54060e553b853a3288c29cc31c11cad328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85ac5f92560a2e60d997c4973bd2fd54060e553b853a3288c29cc31c11cad328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:50:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.653585 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-5jnd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dsgwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-5jnd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.655056 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.655089 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.655099 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.655112 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.655136 5184 setters.go:618] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:34Z","lastTransitionTime":"2026-03-12T16:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.661828 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fbdfe828b092b23e6d4480daf3e0216aada6debaf1ef1b314a0a31e73ebf13c4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-5ff7774fd9-nljh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.671472 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34177974-8d82-49d2-a763-391d0df3bbd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-7bdcf4f5bd-7fjxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.679696 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/multus-99gtj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"542903c2-fc88-4085-979a-db3766958392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djfvr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-99gtj\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.686367 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-dns/node-resolver-ggxxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1239377-fc5d-40f2-b262-0b9c9448a3cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"21Mi\\\"},\\\"containerID\\\":\\\"cri-o://6fa3c07e34957365215009c4295fb02cb341addedfb591a62a97ffaa6d550076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"21Mi\\\"}},\\\"restartCo
unt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:52:33Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k47hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ggxxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.692872 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnk2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c823004-cd7d-4cea-9cdb-b44a806264ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7jw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.700652 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-wqfhs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"417740d6-e9c9-4fa8-9811-c6704b5b5692\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"20Mi\\\"},\\\"containerID\\\":\\\"cri-o://0decdc2957500fde0ebec04f2af81d8693c23a402cc1e67269ddab6d50d45e90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"20Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:52:33Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wf2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"300Mi\\\"},\\\"containerID\\\":\\\"cri-o://26e6ed010b3a45f9e22b53a634f57881a78d7fcc479f407d448349cc1784ad10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"300Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:52:33Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wf2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-57b78d8988-wqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 
16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.708892 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-dgvkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc4541ce-7789-4670-bc75-5c2868e52ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-dgvkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.757000 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.757036 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.757046 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.757059 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.757068 5184 setters.go:618] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:34Z","lastTransitionTime":"2026-03-12T16:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.859827 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.859869 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.859881 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.859905 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.859917 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:34Z","lastTransitionTime":"2026-03-12T16:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.962835 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.962879 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.962893 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.962909 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:34 crc kubenswrapper[5184]: I0312 16:52:34.962918 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:34Z","lastTransitionTime":"2026-03-12T16:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.065500 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.065543 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.065556 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.065573 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.065585 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:35Z","lastTransitionTime":"2026-03-12T16:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.168323 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.168426 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.168446 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.168472 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.168489 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:35Z","lastTransitionTime":"2026-03-12T16:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.223659 5184 generic.go:358] "Generic (PLEG): container finished" podID="766663a7-2c04-43da-a76f-dfacc5b1583a" containerID="e4a51f5c81c52e6ebac9714cf00223e92acc2c4808c05d751bcf2475c3abced8" exitCode=0 Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.223732 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ckfz2" event={"ID":"766663a7-2c04-43da-a76f-dfacc5b1583a","Type":"ContainerDied","Data":"e4a51f5c81c52e6ebac9714cf00223e92acc2c4808c05d751bcf2475c3abced8"} Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.225514 5184 generic.go:358] "Generic (PLEG): container finished" podID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" containerID="a19a0220ad87c86b0c2e0d6f93d7b2e4b08fc8eb5904bcd2909fd4891bda4551" exitCode=0 Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.225611 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" event={"ID":"a92c8326-e582-4692-8b35-c5d5dbc1ff6c","Type":"ContainerDied","Data":"a19a0220ad87c86b0c2e0d6f93d7b2e4b08fc8eb5904bcd2909fd4891bda4551"} Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.228634 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tnk2c" event={"ID":"9c823004-cd7d-4cea-9cdb-b44a806264ab","Type":"ContainerStarted","Data":"9563eb132af8f5414408d6752dc83e1c23365df4d88b94a4170577e484fdcc9f"} Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.235263 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f863fff9-286a-45fa-b8f0-8a86994b8440\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7w75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-5bb8f5cd97-xdvz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.243809 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-fhkjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17b87002-b798-480a-8e17-83053d698239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwt8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-fhkjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.263181 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ckfz2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"766663a7-2c04-43da-a76f-dfacc5b1583a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4a51f5c81c52e6ebac9714cf00223e92acc2c4808c05d751bcf2475c3abced8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"resources\\\":{},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4a51f5c81c52e6ebac9714cf00223e92acc2c4808c05d751bcf2475c3abced8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:52:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:52:34Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"
/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6efa070ceb93cc5fc2e76eab6d9c96ac3c4f8812085d0b6eb6e3f513b5bac782\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3454e762466e22e2a893650b9781823558bc6fdfda2aa4188aff3cb819014c4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/etc/whereabouts/config\\\",\\\"name\\\":\\\"whereabouts-flatfile-configmap\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ckfz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.271742 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.271790 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.271809 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.271832 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.271850 5184 setters.go:618] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:35Z","lastTransitionTime":"2026-03-12T16:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.275460 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxc4c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwc2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:49b34ce0d25eec7a6077f4bf21bf7d4e64e598d28785a20b9ee3594423b7de14\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwc2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxc4c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.291134 5184 status_manager.go:919] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bpj2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.303985 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff799ef2-41aa-4972-ae8f-6e29c01bbd76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://d25459743a8e939a6fc0c89681dd4be8f2dbe697494adb6502228b2569ba616f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\
\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://d14cb7a12881751803c43b69b2ec33ce99548de0cb9d754e7de2f8fe301dabb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://a1d4060b0813ec05d1dff25751605cd6ce575df8a1a0788b3331780009447967\\\",\\\"image\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://1b42bc08c18390a5c946037634004c7dcb6cb14b92f56220260d8237aeedd629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"l
inux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.313682 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2beca127-92c3-4737-a680-69e0bf3936a5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://cb2f9a89582795adf9b1f2e114f29fbcca43ccc0ac07b56136f3c09a99af6c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\
\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://ca237a33c864a01251750a3a9498ffbf76c02d7c465fec64514b916084eb3a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://2bfb723f8c449cda9730d31e02d633c5bc26368677283970a7d7977e8b14823c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:01Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\"
:0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://ac72b67adf21d403862dd2fc6e1a23c70ce40f83d6700823e7517d3ac39a3313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac72b67adf21d403862dd2fc6e1a23c70ce40f83d6700823e7517d3ac39a3313\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:50:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.332234 5184 
status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"766f2ece-d155-473b-bc1e-ceca5d270675\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"},\\\"containerID\\\":\\\"cri-o://408cf1afe10e6c8bd0bdbe1cc632606b92ab152449ba7113c76692e36ac3f8e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-bundle-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://ba6beb26fb249f80a1c0e7a6faa3577c82429ab6acd0c17cd141a795b06adba4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:01Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://c3a3261bd304f727996d3de0ec8e9372c0f24ee323171fc078f86a529dc3ae51\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7
e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://c5c91e816d332f9146ddc05817c56c9d67c53b64a57f352bf2e9af1b2fdb1ba4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5c91e816d332f9146ddc05817c56c9d67c53b64a57f352bf2e9af1b2fdb1ba4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T16:51:55Z\\\",\\\"message\\\":\\\"vvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InOrderInformers\\\\\\\" enabled=true\\\\nW0312 16:51:55.515939 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 16:51:55.516181 1 builder.go:304] check-endpoints version v0.0.0-unknown-c3d9642-c3d9642\\\\nI0312 16:51:55.517475 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361147659/tls.crt::/tmp/serving-cert-1361147659/tls.key\\\\\\\"\\\\nI0312 16:51:55.968978 1 requestheader_controller.go:255] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 16:51:55.972281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 16:51:55.972308 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 16:51:55.972347 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 16:51:55.972358 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 16:51:55.979594 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0312 16:51:55.979642 1 genericapiserver.go:546] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0312 16:51:55.979645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 16:51:55.979674 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 16:51:55.979687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 16:51:55.979702 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 16:51:55.979708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 16:51:55.979715 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0312 16:51:55.981661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T16:51:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://965856e85472800697a7882409776407f3dcefaafd9ffc6d31ca6d51466d15f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:01Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\
":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://85ac5f92560a2e60d997c4973bd2fd54060e553b853a3288c29cc31c11cad328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85ac5f92560a2e60d997c4973bd2fd54060e553b853a3288c29cc31c11cad328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:50:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.343855 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-5jnd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dsgwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-5jnd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.356087 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fbdfe828b092b23e6d4480daf3e0216aada6debaf1ef1b314a0a31e73ebf13c4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-5ff7774fd9-nljh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.367555 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34177974-8d82-49d2-a763-391d0df3bbd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-7bdcf4f5bd-7fjxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.376437 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.376491 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.376508 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.376529 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.376547 5184 setters.go:618] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:35Z","lastTransitionTime":"2026-03-12T16:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.382796 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/multus-99gtj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"542903c2-fc88-4085-979a-db3766958392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djfvr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-99gtj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.390228 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-dns/node-resolver-ggxxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1239377-fc5d-40f2-b262-0b9c9448a3cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"21Mi\\\"},\\\"containe
rID\\\":\\\"cri-o://6fa3c07e34957365215009c4295fb02cb341addedfb591a62a97ffaa6d550076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"21Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:52:33Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k47hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ggxxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.398664 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnk2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c823004-cd7d-4cea-9cdb-b44a806264ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7jw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.412274 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-wqfhs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"417740d6-e9c9-4fa8-9811-c6704b5b5692\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"20Mi\\\"},\\\"containerID\\\":\\\"cri-o://0decdc2957500fde0ebec04f2af81d8693c23a402cc1e67269ddab6d50d45e90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"20Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:52:33Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wf2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"300Mi\\\"},\\\"containerID\\\":\\\"cri-o://26e6ed010b3a45f9e22b53a634f57881a78d7fcc479f407d448349cc1784ad10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"300Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:52:33Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wf2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-57b78d8988-wqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 
16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.425271 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-dgvkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc4541ce-7789-4670-bc75-5c2868e52ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-dgvkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.435180 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b45c859-3d05-4214-9bd3-2952546f5dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ljt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ljt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cp7pt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.451244 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f335ad31-84ab-4bea-b0f2-75eca434a55d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"300m\\\",\\\"memory\\\":\\\"600Mi\\\"},\\\"containerID\\\":\\\"cri-o://7901646b9904fb6a100644a0aacd978a71373a764eea536a29abd51530037c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"300m\\\",\\\"memory\\\":\\\"600Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:02Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",
\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"40m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://850d2c82f02982ba13abc9b9365f5be589329d37001cd14054004a85c6d2e96a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"40m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:03Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://ad74805d602978f7f836e80d83b2ef81f7c4cbbc65155a2268057928abb2f906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:03Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://7696ac1bb100db5ea88c53ca29f38064d11d2600b968872de07c222ad6411720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:03Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://e7c8113246b3ae53a45615d0812c94a8ef10af18f75a421afdf0afa3ecb09223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a68
2480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:02Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://372fea65170de54e0c9231f60e0ff1ead89a30cc6fc9d1ab1eec694591f285d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372fea65170de54e0c9231f60e0ff1ead89a30cc6fc9d1ab1eec694591f285d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:50:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}
,\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd-auto-backup\\\",\\\"name\\\":\\\"etcd-auto-backup-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://b51a266617c16a330f9314d4f763ccae3a4c157aecabbeec95199db504e6d95e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b51a266617c16a330f9314d4f763ccae3a4c157aecabbeec95199db504e6d95e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://b7d9173a97b6d597333bec66e74940ae9e9effd207401a61fc5c529983637156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-
o://b7d9173a97b6d597333bec66e74940ae9e9effd207401a61fc5c529983637156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:51:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:51:01Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.461234 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e576e89-2381-4f76-a33a-bcf82fa79b03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://a528e6a13bd818b7d3bb0ef864934913eb0b6b9e7573f8f7840799a03c87c0b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://19f28bf51e41bc5de4afc2b3209eb8a889c06546b1d5a2e0ceaed8c52ee8867a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19f28bf51e41bc5de4afc2b3209eb8a889c06546b1d5a2e0ceaed8c52ee8867a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:50:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.470517 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34177974-8d82-49d2-a763-391d0df3bbd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-7bdcf4f5bd-7fjxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.480734 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/multus-99gtj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"542903c2-fc88-4085-979a-db3766958392\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-djfvr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-99gtj\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.488324 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.488621 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.488829 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.488966 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.488989 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:35Z","lastTransitionTime":"2026-03-12T16:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.489922 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-dns/node-resolver-ggxxl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1239377-fc5d-40f2-b262-0b9c9448a3cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"21Mi\\\"},\\\"containerID\\\":\\\"cri-o://6fa3c07e34957365215009c4295fb02cb341addedfb591a62a97ffaa6d550076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"21Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:52:33Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\
\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k47hx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ggxxl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.500782 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tnk2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c823004-cd7d-4cea-9cdb-b44a806264ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"10Mi\\\"},\\\"containerID\\\":\\\"cri-o://9563eb132af8f5414408d6752dc83e1c23365df4d88b94a4170577e484fdcc9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"10Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:52:34Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":1001}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7jw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tnk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.509668 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-wqfhs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"417740d6-e9c9-4fa8-9811-c6704b5b5692\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"20Mi\\\"},\\\"containerID\\\":\\\"cri
-o://0decdc2957500fde0ebec04f2af81d8693c23a402cc1e67269ddab6d50d45e90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"20Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:52:33Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wf2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"300Mi\\\"},\\\"containerID\\\":\\\"cri-o://26e6ed010b3a45f9e22b53a634f57881a78d7fcc479f407d448349cc1784ad10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"300Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:52:33Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volume
Mounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wf2p5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-57b78d8988-wqfhs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.519898 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-dgvkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc4541ce-7789-4670-bc75-5c2868e52ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-dgvkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.530759 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b45c859-3d05-4214-9bd3-2952546f5dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ljt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ljt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cp7pt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.547054 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f335ad31-84ab-4bea-b0f2-75eca434a55d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"300m\\\",\\\"memory\\\":\\\"600Mi\\\"},\\\"containerID\\\":\\\"cri-o://7901646b9904fb6a100644a0aacd978a71373a764eea536a29abd51530037c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"300m\\\",\\\"memory\\\":\\\"600Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:02Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",
\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"40m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://850d2c82f02982ba13abc9b9365f5be589329d37001cd14054004a85c6d2e96a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"40m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:03Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://ad74805d602978f7f836e80d83b2ef81f7c4cbbc65155a2268057928abb2f906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:03Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://7696ac1bb100db5ea88c53ca29f38064d11d2600b968872de07c222ad6411720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:03Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://e7c8113246b3ae53a45615d0812c94a8ef10af18f75a421afdf0afa3ecb09223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a68
2480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:02Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://372fea65170de54e0c9231f60e0ff1ead89a30cc6fc9d1ab1eec694591f285d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://372fea65170de54e0c9231f60e0ff1ead89a30cc6fc9d1ab1eec694591f285d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:50:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}
,\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd-auto-backup\\\",\\\"name\\\":\\\"etcd-auto-backup-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://b51a266617c16a330f9314d4f763ccae3a4c157aecabbeec95199db504e6d95e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b51a266617c16a330f9314d4f763ccae3a4c157aecabbeec95199db504e6d95e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://b7d9173a97b6d597333bec66e74940ae9e9effd207401a61fc5c529983637156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-
o://b7d9173a97b6d597333bec66e74940ae9e9effd207401a61fc5c529983637156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:51:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:51:01Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.555888 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e576e89-2381-4f76-a33a-bcf82fa79b03\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://a528e6a13bd818b7d3bb0ef864934913eb0b6b9e7573f8f7840799a03c87c0b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://19f28bf51e41bc5de4afc2b3209eb8a889c06546b1d5a2e0ceaed8c52ee8867a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19f28bf51e41bc5de4afc2b3209eb8a889c06546b1d5a2e0ceaed8c52ee8867a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:50:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.565340 5184 status_manager.go:919] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f863fff9-286a-45fa-b8f0-8a86994b8440\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7w75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-5bb8f5cd97-xdvz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.579448 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-fhkjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17b87002-b798-480a-8e17-83053d698239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwt8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-fhkjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.592653 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.592697 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.592709 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.592726 5184 kubelet_node_status.go:736] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.592737 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:35Z","lastTransitionTime":"2026-03-12T16:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.593417 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ckfz2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"766663a7-2c04-43da-a76f-dfacc5b1583a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4a51f5c81c52e6ebac9714cf00223e92acc2c4808c05d751bcf2475c3abced8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"resources\\\":{},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4a51f5c81c52e6ebac9714cf00223e92acc2c4808c05d751bcf2475c3abced8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:52:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:52:34Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\
\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6efa070ceb93cc5fc2e76eab6d9c96ac3c4f8812085d0b6eb6e3f513b5bac782\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3454e762466e22e2a893650b9781823558bc6fdfda2aa4188aff3cb819014c4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/etc/whereabouts/config\\\",\\\"name\\\":\\\"whereabouts-flatfile-configmap\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trmc6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ckfz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.601647 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vxc4c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwc2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:49b34ce0d25eec7a6077f4bf21bf7d4e64e598d28785a20b9ee3594423b7de14\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwc2f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vxc4c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.619529 5184 status_manager.go:919] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a19a0220ad87c86b0c2e0d6f93d7b2e4b08fc8eb5904bcd2909fd4891bda4551\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"resources\\\":{},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a19a0220ad87c86b0c2e0d6f93d7b2e4b08fc8eb5904bcd2909fd4891bda4551\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:52:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:52:34Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\
":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz4cp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:52:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-6bpj2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.633060 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff799ef2-41aa-4972-ae8f-6e29c01bbd76\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://d25459743a8e939a6fc0c89681dd4be8f2dbe697494adb6502228b2569ba616f\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://d14cb7a12881751803c43b69b2ec33ce99548de0cb9d754e7de2f8fe301dabb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"
volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://a1d4060b0813ec05d1dff25751605cd6ce575df8a1a0788b3331780009447967\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://1b42bc08c18390a5c946037634004c7dcb6cb14b92f56220260d8237aeedd629\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller
-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.642573 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2beca127-92c3-4737-a680-69e0bf3936a5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://cb2f9a89582795adf9b1f2e114f29fbcca43ccc0ac07b56136f3c09a99af6c2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\
\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://ca237a33c864a01251750a3a9498ffbf76c02d7c465fec64514b916084eb3a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://2bfb723f8c449cda9730d31e02d633c5bc26368677283970a7d7977e8b14823c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:01Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\"
:0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://ac72b67adf21d403862dd2fc6e1a23c70ce40f83d6700823e7517d3ac39a3313\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac72b67adf21d403862dd2fc6e1a23c70ce40f83d6700823e7517d3ac39a3313\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:50:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.656761 5184 
status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"766f2ece-d155-473b-bc1e-ceca5d270675\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:51:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T16:50:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"},\\\"containerID\\\":\\\"cri-o://408cf1afe10e6c8bd0bdbe1cc632606b92ab152449ba7113c76692e36ac3f8e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-bundle-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://ba6beb26fb249f80a1c0e7a6faa3577c82429ab6acd0c17cd141a795b06adba4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:01Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://c3a3261bd304f727996d3de0ec8e9372c0f24ee323171fc078f86a529dc3ae51\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7
e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:00Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://c5c91e816d332f9146ddc05817c56c9d67c53b64a57f352bf2e9af1b2fdb1ba4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5c91e816d332f9146ddc05817c56c9d67c53b64a57f352bf2e9af1b2fdb1ba4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T16:51:55Z\\\",\\\"message\\\":\\\"vvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InOrderInformers\\\\\\\" enabled=true\\\\nW0312 16:51:55.515939 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0312 16:51:55.516181 1 builder.go:304] check-endpoints version v0.0.0-unknown-c3d9642-c3d9642\\\\nI0312 16:51:55.517475 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1361147659/tls.crt::/tmp/serving-cert-1361147659/tls.key\\\\\\\"\\\\nI0312 16:51:55.968978 1 requestheader_controller.go:255] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0312 16:51:55.972281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0312 16:51:55.972308 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0312 16:51:55.972347 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0312 16:51:55.972358 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0312 16:51:55.979594 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0312 16:51:55.979642 1 genericapiserver.go:546] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0312 16:51:55.979645 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 16:51:55.979674 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0312 16:51:55.979687 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0312 16:51:55.979702 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0312 16:51:55.979708 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0312 16:51:55.979715 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0312 16:51:55.981661 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T16:51:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://965856e85472800697a7882409776407f3dcefaafd9ffc6d31ca6d51466d15f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T16:51:01Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\
":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://85ac5f92560a2e60d997c4973bd2fd54060e553b853a3288c29cc31c11cad328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85ac5f92560a2e60d997c4973bd2fd54060e553b853a3288c29cc31c11cad328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T16:50:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T16:50:59Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T16:50:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.668740 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-5jnd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dsgwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-5jnd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.681185 5184 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T16:52:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fbdfe828b092b23e6d4480daf3e0216aada6debaf1ef1b314a0a31e73ebf13c4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-5ff7774fd9-nljh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.694512 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.694559 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.694572 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.694589 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.694602 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:35Z","lastTransitionTime":"2026-03-12T16:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.796181 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.796212 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.796221 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.796234 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.796242 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:35Z","lastTransitionTime":"2026-03-12T16:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.898052 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.898095 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.898107 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.898123 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:35 crc kubenswrapper[5184]: I0312 16:52:35.898136 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:35Z","lastTransitionTime":"2026-03-12T16:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.000093 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.000139 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.000152 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.000167 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.000178 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:36Z","lastTransitionTime":"2026-03-12T16:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.101984 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.102020 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.102030 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.102043 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.102052 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:36Z","lastTransitionTime":"2026-03-12T16:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.154960 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwt8b\" (UniqueName: \"kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b\") pod \"network-check-target-fhkjl\" (UID: \"17b87002-b798-480a-8e17-83053d698239\") " pod="openshift-network-diagnostics/network-check-target-fhkjl" Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.155009 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.155043 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.155071 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7w75\" (UniqueName: \"kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75\") pod \"network-check-source-5bb8f5cd97-xdvz5\" (UID: \"f863fff9-286a-45fa-b8f0-8a86994b8440\") " pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Mar 12 16:52:36 crc kubenswrapper[5184]: E0312 16:52:36.155189 5184 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Mar 12 16:52:36 crc kubenswrapper[5184]: E0312 16:52:36.155205 5184 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 16:52:36 crc kubenswrapper[5184]: E0312 16:52:36.155216 5184 projected.go:194] Error preparing data for projected volume kube-api-access-l7w75 for pod openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 16:52:36 crc kubenswrapper[5184]: E0312 16:52:36.155274 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75 podName:f863fff9-286a-45fa-b8f0-8a86994b8440 nodeName:}" failed. No retries permitted until 2026-03-12 16:52:52.155256284 +0000 UTC m=+114.696567623 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-l7w75" (UniqueName: "kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75") pod "network-check-source-5bb8f5cd97-xdvz5" (UID: "f863fff9-286a-45fa-b8f0-8a86994b8440") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 16:52:36 crc kubenswrapper[5184]: E0312 16:52:36.155641 5184 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 16:52:36 crc kubenswrapper[5184]: E0312 16:52:36.155654 5184 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 16:52:36 crc kubenswrapper[5184]: E0312 16:52:36.155663 5184 projected.go:194] Error preparing data for projected volume kube-api-access-gwt8b for pod openshift-network-diagnostics/network-check-target-fhkjl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 16:52:36 crc kubenswrapper[5184]: E0312 16:52:36.155693 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b podName:17b87002-b798-480a-8e17-83053d698239 nodeName:}" failed. No retries permitted until 2026-03-12 16:52:52.155684678 +0000 UTC m=+114.696996017 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-gwt8b" (UniqueName: "kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b") pod "network-check-target-fhkjl" (UID: "17b87002-b798-480a-8e17-83053d698239") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 16:52:36 crc kubenswrapper[5184]: E0312 16:52:36.155740 5184 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 16:52:36 crc kubenswrapper[5184]: E0312 16:52:36.155768 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2026-03-12 16:52:52.15576003 +0000 UTC m=+114.697071369 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 16:52:36 crc kubenswrapper[5184]: E0312 16:52:36.155800 5184 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 16:52:36 crc kubenswrapper[5184]: E0312 16:52:36.155827 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. 
No retries permitted until 2026-03-12 16:52:52.155819872 +0000 UTC m=+114.697131211 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.203510 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.203550 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.203562 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.203579 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.203589 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:36Z","lastTransitionTime":"2026-03-12T16:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.234104 5184 generic.go:358] "Generic (PLEG): container finished" podID="766663a7-2c04-43da-a76f-dfacc5b1583a" containerID="fcf5ed66bc4e250fcc55f6e44255a54ab3b0100f6f8f8a8eaf4611761724e776" exitCode=0 Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.234170 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ckfz2" event={"ID":"766663a7-2c04-43da-a76f-dfacc5b1583a","Type":"ContainerDied","Data":"fcf5ed66bc4e250fcc55f6e44255a54ab3b0100f6f8f8a8eaf4611761724e776"} Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.238414 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-dgvkt" event={"ID":"fc4541ce-7789-4670-bc75-5c2868e52ce0","Type":"ContainerStarted","Data":"cae637125db7ffa43cd52d892127a8fcef033085d495846d8686aa02a734dc01"} Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.238445 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-dgvkt" event={"ID":"fc4541ce-7789-4670-bc75-5c2868e52ce0","Type":"ContainerStarted","Data":"95ed8f77eada4f83d193ee6da2ad58aec90fe917a95051ef1fa520dc25defd7c"} Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.242179 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" event={"ID":"a92c8326-e582-4692-8b35-c5d5dbc1ff6c","Type":"ContainerStarted","Data":"b5551614d52dd3e27aff5fc716fc02737aaff91dccc1b0d3189d3551acee14bd"} Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.242210 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" event={"ID":"a92c8326-e582-4692-8b35-c5d5dbc1ff6c","Type":"ContainerStarted","Data":"6434122f7c641858f5b536148b7b16ffbae6899c622187bdf0cf6950a6443cda"} Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.242222 5184 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" event={"ID":"a92c8326-e582-4692-8b35-c5d5dbc1ff6c","Type":"ContainerStarted","Data":"137805551ee1701e7b11914be119573edd67e16a5d17f27e8db1afd4193c704c"} Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.242235 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" event={"ID":"a92c8326-e582-4692-8b35-c5d5dbc1ff6c","Type":"ContainerStarted","Data":"0842b1614675208f999dbcbfd017fda8250915476eccfdae1fb82b967f386042"} Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.242246 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" event={"ID":"a92c8326-e582-4692-8b35-c5d5dbc1ff6c","Type":"ContainerStarted","Data":"7f9bcef2ca3e2408e97837aedca5bd6c0be5c1f90a4ef1c715600d6c0e5e4efe"} Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.242259 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" event={"ID":"a92c8326-e582-4692-8b35-c5d5dbc1ff6c","Type":"ContainerStarted","Data":"bd6d5062f62d0471be109f59bfefc90111d17b18d4af16fedb005bd7ae2e6e40"} Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.256212 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.256458 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df-metrics-certs\") pod \"network-metrics-daemon-vxc4c\" (UID: \"024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df\") " pod="openshift-multus/network-metrics-daemon-vxc4c" 
Mar 12 16:52:36 crc kubenswrapper[5184]: E0312 16:52:36.256953 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:52.256935169 +0000 UTC m=+114.798246508 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:36 crc kubenswrapper[5184]: E0312 16:52:36.258264 5184 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 16:52:36 crc kubenswrapper[5184]: E0312 16:52:36.258308 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df-metrics-certs podName:024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df nodeName:}" failed. No retries permitted until 2026-03-12 16:52:52.258300071 +0000 UTC m=+114.799611400 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df-metrics-certs") pod "network-metrics-daemon-vxc4c" (UID: "024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.295882 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=17.295864606 podStartE2EDuration="17.295864606s" podCreationTimestamp="2026-03-12 16:52:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:36.279455727 +0000 UTC m=+98.820767076" watchObservedRunningTime="2026-03-12 16:52:36.295864606 +0000 UTC m=+98.837175945"
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.297232 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=17.297224358 podStartE2EDuration="17.297224358s" podCreationTimestamp="2026-03-12 16:52:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:36.295826645 +0000 UTC m=+98.837137994" watchObservedRunningTime="2026-03-12 16:52:36.297224358 +0000 UTC m=+98.838535697"
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.308450 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.308532 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.308589 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.310423 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.310445 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:36Z","lastTransitionTime":"2026-03-12T16:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.399907 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5"
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.400000 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxc4c"
Mar 12 16:52:36 crc kubenswrapper[5184]: E0312 16:52:36.401030 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxc4c" podUID="024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df"
Mar 12 16:52:36 crc kubenswrapper[5184]: E0312 16:52:36.401085 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" podUID="f863fff9-286a-45fa-b8f0-8a86994b8440"
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.401155 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6"
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.401246 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl"
Mar 12 16:52:36 crc kubenswrapper[5184]: E0312 16:52:36.401537 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" podUID="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141"
Mar 12 16:52:36 crc kubenswrapper[5184]: E0312 16:52:36.401756 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhkjl" podUID="17b87002-b798-480a-8e17-83053d698239"
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.413920 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.413962 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.413975 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.413992 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.414003 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:36Z","lastTransitionTime":"2026-03-12T16:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.444906 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=17.444879519 podStartE2EDuration="17.444879519s" podCreationTimestamp="2026-03-12 16:52:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:36.418582533 +0000 UTC m=+98.959893902" watchObservedRunningTime="2026-03-12 16:52:36.444879519 +0000 UTC m=+98.986190868"
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.470607 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=17.470590106 podStartE2EDuration="17.470590106s" podCreationTimestamp="2026-03-12 16:52:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:36.446278432 +0000 UTC m=+98.987589791" watchObservedRunningTime="2026-03-12 16:52:36.470590106 +0000 UTC m=+99.011901445"
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.519845 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.519878 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.519886 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.519900 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.519911 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:36Z","lastTransitionTime":"2026-03-12T16:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.548042 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-ggxxl" podStartSLOduration=77.548021738 podStartE2EDuration="1m17.548021738s" podCreationTimestamp="2026-03-12 16:51:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:36.547655537 +0000 UTC m=+99.088966876" watchObservedRunningTime="2026-03-12 16:52:36.548021738 +0000 UTC m=+99.089333087"
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.559533 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-tnk2c" podStartSLOduration=77.559511735 podStartE2EDuration="1m17.559511735s" podCreationTimestamp="2026-03-12 16:51:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:36.559074151 +0000 UTC m=+99.100385500" watchObservedRunningTime="2026-03-12 16:52:36.559511735 +0000 UTC m=+99.100823074"
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.577254 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-wqfhs" podStartSLOduration=76.577237534 podStartE2EDuration="1m16.577237534s" podCreationTimestamp="2026-03-12 16:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:36.576820792 +0000 UTC m=+99.118132131" watchObservedRunningTime="2026-03-12 16:52:36.577237534 +0000 UTC m=+99.118548873"
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.622404 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.622451 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.622464 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.622478 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.622490 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:36Z","lastTransitionTime":"2026-03-12T16:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.724115 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.724184 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.724200 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.724223 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.724235 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:36Z","lastTransitionTime":"2026-03-12T16:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.827080 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.827125 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.827135 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.827149 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.827167 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:36Z","lastTransitionTime":"2026-03-12T16:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.929228 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.929273 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.929285 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.929299 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:36 crc kubenswrapper[5184]: I0312 16:52:36.929308 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:36Z","lastTransitionTime":"2026-03-12T16:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.031762 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.031828 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.031851 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.031876 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.031893 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:37Z","lastTransitionTime":"2026-03-12T16:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.134264 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.134325 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.134345 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.134369 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.134436 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:37Z","lastTransitionTime":"2026-03-12T16:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.236955 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.237042 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.237071 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.237103 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.237130 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:37Z","lastTransitionTime":"2026-03-12T16:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.262134 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" event={"ID":"7b45c859-3d05-4214-9bd3-2952546f5dea","Type":"ContainerStarted","Data":"6b37f3463afde9fdc0b39c6bf807ff77483b5be7785376c684d6805a8ba9b0d0"}
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.262262 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" event={"ID":"7b45c859-3d05-4214-9bd3-2952546f5dea","Type":"ContainerStarted","Data":"a794500127db524b745f6dfb40cb4c4c83a065628e7edf1a8c68e379958a7834"}
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.264597 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-99gtj" event={"ID":"542903c2-fc88-4085-979a-db3766958392","Type":"ContainerStarted","Data":"1c3df7e5ebfd17fac7029a70d11086adf8244115be119b9f83d90982ffede7fd"}
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.271764 5184 generic.go:358] "Generic (PLEG): container finished" podID="766663a7-2c04-43da-a76f-dfacc5b1583a" containerID="8266591d9e68b8349d15117d362a840ed2816270243f7412f57d8766ba101dc0" exitCode=0
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.271882 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ckfz2" event={"ID":"766663a7-2c04-43da-a76f-dfacc5b1583a","Type":"ContainerDied","Data":"8266591d9e68b8349d15117d362a840ed2816270243f7412f57d8766ba101dc0"}
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.279094 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" event={"ID":"34177974-8d82-49d2-a763-391d0df3bbd8","Type":"ContainerStarted","Data":"b9378b48361de191a33c368d1770cf79977f4f82666fcfb844ab884bb8aecb9c"}
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.331229 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podStartSLOduration=78.331198663 podStartE2EDuration="1m18.331198663s" podCreationTimestamp="2026-03-12 16:51:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:37.289473438 +0000 UTC m=+99.830784827" watchObservedRunningTime="2026-03-12 16:52:37.331198663 +0000 UTC m=+99.872510052"
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.343169 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.343238 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.343252 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.343270 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.343284 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:37Z","lastTransitionTime":"2026-03-12T16:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.374582 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-99gtj" podStartSLOduration=78.374565788 podStartE2EDuration="1m18.374565788s" podCreationTimestamp="2026-03-12 16:51:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:37.371977707 +0000 UTC m=+99.913289046" watchObservedRunningTime="2026-03-12 16:52:37.374565788 +0000 UTC m=+99.915877127"
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.445962 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.446000 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.446009 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.446022 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.446030 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:37Z","lastTransitionTime":"2026-03-12T16:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.548907 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.549205 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.549214 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.549228 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.549255 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:37Z","lastTransitionTime":"2026-03-12T16:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.651912 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.651967 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.651984 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.652002 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.652014 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:37Z","lastTransitionTime":"2026-03-12T16:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.754286 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.754643 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.754653 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.754668 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.754677 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:37Z","lastTransitionTime":"2026-03-12T16:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.856435 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.856494 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.856512 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.856537 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.856555 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:37Z","lastTransitionTime":"2026-03-12T16:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.958611 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.958655 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.958664 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.958676 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:37 crc kubenswrapper[5184]: I0312 16:52:37.958684 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:37Z","lastTransitionTime":"2026-03-12T16:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.060901 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.060954 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.060969 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.060990 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.061005 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:38Z","lastTransitionTime":"2026-03-12T16:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.162995 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.163208 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.163259 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.163278 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.163290 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:38Z","lastTransitionTime":"2026-03-12T16:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.265848 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.265881 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.265889 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.265903 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.265912 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:38Z","lastTransitionTime":"2026-03-12T16:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.286065 5184 generic.go:358] "Generic (PLEG): container finished" podID="766663a7-2c04-43da-a76f-dfacc5b1583a" containerID="a9945b86d7aaee967fd1b384abc8c3bb65442c59c3033a813e26a8e54eb9db1c" exitCode=0
Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.286193 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ckfz2" event={"ID":"766663a7-2c04-43da-a76f-dfacc5b1583a","Type":"ContainerDied","Data":"a9945b86d7aaee967fd1b384abc8c3bb65442c59c3033a813e26a8e54eb9db1c"}
Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.291344 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" event={"ID":"a92c8326-e582-4692-8b35-c5d5dbc1ff6c","Type":"ContainerStarted","Data":"1b21854960b562aaf97c4c6926185e03be4f730eeb15c267c9390b7742c5ab5f"}
Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.367932 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.367978 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.368018 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.368033 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.368041 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:38Z","lastTransitionTime":"2026-03-12T16:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.400109 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5"
Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.400260 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl"
Mar 12 16:52:38 crc kubenswrapper[5184]: E0312 16:52:38.400325 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" podUID="f863fff9-286a-45fa-b8f0-8a86994b8440"
Mar 12 16:52:38 crc kubenswrapper[5184]: E0312 16:52:38.400338 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhkjl" podUID="17b87002-b798-480a-8e17-83053d698239"
Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.400352 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxc4c"
Mar 12 16:52:38 crc kubenswrapper[5184]: E0312 16:52:38.400497 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxc4c" podUID="024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df"
Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.400652 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6"
Mar 12 16:52:38 crc kubenswrapper[5184]: E0312 16:52:38.400758 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" podUID="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141" Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.413203 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.413242 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.413255 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.413271 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.413283 5184 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T16:52:38Z","lastTransitionTime":"2026-03-12T16:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.451817 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-7c9b9cfd6-xvtww"] Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.458451 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-xvtww" Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.461170 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-version\"/\"openshift-service-ca.crt\"" Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.461193 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-version\"/\"default-dockercfg-hqpm5\"" Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.461595 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-version\"/\"cluster-version-operator-serving-cert\"" Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.461811 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-version\"/\"kube-root-ca.crt\"" Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.484514 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/81670abd-778d-4435-b024-de3a5ba7d4fb-etc-ssl-certs\") pod \"cluster-version-operator-7c9b9cfd6-xvtww\" (UID: \"81670abd-778d-4435-b024-de3a5ba7d4fb\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-xvtww" Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.484583 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81670abd-778d-4435-b024-de3a5ba7d4fb-serving-cert\") pod \"cluster-version-operator-7c9b9cfd6-xvtww\" (UID: \"81670abd-778d-4435-b024-de3a5ba7d4fb\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-xvtww" Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.484601 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/81670abd-778d-4435-b024-de3a5ba7d4fb-service-ca\") pod \"cluster-version-operator-7c9b9cfd6-xvtww\" (UID: \"81670abd-778d-4435-b024-de3a5ba7d4fb\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-xvtww" Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.484672 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/81670abd-778d-4435-b024-de3a5ba7d4fb-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7c9b9cfd6-xvtww\" (UID: \"81670abd-778d-4435-b024-de3a5ba7d4fb\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-xvtww" Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.484776 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/81670abd-778d-4435-b024-de3a5ba7d4fb-kube-api-access\") pod \"cluster-version-operator-7c9b9cfd6-xvtww\" (UID: \"81670abd-778d-4435-b024-de3a5ba7d4fb\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-xvtww" Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.586290 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/81670abd-778d-4435-b024-de3a5ba7d4fb-etc-ssl-certs\") pod \"cluster-version-operator-7c9b9cfd6-xvtww\" (UID: \"81670abd-778d-4435-b024-de3a5ba7d4fb\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-xvtww" Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.586531 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81670abd-778d-4435-b024-de3a5ba7d4fb-serving-cert\") pod \"cluster-version-operator-7c9b9cfd6-xvtww\" (UID: \"81670abd-778d-4435-b024-de3a5ba7d4fb\") " 
pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-xvtww" Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.586551 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/81670abd-778d-4435-b024-de3a5ba7d4fb-service-ca\") pod \"cluster-version-operator-7c9b9cfd6-xvtww\" (UID: \"81670abd-778d-4435-b024-de3a5ba7d4fb\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-xvtww" Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.586575 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/81670abd-778d-4435-b024-de3a5ba7d4fb-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7c9b9cfd6-xvtww\" (UID: \"81670abd-778d-4435-b024-de3a5ba7d4fb\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-xvtww" Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.586450 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/81670abd-778d-4435-b024-de3a5ba7d4fb-etc-ssl-certs\") pod \"cluster-version-operator-7c9b9cfd6-xvtww\" (UID: \"81670abd-778d-4435-b024-de3a5ba7d4fb\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-xvtww" Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.586622 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/81670abd-778d-4435-b024-de3a5ba7d4fb-kube-api-access\") pod \"cluster-version-operator-7c9b9cfd6-xvtww\" (UID: \"81670abd-778d-4435-b024-de3a5ba7d4fb\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-xvtww" Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.586739 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/81670abd-778d-4435-b024-de3a5ba7d4fb-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7c9b9cfd6-xvtww\" (UID: \"81670abd-778d-4435-b024-de3a5ba7d4fb\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-xvtww" Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.587510 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/81670abd-778d-4435-b024-de3a5ba7d4fb-service-ca\") pod \"cluster-version-operator-7c9b9cfd6-xvtww\" (UID: \"81670abd-778d-4435-b024-de3a5ba7d4fb\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-xvtww" Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.592167 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81670abd-778d-4435-b024-de3a5ba7d4fb-serving-cert\") pod \"cluster-version-operator-7c9b9cfd6-xvtww\" (UID: \"81670abd-778d-4435-b024-de3a5ba7d4fb\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-xvtww" Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.601273 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/81670abd-778d-4435-b024-de3a5ba7d4fb-kube-api-access\") pod \"cluster-version-operator-7c9b9cfd6-xvtww\" (UID: \"81670abd-778d-4435-b024-de3a5ba7d4fb\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-xvtww" Mar 12 16:52:38 crc kubenswrapper[5184]: I0312 16:52:38.777784 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-xvtww" Mar 12 16:52:38 crc kubenswrapper[5184]: W0312 16:52:38.798199 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81670abd_778d_4435_b024_de3a5ba7d4fb.slice/crio-f07a8c1330a158dd0b9794fd5a0eba7e706867d3b5dc8f039f3e0b59230f5705 WatchSource:0}: Error finding container f07a8c1330a158dd0b9794fd5a0eba7e706867d3b5dc8f039f3e0b59230f5705: Status 404 returned error can't find the container with id f07a8c1330a158dd0b9794fd5a0eba7e706867d3b5dc8f039f3e0b59230f5705 Mar 12 16:52:39 crc kubenswrapper[5184]: I0312 16:52:39.297139 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ckfz2" event={"ID":"766663a7-2c04-43da-a76f-dfacc5b1583a","Type":"ContainerStarted","Data":"0c823b5844390bdbea746306d053e896850bf1c864436ca2217a99b351c67f1d"} Mar 12 16:52:39 crc kubenswrapper[5184]: I0312 16:52:39.299448 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-xvtww" event={"ID":"81670abd-778d-4435-b024-de3a5ba7d4fb","Type":"ContainerStarted","Data":"f9131f7e03f048081d2ca9763cfab7fc2fb5ba0cc3c1ff317e3a65c826e52b73"} Mar 12 16:52:39 crc kubenswrapper[5184]: I0312 16:52:39.299480 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-xvtww" event={"ID":"81670abd-778d-4435-b024-de3a5ba7d4fb","Type":"ContainerStarted","Data":"f07a8c1330a158dd0b9794fd5a0eba7e706867d3b5dc8f039f3e0b59230f5705"} Mar 12 16:52:39 crc kubenswrapper[5184]: I0312 16:52:39.351286 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-xvtww" podStartSLOduration=80.351257055 podStartE2EDuration="1m20.351257055s" podCreationTimestamp="2026-03-12 16:51:19 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:39.351006338 +0000 UTC m=+101.892317697" watchObservedRunningTime="2026-03-12 16:52:39.351257055 +0000 UTC m=+101.892568404" Mar 12 16:52:39 crc kubenswrapper[5184]: I0312 16:52:39.366398 5184 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Mar 12 16:52:39 crc kubenswrapper[5184]: I0312 16:52:39.373517 5184 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Mar 12 16:52:40 crc kubenswrapper[5184]: I0312 16:52:40.306614 5184 generic.go:358] "Generic (PLEG): container finished" podID="766663a7-2c04-43da-a76f-dfacc5b1583a" containerID="0c823b5844390bdbea746306d053e896850bf1c864436ca2217a99b351c67f1d" exitCode=0 Mar 12 16:52:40 crc kubenswrapper[5184]: I0312 16:52:40.306713 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ckfz2" event={"ID":"766663a7-2c04-43da-a76f-dfacc5b1583a","Type":"ContainerDied","Data":"0c823b5844390bdbea746306d053e896850bf1c864436ca2217a99b351c67f1d"} Mar 12 16:52:40 crc kubenswrapper[5184]: I0312 16:52:40.308870 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5jnd7" event={"ID":"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc","Type":"ContainerStarted","Data":"273e38252c504f3c4fdcb290b0d6dba3694ee776dcae91a571426fe67b00e71d"} Mar 12 16:52:40 crc kubenswrapper[5184]: I0312 16:52:40.398754 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl" Mar 12 16:52:40 crc kubenswrapper[5184]: I0312 16:52:40.398792 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxc4c" Mar 12 16:52:40 crc kubenswrapper[5184]: E0312 16:52:40.398885 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhkjl" podUID="17b87002-b798-480a-8e17-83053d698239" Mar 12 16:52:40 crc kubenswrapper[5184]: I0312 16:52:40.398908 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 12 16:52:40 crc kubenswrapper[5184]: I0312 16:52:40.398753 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Mar 12 16:52:40 crc kubenswrapper[5184]: E0312 16:52:40.399029 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxc4c" podUID="024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df" Mar 12 16:52:40 crc kubenswrapper[5184]: E0312 16:52:40.399095 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" podUID="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141" Mar 12 16:52:40 crc kubenswrapper[5184]: E0312 16:52:40.399166 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" podUID="f863fff9-286a-45fa-b8f0-8a86994b8440" Mar 12 16:52:41 crc kubenswrapper[5184]: I0312 16:52:41.318512 5184 generic.go:358] "Generic (PLEG): container finished" podID="766663a7-2c04-43da-a76f-dfacc5b1583a" containerID="daee6356d7d347dd7420b821d6c6f8a7c7d0cb77089e528d222ae372d01285c6" exitCode=0 Mar 12 16:52:41 crc kubenswrapper[5184]: I0312 16:52:41.318579 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ckfz2" event={"ID":"766663a7-2c04-43da-a76f-dfacc5b1583a","Type":"ContainerDied","Data":"daee6356d7d347dd7420b821d6c6f8a7c7d0cb77089e528d222ae372d01285c6"} Mar 12 16:52:41 crc kubenswrapper[5184]: I0312 16:52:41.324780 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" event={"ID":"a92c8326-e582-4692-8b35-c5d5dbc1ff6c","Type":"ContainerStarted","Data":"22829e8a929348996fb7656dc6de7050f92b0f8399db8f10683c6d95cc1d8c42"} Mar 12 16:52:41 crc kubenswrapper[5184]: I0312 16:52:41.333346 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:41 crc kubenswrapper[5184]: I0312 16:52:41.333441 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:41 crc kubenswrapper[5184]: I0312 16:52:41.415334 5184 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:41 crc kubenswrapper[5184]: I0312 16:52:41.484467 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" podStartSLOduration=81.484437128 podStartE2EDuration="1m21.484437128s" podCreationTimestamp="2026-03-12 16:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:41.429884325 +0000 UTC m=+103.971195724" watchObservedRunningTime="2026-03-12 16:52:41.484437128 +0000 UTC m=+104.025748507" Mar 12 16:52:42 crc kubenswrapper[5184]: I0312 16:52:42.332325 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ckfz2" event={"ID":"766663a7-2c04-43da-a76f-dfacc5b1583a","Type":"ContainerStarted","Data":"a73db0ada7ea3a42c0495b4d42a685d44bc51eb9abcd96898f85e4f51601d322"} Mar 12 16:52:42 crc kubenswrapper[5184]: I0312 16:52:42.333754 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:42 crc kubenswrapper[5184]: I0312 16:52:42.375296 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:52:42 crc kubenswrapper[5184]: I0312 16:52:42.402339 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 12 16:52:42 crc kubenswrapper[5184]: E0312 16:52:42.402516 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" podUID="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141" Mar 12 16:52:42 crc kubenswrapper[5184]: I0312 16:52:42.402972 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Mar 12 16:52:42 crc kubenswrapper[5184]: E0312 16:52:42.403070 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" podUID="f863fff9-286a-45fa-b8f0-8a86994b8440" Mar 12 16:52:42 crc kubenswrapper[5184]: I0312 16:52:42.403196 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxc4c" Mar 12 16:52:42 crc kubenswrapper[5184]: E0312 16:52:42.403310 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxc4c" podUID="024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df" Mar 12 16:52:42 crc kubenswrapper[5184]: I0312 16:52:42.403417 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl" Mar 12 16:52:42 crc kubenswrapper[5184]: E0312 16:52:42.403582 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhkjl" podUID="17b87002-b798-480a-8e17-83053d698239" Mar 12 16:52:42 crc kubenswrapper[5184]: I0312 16:52:42.420465 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-ckfz2" podStartSLOduration=83.420437492 podStartE2EDuration="1m23.420437492s" podCreationTimestamp="2026-03-12 16:51:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:42.374086945 +0000 UTC m=+104.915398304" watchObservedRunningTime="2026-03-12 16:52:42.420437492 +0000 UTC m=+104.961748851" Mar 12 16:52:42 crc kubenswrapper[5184]: I0312 16:52:42.887743 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vxc4c"] Mar 12 16:52:43 crc kubenswrapper[5184]: I0312 16:52:43.334218 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxc4c" Mar 12 16:52:43 crc kubenswrapper[5184]: E0312 16:52:43.334320 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxc4c" podUID="024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df" Mar 12 16:52:44 crc kubenswrapper[5184]: I0312 16:52:44.399683 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Mar 12 16:52:44 crc kubenswrapper[5184]: E0312 16:52:44.399843 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" podUID="f863fff9-286a-45fa-b8f0-8a86994b8440" Mar 12 16:52:44 crc kubenswrapper[5184]: I0312 16:52:44.399973 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 12 16:52:44 crc kubenswrapper[5184]: I0312 16:52:44.400034 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl" Mar 12 16:52:44 crc kubenswrapper[5184]: E0312 16:52:44.400187 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" podUID="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141" Mar 12 16:52:44 crc kubenswrapper[5184]: I0312 16:52:44.400616 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxc4c" Mar 12 16:52:44 crc kubenswrapper[5184]: E0312 16:52:44.400731 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhkjl" podUID="17b87002-b798-480a-8e17-83053d698239" Mar 12 16:52:44 crc kubenswrapper[5184]: E0312 16:52:44.400820 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxc4c" podUID="024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df" Mar 12 16:52:46 crc kubenswrapper[5184]: I0312 16:52:46.398802 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Mar 12 16:52:46 crc kubenswrapper[5184]: I0312 16:52:46.398834 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 12 16:52:46 crc kubenswrapper[5184]: I0312 16:52:46.398846 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxc4c" Mar 12 16:52:46 crc kubenswrapper[5184]: I0312 16:52:46.398807 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl" Mar 12 16:52:46 crc kubenswrapper[5184]: E0312 16:52:46.398982 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" podUID="f863fff9-286a-45fa-b8f0-8a86994b8440" Mar 12 16:52:46 crc kubenswrapper[5184]: E0312 16:52:46.399074 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhkjl" podUID="17b87002-b798-480a-8e17-83053d698239" Mar 12 16:52:46 crc kubenswrapper[5184]: E0312 16:52:46.399371 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vxc4c" podUID="024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df" Mar 12 16:52:46 crc kubenswrapper[5184]: E0312 16:52:46.399499 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" podUID="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141" Mar 12 16:52:47 crc kubenswrapper[5184]: I0312 16:52:47.399949 5184 scope.go:117] "RemoveContainer" containerID="c5c91e816d332f9146ddc05817c56c9d67c53b64a57f352bf2e9af1b2fdb1ba4" Mar 12 16:52:47 crc kubenswrapper[5184]: I0312 16:52:47.868012 5184 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeReady" Mar 12 16:52:47 crc kubenswrapper[5184]: I0312 16:52:47.868537 5184 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Mar 12 16:52:47 crc kubenswrapper[5184]: I0312 16:52:47.919801 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-9ddfb9f55-j4gpx"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.204812 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-65b6cccf98-bbnrv"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.204967 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.210426 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"openshift-service-ca.crt\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.211187 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-755bb95488-d29hz"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.211529 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver\"/\"etcd-client\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.211883 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-65b6cccf98-bbnrv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.212015 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"etcd-serving-ca\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.211281 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver\"/\"encryption-config-1\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.212761 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"image-import-ca\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.213004 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"audit-1\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.211196 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"config\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.213256 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver\"/\"openshift-apiserver-sa-dockercfg-4zqgh\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.215870 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-776cdc94d6-x7z5d"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.216142 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"kube-root-ca.crt\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.216544 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver\"/\"serving-cert\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.216868 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-755bb95488-d29hz" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.222327 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager\"/\"client-ca\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.223640 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager\"/\"openshift-service-ca.crt\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.224468 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager\"/\"kube-root-ca.crt\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.226131 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-controller-manager\"/\"serving-cert\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.226191 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-api\"/\"machine-api-operator-images\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.226198 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-api\"/\"kube-root-ca.crt\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.226317 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-api\"/\"openshift-service-ca.crt\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.226441 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-api\"/\"machine-api-operator-tls\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.226978 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager\"/\"config\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.227095 5184 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-machine-api\"/\"machine-api-operator-dockercfg-6n5ln\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.227099 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-api\"/\"kube-rbac-proxy\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.228188 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-controller-manager\"/\"openshift-controller-manager-sa-dockercfg-djmfg\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.228782 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-54c688565-jqlh7"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.228950 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-x7z5d" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.233744 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-route-controller-manager\"/\"openshift-service-ca.crt\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.234831 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-route-controller-manager\"/\"route-controller-manager-sa-dockercfg-mmcpt\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.234984 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-route-controller-manager\"/\"config\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.235110 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-route-controller-manager\"/\"kube-root-ca.crt\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.235235 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-route-controller-manager\"/\"serving-cert\"" Mar 12 16:52:48 crc kubenswrapper[5184]: 
I0312 16:52:48.235365 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-route-controller-manager\"/\"client-ca\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.235412 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-8596bd845d-ndv6q"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.236252 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"trusted-ca-bundle\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.236623 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-54c688565-jqlh7" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.239794 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-7f5c659b84-b2bj4"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.240029 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-8596bd845d-ndv6q" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.244176 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-mpvq4"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.244327 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-7f5c659b84-b2bj4" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.245470 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-machine-approver\"/\"kube-rbac-proxy\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.245586 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-oauth-apiserver\"/\"trusted-ca-bundle\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.245855 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-machine-approver\"/\"openshift-service-ca.crt\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.245922 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-machine-approver\"/\"machine-approver-sa-dockercfg-wzhvk\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.245954 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-machine-approver\"/\"machine-approver-config\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.246077 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-machine-approver\"/\"machine-approver-tls\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.246088 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication-operator\"/\"service-ca-bundle\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.246121 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-oauth-apiserver\"/\"openshift-service-ca.crt\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.246159 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-oauth-apiserver\"/\"etcd-client\"" Mar 12 16:52:48 crc kubenswrapper[5184]: 
I0312 16:52:48.249864 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-66458b6674-qmxgv"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.250192 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-mpvq4" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.250623 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-oauth-apiserver\"/\"audit-1\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.252079 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication-operator\"/\"kube-root-ca.crt\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.252371 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-oauth-apiserver\"/\"etcd-serving-ca\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.252825 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-oauth-apiserver\"/\"oauth-apiserver-sa-dockercfg-qqw4z\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.253201 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-oauth-apiserver\"/\"encryption-config-1\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.254177 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication-operator\"/\"serving-cert\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.254669 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication-operator\"/\"authentication-operator-dockercfg-6tbpn\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.254795 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-oauth-apiserver\"/\"serving-cert\"" Mar 12 16:52:48 crc 
kubenswrapper[5184]: I0312 16:52:48.255050 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication-operator\"/\"openshift-service-ca.crt\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.255467 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-oauth-apiserver\"/\"kube-root-ca.crt\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.255901 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication-operator\"/\"authentication-operator-config\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.256959 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-machine-approver\"/\"kube-root-ca.crt\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.257094 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-799b87ffcd-rqzhf"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.257585 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.264315 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager\"/\"openshift-global-ca\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.265143 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver-operator\"/\"openshift-service-ca.crt\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.266273 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver-operator\"/\"openshift-apiserver-operator-dockercfg-6c46w\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.267633 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver-operator\"/\"openshift-apiserver-operator-config\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.268681 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver-operator\"/\"kube-root-ca.crt\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.268898 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"openshift-service-ca.crt\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.269134 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-user-idp-0-file-data\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.269327 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"audit\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.269345 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-user-template-error\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 
16:52:48.269432 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-service-ca\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.269624 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-router-certs\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.269630 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver-operator\"/\"openshift-apiserver-operator-serving-cert\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.269912 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-user-template-login\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.270686 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-747b44746d-t6987"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.270837 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-799b87ffcd-rqzhf" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.271899 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-user-template-provider-selection\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.273131 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-session\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.273528 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-serving-cert\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.273741 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-cliconfig\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.274361 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"oauth-openshift-dockercfg-d2bf2\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.274534 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"kube-root-ca.crt\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.277995 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns-operator\"/\"dns-operator-dockercfg-wbbsn\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.279961 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns-operator\"/\"kube-root-ca.crt\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.282609 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-86c45576b9-b6v2k"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.282840 5184 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-747b44746d-t6987" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.284913 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns-operator\"/\"metrics-tls\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.285066 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns-operator\"/\"openshift-service-ca.crt\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.286293 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication-operator\"/\"trusted-ca-bundle\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.286910 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66587d64c8-dlsx9"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.288621 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-mdwwj\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.288865 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.289471 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.292811 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-5dxkx"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.293860 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.296843 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-67c89758df-ssqr6"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.297239 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-trusted-ca-bundle\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.297855 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-b6v2k" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.298633 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17eed63d-a9fc-414e-9c70-347b51893cfa-config\") pod \"controller-manager-65b6cccf98-bbnrv\" (UID: \"17eed63d-a9fc-414e-9c70-347b51893cfa\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-bbnrv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.298836 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/20de7db3-2a1d-49b2-a756-3ef5b88fbfcc-auth-proxy-config\") pod \"machine-approver-54c688565-jqlh7\" (UID: \"20de7db3-2a1d-49b2-a756-3ef5b88fbfcc\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-jqlh7" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.299059 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/027a90bb-52c1-43ed-a43d-6f9755019c9b-config\") pod \"openshift-apiserver-operator-846cbfc458-mpvq4\" (UID: \"027a90bb-52c1-43ed-a43d-6f9755019c9b\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-mpvq4" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.299216 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/beb08b86-8593-4511-8bce-ea5f1d44f795-etcd-serving-ca\") pod \"apiserver-9ddfb9f55-j4gpx\" (UID: \"beb08b86-8593-4511-8bce-ea5f1d44f795\") " pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.299560 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/36cf2a72-16f1-4b2c-9d20-9d1ad0af2ce6-machine-api-operator-tls\") pod \"machine-api-operator-755bb95488-d29hz\" (UID: \"36cf2a72-16f1-4b2c-9d20-9d1ad0af2ce6\") " pod="openshift-machine-api/machine-api-operator-755bb95488-d29hz" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.299746 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/beb08b86-8593-4511-8bce-ea5f1d44f795-serving-cert\") pod \"apiserver-9ddfb9f55-j4gpx\" (UID: \"beb08b86-8593-4511-8bce-ea5f1d44f795\") " pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.299891 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.299903 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5db01dce-a574-4dbd-97a9-582f0f357bda-etcd-client\") pod \"apiserver-8596bd845d-ndv6q\" (UID: \"5db01dce-a574-4dbd-97a9-582f0f357bda\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-ndv6q" Mar 12 16:52:48 crc 
kubenswrapper[5184]: I0312 16:52:48.300209 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/beb08b86-8593-4511-8bce-ea5f1d44f795-trusted-ca-bundle\") pod \"apiserver-9ddfb9f55-j4gpx\" (UID: \"beb08b86-8593-4511-8bce-ea5f1d44f795\") " pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.300355 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea99c433-2166-47f6-8c55-0787f78ff608-serving-cert\") pod \"authentication-operator-7f5c659b84-b2bj4\" (UID: \"ea99c433-2166-47f6-8c55-0787f78ff608\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-b2bj4" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.300577 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdp9g\" (UniqueName: \"kubernetes.io/projected/beb08b86-8593-4511-8bce-ea5f1d44f795-kube-api-access-pdp9g\") pod \"apiserver-9ddfb9f55-j4gpx\" (UID: \"beb08b86-8593-4511-8bce-ea5f1d44f795\") " pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.300725 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsmgq\" (UniqueName: \"kubernetes.io/projected/5db01dce-a574-4dbd-97a9-582f0f357bda-kube-api-access-fsmgq\") pod \"apiserver-8596bd845d-ndv6q\" (UID: \"5db01dce-a574-4dbd-97a9-582f0f357bda\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-ndv6q" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.300890 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-67c89758df-ssqr6" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.301061 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1af1595b-1a79-438d-99a0-dd34b32cfcda-tmp-dir\") pod \"dns-operator-799b87ffcd-rqzhf\" (UID: \"1af1595b-1a79-438d-99a0-dd34b32cfcda\") " pod="openshift-dns-operator/dns-operator-799b87ffcd-rqzhf" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.301253 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-system-service-ca\") pod \"oauth-openshift-66458b6674-qmxgv\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.301537 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5db01dce-a574-4dbd-97a9-582f0f357bda-etcd-serving-ca\") pod \"apiserver-8596bd845d-ndv6q\" (UID: \"5db01dce-a574-4dbd-97a9-582f0f357bda\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-ndv6q" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.301762 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/beb08b86-8593-4511-8bce-ea5f1d44f795-node-pullsecrets\") pod \"apiserver-9ddfb9f55-j4gpx\" (UID: \"beb08b86-8593-4511-8bce-ea5f1d44f795\") " pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.301928 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/beb08b86-8593-4511-8bce-ea5f1d44f795-config\") pod \"apiserver-9ddfb9f55-j4gpx\" (UID: \"beb08b86-8593-4511-8bce-ea5f1d44f795\") " pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.302099 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqpvw\" (UniqueName: \"kubernetes.io/projected/20de7db3-2a1d-49b2-a756-3ef5b88fbfcc-kube-api-access-xqpvw\") pod \"machine-approver-54c688565-jqlh7\" (UID: \"20de7db3-2a1d-49b2-a756-3ef5b88fbfcc\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-jqlh7" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.302321 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a1d9df18-d5a1-447d-ad5a-fdef055a830a-tmp\") pod \"route-controller-manager-776cdc94d6-x7z5d\" (UID: \"a1d9df18-d5a1-447d-ad5a-fdef055a830a\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-x7z5d" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.302559 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-audit-dir\") pod \"oauth-openshift-66458b6674-qmxgv\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.302720 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/20de7db3-2a1d-49b2-a756-3ef5b88fbfcc-machine-approver-tls\") pod \"machine-approver-54c688565-jqlh7\" (UID: \"20de7db3-2a1d-49b2-a756-3ef5b88fbfcc\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-jqlh7" Mar 12 16:52:48 crc 
kubenswrapper[5184]: I0312 16:52:48.302858 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20de7db3-2a1d-49b2-a756-3ef5b88fbfcc-config\") pod \"machine-approver-54c688565-jqlh7\" (UID: \"20de7db3-2a1d-49b2-a756-3ef5b88fbfcc\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-jqlh7" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.302946 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/36cf2a72-16f1-4b2c-9d20-9d1ad0af2ce6-images\") pod \"machine-api-operator-755bb95488-d29hz\" (UID: \"36cf2a72-16f1-4b2c-9d20-9d1ad0af2ce6\") " pod="openshift-machine-api/machine-api-operator-755bb95488-d29hz" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.303056 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sbfx\" (UniqueName: \"kubernetes.io/projected/36cf2a72-16f1-4b2c-9d20-9d1ad0af2ce6-kube-api-access-4sbfx\") pod \"machine-api-operator-755bb95488-d29hz\" (UID: \"36cf2a72-16f1-4b2c-9d20-9d1ad0af2ce6\") " pod="openshift-machine-api/machine-api-operator-755bb95488-d29hz" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.303169 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrbv2\" (UniqueName: \"kubernetes.io/projected/a1d9df18-d5a1-447d-ad5a-fdef055a830a-kube-api-access-zrbv2\") pod \"route-controller-manager-776cdc94d6-x7z5d\" (UID: \"a1d9df18-d5a1-447d-ad5a-fdef055a830a\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-x7z5d" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.303260 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8s92\" (UniqueName: 
\"kubernetes.io/projected/ea99c433-2166-47f6-8c55-0787f78ff608-kube-api-access-h8s92\") pod \"authentication-operator-7f5c659b84-b2bj4\" (UID: \"ea99c433-2166-47f6-8c55-0787f78ff608\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-b2bj4" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.303355 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66458b6674-qmxgv\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.303471 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/beb08b86-8593-4511-8bce-ea5f1d44f795-image-import-ca\") pod \"apiserver-9ddfb9f55-j4gpx\" (UID: \"beb08b86-8593-4511-8bce-ea5f1d44f795\") " pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.303579 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhtqd\" (UniqueName: \"kubernetes.io/projected/1af1595b-1a79-438d-99a0-dd34b32cfcda-kube-api-access-vhtqd\") pod \"dns-operator-799b87ffcd-rqzhf\" (UID: \"1af1595b-1a79-438d-99a0-dd34b32cfcda\") " pod="openshift-dns-operator/dns-operator-799b87ffcd-rqzhf" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.303669 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-user-template-login\") pod \"oauth-openshift-66458b6674-qmxgv\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " 
pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.303771 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/beb08b86-8593-4511-8bce-ea5f1d44f795-audit-dir\") pod \"apiserver-9ddfb9f55-j4gpx\" (UID: \"beb08b86-8593-4511-8bce-ea5f1d44f795\") " pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.303856 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-system-router-certs\") pod \"oauth-openshift-66458b6674-qmxgv\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.303944 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5db01dce-a574-4dbd-97a9-582f0f357bda-encryption-config\") pod \"apiserver-8596bd845d-ndv6q\" (UID: \"5db01dce-a574-4dbd-97a9-582f0f357bda\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-ndv6q" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.304046 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/027a90bb-52c1-43ed-a43d-6f9755019c9b-serving-cert\") pod \"openshift-apiserver-operator-846cbfc458-mpvq4\" (UID: \"027a90bb-52c1-43ed-a43d-6f9755019c9b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-mpvq4" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.304136 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-66458b6674-qmxgv\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.304246 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/beb08b86-8593-4511-8bce-ea5f1d44f795-etcd-client\") pod \"apiserver-9ddfb9f55-j4gpx\" (UID: \"beb08b86-8593-4511-8bce-ea5f1d44f795\") " pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.304340 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17eed63d-a9fc-414e-9c70-347b51893cfa-serving-cert\") pod \"controller-manager-65b6cccf98-bbnrv\" (UID: \"17eed63d-a9fc-414e-9c70-347b51893cfa\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-bbnrv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.304460 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a1d9df18-d5a1-447d-ad5a-fdef055a830a-client-ca\") pod \"route-controller-manager-776cdc94d6-x7z5d\" (UID: \"a1d9df18-d5a1-447d-ad5a-fdef055a830a\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-x7z5d" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.304568 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea99c433-2166-47f6-8c55-0787f78ff608-config\") pod \"authentication-operator-7f5c659b84-b2bj4\" (UID: \"ea99c433-2166-47f6-8c55-0787f78ff608\") " 
pod="openshift-authentication-operator/authentication-operator-7f5c659b84-b2bj4" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.304660 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-audit-policies\") pod \"oauth-openshift-66458b6674-qmxgv\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.304750 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66458b6674-qmxgv\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.304855 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw42l\" (UniqueName: \"kubernetes.io/projected/027a90bb-52c1-43ed-a43d-6f9755019c9b-kube-api-access-qw42l\") pod \"openshift-apiserver-operator-846cbfc458-mpvq4\" (UID: \"027a90bb-52c1-43ed-a43d-6f9755019c9b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-mpvq4" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.304967 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mj6b\" (UniqueName: \"kubernetes.io/projected/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-kube-api-access-5mj6b\") pod \"oauth-openshift-66458b6674-qmxgv\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.305068 5184 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1d9df18-d5a1-447d-ad5a-fdef055a830a-serving-cert\") pod \"route-controller-manager-776cdc94d6-x7z5d\" (UID: \"a1d9df18-d5a1-447d-ad5a-fdef055a830a\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-x7z5d" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.305153 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1af1595b-1a79-438d-99a0-dd34b32cfcda-metrics-tls\") pod \"dns-operator-799b87ffcd-rqzhf\" (UID: \"1af1595b-1a79-438d-99a0-dd34b32cfcda\") " pod="openshift-dns-operator/dns-operator-799b87ffcd-rqzhf" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.305914 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-5dxkx" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.300766 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-p29gv"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.308172 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.308416 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-6w67b\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.311747 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-operator-tls\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.311949 5184 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.312166 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"cluster-image-registry-operator-dockercfg-ntnd7\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.312660 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-fnw98"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.325359 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-p29gv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.325768 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66458b6674-qmxgv\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.325885 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/beb08b86-8593-4511-8bce-ea5f1d44f795-encryption-config\") pod \"apiserver-9ddfb9f55-j4gpx\" (UID: \"beb08b86-8593-4511-8bce-ea5f1d44f795\") " pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.325936 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5db01dce-a574-4dbd-97a9-582f0f357bda-audit-dir\") pod \"apiserver-8596bd845d-ndv6q\" (UID: \"5db01dce-a574-4dbd-97a9-582f0f357bda\") " 
pod="openshift-oauth-apiserver/apiserver-8596bd845d-ndv6q" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.325976 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea99c433-2166-47f6-8c55-0787f78ff608-trusted-ca-bundle\") pod \"authentication-operator-7f5c659b84-b2bj4\" (UID: \"ea99c433-2166-47f6-8c55-0787f78ff608\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-b2bj4" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.326014 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-system-session\") pod \"oauth-openshift-66458b6674-qmxgv\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.326051 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1d9df18-d5a1-447d-ad5a-fdef055a830a-config\") pod \"route-controller-manager-776cdc94d6-x7z5d\" (UID: \"a1d9df18-d5a1-447d-ad5a-fdef055a830a\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-x7z5d" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.326141 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17eed63d-a9fc-414e-9c70-347b51893cfa-client-ca\") pod \"controller-manager-65b6cccf98-bbnrv\" (UID: \"17eed63d-a9fc-414e-9c70-347b51893cfa\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-bbnrv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.326395 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5db01dce-a574-4dbd-97a9-582f0f357bda-trusted-ca-bundle\") pod \"apiserver-8596bd845d-ndv6q\" (UID: \"5db01dce-a574-4dbd-97a9-582f0f357bda\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-ndv6q" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.326437 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea99c433-2166-47f6-8c55-0787f78ff608-service-ca-bundle\") pod \"authentication-operator-7f5c659b84-b2bj4\" (UID: \"ea99c433-2166-47f6-8c55-0787f78ff608\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-b2bj4" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.326465 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/beb08b86-8593-4511-8bce-ea5f1d44f795-audit\") pod \"apiserver-9ddfb9f55-j4gpx\" (UID: \"beb08b86-8593-4511-8bce-ea5f1d44f795\") " pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.326496 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/17eed63d-a9fc-414e-9c70-347b51893cfa-proxy-ca-bundles\") pod \"controller-manager-65b6cccf98-bbnrv\" (UID: \"17eed63d-a9fc-414e-9c70-347b51893cfa\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-bbnrv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.326528 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5db01dce-a574-4dbd-97a9-582f0f357bda-serving-cert\") pod \"apiserver-8596bd845d-ndv6q\" (UID: \"5db01dce-a574-4dbd-97a9-582f0f357bda\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-ndv6q" Mar 12 
16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.326573 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-user-template-error\") pod \"oauth-openshift-66458b6674-qmxgv\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.326608 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/17eed63d-a9fc-414e-9c70-347b51893cfa-tmp\") pod \"controller-manager-65b6cccf98-bbnrv\" (UID: \"17eed63d-a9fc-414e-9c70-347b51893cfa\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-bbnrv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.326640 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66458b6674-qmxgv\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.326841 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66458b6674-qmxgv\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.326886 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-7mfvq\" (UniqueName: \"kubernetes.io/projected/17eed63d-a9fc-414e-9c70-347b51893cfa-kube-api-access-7mfvq\") pod \"controller-manager-65b6cccf98-bbnrv\" (UID: \"17eed63d-a9fc-414e-9c70-347b51893cfa\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-bbnrv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.326913 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36cf2a72-16f1-4b2c-9d20-9d1ad0af2ce6-config\") pod \"machine-api-operator-755bb95488-d29hz\" (UID: \"36cf2a72-16f1-4b2c-9d20-9d1ad0af2ce6\") " pod="openshift-machine-api/machine-api-operator-755bb95488-d29hz" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.326949 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5db01dce-a574-4dbd-97a9-582f0f357bda-audit-policies\") pod \"apiserver-8596bd845d-ndv6q\" (UID: \"5db01dce-a574-4dbd-97a9-582f0f357bda\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-ndv6q" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.328113 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.328346 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-kl6m8\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.328777 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.329229 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 
16:52:48.329493 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.331878 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.333491 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-jmhxf\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.335079 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.338906 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.340198 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.343802 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-5777786469-n9g8v"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.344245 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-ocp-branding-template\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.344562 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-fnw98" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.361582 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.361772 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.361896 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.362116 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-2h6bs\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.362655 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.366742 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-scheduler-operator\"/\"kube-root-ca.crt\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.367915 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-68cf44c8b8-7pgjs"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.368080 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-scheduler-operator\"/\"openshift-kube-scheduler-operator-dockercfg-2wbn2\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.368443 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-5777786469-n9g8v" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.369325 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/3.log" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.372151 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerStarted","Data":"7b990689319f7a0b7d657ca5213acc0b3d51bd26e692405cb3c2cf8e3f4de90c"} Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.372179 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-f9cdd68f7-w2ldh"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.374553 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-68cf44c8b8-7pgjs" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.375717 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-67c9d58cbb-wll7m"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.375810 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-w2ldh" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.379814 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-69db94689b-l8cgq"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.380591 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-wll7m" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.383598 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-m7sz7"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.383701 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-69db94689b-l8cgq" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.387238 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-scheduler-operator\"/\"kube-scheduler-operator-serving-cert\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.394979 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-px2bg"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.396802 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-m7sz7" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.398230 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.398265 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-d8dpv"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.398466 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-px2bg" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.403774 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.403792 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-d8dpv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.403896 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.404059 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxc4c" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.404148 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.410133 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-scheduler-operator\"/\"openshift-kube-scheduler-operator-config\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.420939 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-cxwfx"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.427226 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-69b85846b6-5dwhg"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.427495 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-cxwfx" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.428675 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea99c433-2166-47f6-8c55-0787f78ff608-config\") pod \"authentication-operator-7f5c659b84-b2bj4\" (UID: \"ea99c433-2166-47f6-8c55-0787f78ff608\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-b2bj4" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.428781 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-audit-policies\") pod \"oauth-openshift-66458b6674-qmxgv\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.428878 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66458b6674-qmxgv\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.428979 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/70e9334c-b259-45e5-88a3-6909ce233bda-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86c45576b9-b6v2k\" (UID: \"70e9334c-b259-45e5-88a3-6909ce233bda\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-b6v2k" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.429349 5184 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sb26\" (UniqueName: \"kubernetes.io/projected/bbfdedba-967f-4b86-b7bd-a81854132b50-kube-api-access-5sb26\") pod \"openshift-config-operator-5777786469-n9g8v\" (UID: \"bbfdedba-967f-4b86-b7bd-a81854132b50\") " pod="openshift-config-operator/openshift-config-operator-5777786469-n9g8v" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.429477 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qw42l\" (UniqueName: \"kubernetes.io/projected/027a90bb-52c1-43ed-a43d-6f9755019c9b-kube-api-access-qw42l\") pod \"openshift-apiserver-operator-846cbfc458-mpvq4\" (UID: \"027a90bb-52c1-43ed-a43d-6f9755019c9b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-mpvq4" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.429576 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5mj6b\" (UniqueName: \"kubernetes.io/projected/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-kube-api-access-5mj6b\") pod \"oauth-openshift-66458b6674-qmxgv\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.429675 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1d9df18-d5a1-447d-ad5a-fdef055a830a-serving-cert\") pod \"route-controller-manager-776cdc94d6-x7z5d\" (UID: \"a1d9df18-d5a1-447d-ad5a-fdef055a830a\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-x7z5d" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.429778 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1af1595b-1a79-438d-99a0-dd34b32cfcda-metrics-tls\") pod \"dns-operator-799b87ffcd-rqzhf\" (UID: 
\"1af1595b-1a79-438d-99a0-dd34b32cfcda\") " pod="openshift-dns-operator/dns-operator-799b87ffcd-rqzhf" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.429918 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66458b6674-qmxgv\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.430876 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/beb08b86-8593-4511-8bce-ea5f1d44f795-encryption-config\") pod \"apiserver-9ddfb9f55-j4gpx\" (UID: \"beb08b86-8593-4511-8bce-ea5f1d44f795\") " pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.430963 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5db01dce-a574-4dbd-97a9-582f0f357bda-audit-dir\") pod \"apiserver-8596bd845d-ndv6q\" (UID: \"5db01dce-a574-4dbd-97a9-582f0f357bda\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-ndv6q" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.430502 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea99c433-2166-47f6-8c55-0787f78ff608-config\") pod \"authentication-operator-7f5c659b84-b2bj4\" (UID: \"ea99c433-2166-47f6-8c55-0787f78ff608\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-b2bj4" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.430473 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66458b6674-qmxgv\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.431163 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-6b9cb4dbcf-5vk9f"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.431248 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5db01dce-a574-4dbd-97a9-582f0f357bda-audit-dir\") pod \"apiserver-8596bd845d-ndv6q\" (UID: \"5db01dce-a574-4dbd-97a9-582f0f357bda\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-ndv6q" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.430161 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-audit-policies\") pod \"oauth-openshift-66458b6674-qmxgv\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.431478 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-69b85846b6-5dwhg" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.433108 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea99c433-2166-47f6-8c55-0787f78ff608-trusted-ca-bundle\") pod \"authentication-operator-7f5c659b84-b2bj4\" (UID: \"ea99c433-2166-47f6-8c55-0787f78ff608\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-b2bj4" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.433180 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-system-session\") pod \"oauth-openshift-66458b6674-qmxgv\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.433230 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1d9df18-d5a1-447d-ad5a-fdef055a830a-config\") pod \"route-controller-manager-776cdc94d6-x7z5d\" (UID: \"a1d9df18-d5a1-447d-ad5a-fdef055a830a\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-x7z5d" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.433268 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/70e9334c-b259-45e5-88a3-6909ce233bda-tmp\") pod \"cluster-image-registry-operator-86c45576b9-b6v2k\" (UID: \"70e9334c-b259-45e5-88a3-6909ce233bda\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-b6v2k" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.433335 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/17eed63d-a9fc-414e-9c70-347b51893cfa-client-ca\") pod \"controller-manager-65b6cccf98-bbnrv\" (UID: \"17eed63d-a9fc-414e-9c70-347b51893cfa\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-bbnrv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.434344 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17eed63d-a9fc-414e-9c70-347b51893cfa-client-ca\") pod \"controller-manager-65b6cccf98-bbnrv\" (UID: \"17eed63d-a9fc-414e-9c70-347b51893cfa\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-bbnrv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.435465 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5db01dce-a574-4dbd-97a9-582f0f357bda-trusted-ca-bundle\") pod \"apiserver-8596bd845d-ndv6q\" (UID: \"5db01dce-a574-4dbd-97a9-582f0f357bda\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-ndv6q" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.435508 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea99c433-2166-47f6-8c55-0787f78ff608-trusted-ca-bundle\") pod \"authentication-operator-7f5c659b84-b2bj4\" (UID: \"ea99c433-2166-47f6-8c55-0787f78ff608\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-b2bj4" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.435528 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1d9df18-d5a1-447d-ad5a-fdef055a830a-config\") pod \"route-controller-manager-776cdc94d6-x7z5d\" (UID: \"a1d9df18-d5a1-447d-ad5a-fdef055a830a\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-x7z5d" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.435533 5184 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea99c433-2166-47f6-8c55-0787f78ff608-service-ca-bundle\") pod \"authentication-operator-7f5c659b84-b2bj4\" (UID: \"ea99c433-2166-47f6-8c55-0787f78ff608\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-b2bj4" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.435598 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/beb08b86-8593-4511-8bce-ea5f1d44f795-audit\") pod \"apiserver-9ddfb9f55-j4gpx\" (UID: \"beb08b86-8593-4511-8bce-ea5f1d44f795\") " pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.435642 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/17eed63d-a9fc-414e-9c70-347b51893cfa-proxy-ca-bundles\") pod \"controller-manager-65b6cccf98-bbnrv\" (UID: \"17eed63d-a9fc-414e-9c70-347b51893cfa\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-bbnrv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.435677 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5db01dce-a574-4dbd-97a9-582f0f357bda-serving-cert\") pod \"apiserver-8596bd845d-ndv6q\" (UID: \"5db01dce-a574-4dbd-97a9-582f0f357bda\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-ndv6q" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.435706 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-user-template-error\") pod \"oauth-openshift-66458b6674-qmxgv\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 
16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.435735 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/17eed63d-a9fc-414e-9c70-347b51893cfa-tmp\") pod \"controller-manager-65b6cccf98-bbnrv\" (UID: \"17eed63d-a9fc-414e-9c70-347b51893cfa\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-bbnrv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.435771 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scx52\" (UniqueName: \"kubernetes.io/projected/70e9334c-b259-45e5-88a3-6909ce233bda-kube-api-access-scx52\") pod \"cluster-image-registry-operator-86c45576b9-b6v2k\" (UID: \"70e9334c-b259-45e5-88a3-6909ce233bda\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-b6v2k" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.435805 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66458b6674-qmxgv\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.435836 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66458b6674-qmxgv\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.435883 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7mfvq\" (UniqueName: 
\"kubernetes.io/projected/17eed63d-a9fc-414e-9c70-347b51893cfa-kube-api-access-7mfvq\") pod \"controller-manager-65b6cccf98-bbnrv\" (UID: \"17eed63d-a9fc-414e-9c70-347b51893cfa\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-bbnrv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.435911 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36cf2a72-16f1-4b2c-9d20-9d1ad0af2ce6-config\") pod \"machine-api-operator-755bb95488-d29hz\" (UID: \"36cf2a72-16f1-4b2c-9d20-9d1ad0af2ce6\") " pod="openshift-machine-api/machine-api-operator-755bb95488-d29hz" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.435946 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5db01dce-a574-4dbd-97a9-582f0f357bda-audit-policies\") pod \"apiserver-8596bd845d-ndv6q\" (UID: \"5db01dce-a574-4dbd-97a9-582f0f357bda\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-ndv6q" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.435977 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bbfdedba-967f-4b86-b7bd-a81854132b50-available-featuregates\") pod \"openshift-config-operator-5777786469-n9g8v\" (UID: \"bbfdedba-967f-4b86-b7bd-a81854132b50\") " pod="openshift-config-operator/openshift-config-operator-5777786469-n9g8v" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.436009 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17eed63d-a9fc-414e-9c70-347b51893cfa-config\") pod \"controller-manager-65b6cccf98-bbnrv\" (UID: \"17eed63d-a9fc-414e-9c70-347b51893cfa\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-bbnrv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.436024 5184 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5db01dce-a574-4dbd-97a9-582f0f357bda-trusted-ca-bundle\") pod \"apiserver-8596bd845d-ndv6q\" (UID: \"5db01dce-a574-4dbd-97a9-582f0f357bda\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-ndv6q" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.436040 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/20de7db3-2a1d-49b2-a756-3ef5b88fbfcc-auth-proxy-config\") pod \"machine-approver-54c688565-jqlh7\" (UID: \"20de7db3-2a1d-49b2-a756-3ef5b88fbfcc\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-jqlh7" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.436072 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/027a90bb-52c1-43ed-a43d-6f9755019c9b-config\") pod \"openshift-apiserver-operator-846cbfc458-mpvq4\" (UID: \"027a90bb-52c1-43ed-a43d-6f9755019c9b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-mpvq4" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.436191 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea99c433-2166-47f6-8c55-0787f78ff608-service-ca-bundle\") pod \"authentication-operator-7f5c659b84-b2bj4\" (UID: \"ea99c433-2166-47f6-8c55-0787f78ff608\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-b2bj4" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.437364 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5db01dce-a574-4dbd-97a9-582f0f357bda-audit-policies\") pod \"apiserver-8596bd845d-ndv6q\" (UID: \"5db01dce-a574-4dbd-97a9-582f0f357bda\") " 
pod="openshift-oauth-apiserver/apiserver-8596bd845d-ndv6q" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.437505 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/beb08b86-8593-4511-8bce-ea5f1d44f795-encryption-config\") pod \"apiserver-9ddfb9f55-j4gpx\" (UID: \"beb08b86-8593-4511-8bce-ea5f1d44f795\") " pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.437913 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/17eed63d-a9fc-414e-9c70-347b51893cfa-tmp\") pod \"controller-manager-65b6cccf98-bbnrv\" (UID: \"17eed63d-a9fc-414e-9c70-347b51893cfa\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-bbnrv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.438568 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/027a90bb-52c1-43ed-a43d-6f9755019c9b-config\") pod \"openshift-apiserver-operator-846cbfc458-mpvq4\" (UID: \"027a90bb-52c1-43ed-a43d-6f9755019c9b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-mpvq4" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.439191 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/beb08b86-8593-4511-8bce-ea5f1d44f795-audit\") pod \"apiserver-9ddfb9f55-j4gpx\" (UID: \"beb08b86-8593-4511-8bce-ea5f1d44f795\") " pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.439335 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5db01dce-a574-4dbd-97a9-582f0f357bda-serving-cert\") pod \"apiserver-8596bd845d-ndv6q\" (UID: \"5db01dce-a574-4dbd-97a9-582f0f357bda\") " 
pod="openshift-oauth-apiserver/apiserver-8596bd845d-ndv6q" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.440042 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17eed63d-a9fc-414e-9c70-347b51893cfa-config\") pod \"controller-manager-65b6cccf98-bbnrv\" (UID: \"17eed63d-a9fc-414e-9c70-347b51893cfa\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-bbnrv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.440291 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/17eed63d-a9fc-414e-9c70-347b51893cfa-proxy-ca-bundles\") pod \"controller-manager-65b6cccf98-bbnrv\" (UID: \"17eed63d-a9fc-414e-9c70-347b51893cfa\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-bbnrv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.440364 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/beb08b86-8593-4511-8bce-ea5f1d44f795-etcd-serving-ca\") pod \"apiserver-9ddfb9f55-j4gpx\" (UID: \"beb08b86-8593-4511-8bce-ea5f1d44f795\") " pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.440414 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36cf2a72-16f1-4b2c-9d20-9d1ad0af2ce6-config\") pod \"machine-api-operator-755bb95488-d29hz\" (UID: \"36cf2a72-16f1-4b2c-9d20-9d1ad0af2ce6\") " pod="openshift-machine-api/machine-api-operator-755bb95488-d29hz" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.440423 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/36cf2a72-16f1-4b2c-9d20-9d1ad0af2ce6-machine-api-operator-tls\") pod \"machine-api-operator-755bb95488-d29hz\" (UID: 
\"36cf2a72-16f1-4b2c-9d20-9d1ad0af2ce6\") " pod="openshift-machine-api/machine-api-operator-755bb95488-d29hz" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.440488 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/781710ac-8789-42cf-983e-f7de329e4e81-serving-cert\") pod \"kube-storage-version-migrator-operator-565b79b866-p29gv\" (UID: \"781710ac-8789-42cf-983e-f7de329e4e81\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-p29gv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.440530 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrrqm\" (UniqueName: \"kubernetes.io/projected/781710ac-8789-42cf-983e-f7de329e4e81-kube-api-access-lrrqm\") pod \"kube-storage-version-migrator-operator-565b79b866-p29gv\" (UID: \"781710ac-8789-42cf-983e-f7de329e4e81\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-p29gv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.440602 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/beb08b86-8593-4511-8bce-ea5f1d44f795-serving-cert\") pod \"apiserver-9ddfb9f55-j4gpx\" (UID: \"beb08b86-8593-4511-8bce-ea5f1d44f795\") " pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.440635 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5db01dce-a574-4dbd-97a9-582f0f357bda-etcd-client\") pod \"apiserver-8596bd845d-ndv6q\" (UID: \"5db01dce-a574-4dbd-97a9-582f0f357bda\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-ndv6q" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.440671 5184 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/70e9334c-b259-45e5-88a3-6909ce233bda-trusted-ca\") pod \"cluster-image-registry-operator-86c45576b9-b6v2k\" (UID: \"70e9334c-b259-45e5-88a3-6909ce233bda\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-b6v2k" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.440706 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbfdedba-967f-4b86-b7bd-a81854132b50-serving-cert\") pod \"openshift-config-operator-5777786469-n9g8v\" (UID: \"bbfdedba-967f-4b86-b7bd-a81854132b50\") " pod="openshift-config-operator/openshift-config-operator-5777786469-n9g8v" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.440712 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66458b6674-qmxgv\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.440769 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/20de7db3-2a1d-49b2-a756-3ef5b88fbfcc-auth-proxy-config\") pod \"machine-approver-54c688565-jqlh7\" (UID: \"20de7db3-2a1d-49b2-a756-3ef5b88fbfcc\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-jqlh7" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.441184 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/beb08b86-8593-4511-8bce-ea5f1d44f795-etcd-serving-ca\") pod \"apiserver-9ddfb9f55-j4gpx\" (UID: \"beb08b86-8593-4511-8bce-ea5f1d44f795\") " 
pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.441541 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-system-session\") pod \"oauth-openshift-66458b6674-qmxgv\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.441556 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1af1595b-1a79-438d-99a0-dd34b32cfcda-metrics-tls\") pod \"dns-operator-799b87ffcd-rqzhf\" (UID: \"1af1595b-1a79-438d-99a0-dd34b32cfcda\") " pod="openshift-dns-operator/dns-operator-799b87ffcd-rqzhf" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.441607 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p76p\" (UniqueName: \"kubernetes.io/projected/6f45ff33-e60b-4885-ac63-5ab182bf6320-kube-api-access-5p76p\") pod \"downloads-747b44746d-t6987\" (UID: \"6f45ff33-e60b-4885-ac63-5ab182bf6320\") " pod="openshift-console/downloads-747b44746d-t6987" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.441659 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/beb08b86-8593-4511-8bce-ea5f1d44f795-trusted-ca-bundle\") pod \"apiserver-9ddfb9f55-j4gpx\" (UID: \"beb08b86-8593-4511-8bce-ea5f1d44f795\") " pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.441704 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea99c433-2166-47f6-8c55-0787f78ff608-serving-cert\") pod \"authentication-operator-7f5c659b84-b2bj4\" (UID: 
\"ea99c433-2166-47f6-8c55-0787f78ff608\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-b2bj4" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.441913 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66458b6674-qmxgv\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.442359 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/beb08b86-8593-4511-8bce-ea5f1d44f795-trusted-ca-bundle\") pod \"apiserver-9ddfb9f55-j4gpx\" (UID: \"beb08b86-8593-4511-8bce-ea5f1d44f795\") " pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.442520 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pdp9g\" (UniqueName: \"kubernetes.io/projected/beb08b86-8593-4511-8bce-ea5f1d44f795-kube-api-access-pdp9g\") pod \"apiserver-9ddfb9f55-j4gpx\" (UID: \"beb08b86-8593-4511-8bce-ea5f1d44f795\") " pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.442550 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-user-template-error\") pod \"oauth-openshift-66458b6674-qmxgv\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.444025 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/36cf2a72-16f1-4b2c-9d20-9d1ad0af2ce6-machine-api-operator-tls\") pod \"machine-api-operator-755bb95488-d29hz\" (UID: \"36cf2a72-16f1-4b2c-9d20-9d1ad0af2ce6\") " pod="openshift-machine-api/machine-api-operator-755bb95488-d29hz" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.442557 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fsmgq\" (UniqueName: \"kubernetes.io/projected/5db01dce-a574-4dbd-97a9-582f0f357bda-kube-api-access-fsmgq\") pod \"apiserver-8596bd845d-ndv6q\" (UID: \"5db01dce-a574-4dbd-97a9-582f0f357bda\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-ndv6q" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.444120 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/beb08b86-8593-4511-8bce-ea5f1d44f795-serving-cert\") pod \"apiserver-9ddfb9f55-j4gpx\" (UID: \"beb08b86-8593-4511-8bce-ea5f1d44f795\") " pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.444202 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1af1595b-1a79-438d-99a0-dd34b32cfcda-tmp-dir\") pod \"dns-operator-799b87ffcd-rqzhf\" (UID: \"1af1595b-1a79-438d-99a0-dd34b32cfcda\") " pod="openshift-dns-operator/dns-operator-799b87ffcd-rqzhf" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.444251 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-system-service-ca\") pod \"oauth-openshift-66458b6674-qmxgv\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.444306 5184 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5db01dce-a574-4dbd-97a9-582f0f357bda-etcd-serving-ca\") pod \"apiserver-8596bd845d-ndv6q\" (UID: \"5db01dce-a574-4dbd-97a9-582f0f357bda\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-ndv6q" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.444344 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/beb08b86-8593-4511-8bce-ea5f1d44f795-node-pullsecrets\") pod \"apiserver-9ddfb9f55-j4gpx\" (UID: \"beb08b86-8593-4511-8bce-ea5f1d44f795\") " pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.444434 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/beb08b86-8593-4511-8bce-ea5f1d44f795-config\") pod \"apiserver-9ddfb9f55-j4gpx\" (UID: \"beb08b86-8593-4511-8bce-ea5f1d44f795\") " pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.444487 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xqpvw\" (UniqueName: \"kubernetes.io/projected/20de7db3-2a1d-49b2-a756-3ef5b88fbfcc-kube-api-access-xqpvw\") pod \"machine-approver-54c688565-jqlh7\" (UID: \"20de7db3-2a1d-49b2-a756-3ef5b88fbfcc\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-jqlh7" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.444530 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a1d9df18-d5a1-447d-ad5a-fdef055a830a-tmp\") pod \"route-controller-manager-776cdc94d6-x7z5d\" (UID: \"a1d9df18-d5a1-447d-ad5a-fdef055a830a\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-x7z5d" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.444600 5184 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-audit-dir\") pod \"oauth-openshift-66458b6674-qmxgv\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.444653 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/20de7db3-2a1d-49b2-a756-3ef5b88fbfcc-machine-approver-tls\") pod \"machine-approver-54c688565-jqlh7\" (UID: \"20de7db3-2a1d-49b2-a756-3ef5b88fbfcc\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-jqlh7" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.444687 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20de7db3-2a1d-49b2-a756-3ef5b88fbfcc-config\") pod \"machine-approver-54c688565-jqlh7\" (UID: \"20de7db3-2a1d-49b2-a756-3ef5b88fbfcc\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-jqlh7" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.444727 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/36cf2a72-16f1-4b2c-9d20-9d1ad0af2ce6-images\") pod \"machine-api-operator-755bb95488-d29hz\" (UID: \"36cf2a72-16f1-4b2c-9d20-9d1ad0af2ce6\") " pod="openshift-machine-api/machine-api-operator-755bb95488-d29hz" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.444747 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1af1595b-1a79-438d-99a0-dd34b32cfcda-tmp-dir\") pod \"dns-operator-799b87ffcd-rqzhf\" (UID: \"1af1595b-1a79-438d-99a0-dd34b32cfcda\") " pod="openshift-dns-operator/dns-operator-799b87ffcd-rqzhf" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.444767 5184 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4sbfx\" (UniqueName: \"kubernetes.io/projected/36cf2a72-16f1-4b2c-9d20-9d1ad0af2ce6-kube-api-access-4sbfx\") pod \"machine-api-operator-755bb95488-d29hz\" (UID: \"36cf2a72-16f1-4b2c-9d20-9d1ad0af2ce6\") " pod="openshift-machine-api/machine-api-operator-755bb95488-d29hz" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.444817 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zrbv2\" (UniqueName: \"kubernetes.io/projected/a1d9df18-d5a1-447d-ad5a-fdef055a830a-kube-api-access-zrbv2\") pod \"route-controller-manager-776cdc94d6-x7z5d\" (UID: \"a1d9df18-d5a1-447d-ad5a-fdef055a830a\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-x7z5d" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.444853 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h8s92\" (UniqueName: \"kubernetes.io/projected/ea99c433-2166-47f6-8c55-0787f78ff608-kube-api-access-h8s92\") pod \"authentication-operator-7f5c659b84-b2bj4\" (UID: \"ea99c433-2166-47f6-8c55-0787f78ff608\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-b2bj4" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.444890 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66458b6674-qmxgv\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.444932 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/beb08b86-8593-4511-8bce-ea5f1d44f795-image-import-ca\") pod 
\"apiserver-9ddfb9f55-j4gpx\" (UID: \"beb08b86-8593-4511-8bce-ea5f1d44f795\") " pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.444965 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhtqd\" (UniqueName: \"kubernetes.io/projected/1af1595b-1a79-438d-99a0-dd34b32cfcda-kube-api-access-vhtqd\") pod \"dns-operator-799b87ffcd-rqzhf\" (UID: \"1af1595b-1a79-438d-99a0-dd34b32cfcda\") " pod="openshift-dns-operator/dns-operator-799b87ffcd-rqzhf" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.445003 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-user-template-login\") pod \"oauth-openshift-66458b6674-qmxgv\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.445044 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/70e9334c-b259-45e5-88a3-6909ce233bda-bound-sa-token\") pod \"cluster-image-registry-operator-86c45576b9-b6v2k\" (UID: \"70e9334c-b259-45e5-88a3-6909ce233bda\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-b6v2k" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.445101 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/beb08b86-8593-4511-8bce-ea5f1d44f795-audit-dir\") pod \"apiserver-9ddfb9f55-j4gpx\" (UID: \"beb08b86-8593-4511-8bce-ea5f1d44f795\") " pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.445136 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-system-router-certs\") pod \"oauth-openshift-66458b6674-qmxgv\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.445169 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/781710ac-8789-42cf-983e-f7de329e4e81-config\") pod \"kube-storage-version-migrator-operator-565b79b866-p29gv\" (UID: \"781710ac-8789-42cf-983e-f7de329e4e81\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-p29gv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.445208 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5db01dce-a574-4dbd-97a9-582f0f357bda-encryption-config\") pod \"apiserver-8596bd845d-ndv6q\" (UID: \"5db01dce-a574-4dbd-97a9-582f0f357bda\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-ndv6q" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.445239 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/027a90bb-52c1-43ed-a43d-6f9755019c9b-serving-cert\") pod \"openshift-apiserver-operator-846cbfc458-mpvq4\" (UID: \"027a90bb-52c1-43ed-a43d-6f9755019c9b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-mpvq4" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.445273 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-66458b6674-qmxgv\" (UID: 
\"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.445306 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted-pem\" (UniqueName: \"kubernetes.io/empty-dir/70e9334c-b259-45e5-88a3-6909ce233bda-ca-trust-extracted-pem\") pod \"cluster-image-registry-operator-86c45576b9-b6v2k\" (UID: \"70e9334c-b259-45e5-88a3-6909ce233bda\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-b6v2k" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.445347 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/beb08b86-8593-4511-8bce-ea5f1d44f795-etcd-client\") pod \"apiserver-9ddfb9f55-j4gpx\" (UID: \"beb08b86-8593-4511-8bce-ea5f1d44f795\") " pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.445397 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17eed63d-a9fc-414e-9c70-347b51893cfa-serving-cert\") pod \"controller-manager-65b6cccf98-bbnrv\" (UID: \"17eed63d-a9fc-414e-9c70-347b51893cfa\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-bbnrv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.445434 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a1d9df18-d5a1-447d-ad5a-fdef055a830a-client-ca\") pod \"route-controller-manager-776cdc94d6-x7z5d\" (UID: \"a1d9df18-d5a1-447d-ad5a-fdef055a830a\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-x7z5d" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.445893 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/a1d9df18-d5a1-447d-ad5a-fdef055a830a-tmp\") pod \"route-controller-manager-776cdc94d6-x7z5d\" (UID: \"a1d9df18-d5a1-447d-ad5a-fdef055a830a\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-x7z5d" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.446011 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-system-service-ca\") pod \"oauth-openshift-66458b6674-qmxgv\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.446328 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/beb08b86-8593-4511-8bce-ea5f1d44f795-audit-dir\") pod \"apiserver-9ddfb9f55-j4gpx\" (UID: \"beb08b86-8593-4511-8bce-ea5f1d44f795\") " pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.446513 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-audit-dir\") pod \"oauth-openshift-66458b6674-qmxgv\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.446572 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5db01dce-a574-4dbd-97a9-582f0f357bda-etcd-client\") pod \"apiserver-8596bd845d-ndv6q\" (UID: \"5db01dce-a574-4dbd-97a9-582f0f357bda\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-ndv6q" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.446610 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/a1d9df18-d5a1-447d-ad5a-fdef055a830a-serving-cert\") pod \"route-controller-manager-776cdc94d6-x7z5d\" (UID: \"a1d9df18-d5a1-447d-ad5a-fdef055a830a\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-x7z5d" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.447148 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66458b6674-qmxgv\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.447151 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20de7db3-2a1d-49b2-a756-3ef5b88fbfcc-config\") pod \"machine-approver-54c688565-jqlh7\" (UID: \"20de7db3-2a1d-49b2-a756-3ef5b88fbfcc\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-jqlh7" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.448301 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5db01dce-a574-4dbd-97a9-582f0f357bda-etcd-serving-ca\") pod \"apiserver-8596bd845d-ndv6q\" (UID: \"5db01dce-a574-4dbd-97a9-582f0f357bda\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-ndv6q" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.448414 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/beb08b86-8593-4511-8bce-ea5f1d44f795-node-pullsecrets\") pod \"apiserver-9ddfb9f55-j4gpx\" (UID: \"beb08b86-8593-4511-8bce-ea5f1d44f795\") " pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.448731 5184 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/beb08b86-8593-4511-8bce-ea5f1d44f795-image-import-ca\") pod \"apiserver-9ddfb9f55-j4gpx\" (UID: \"beb08b86-8593-4511-8bce-ea5f1d44f795\") " pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.449176 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/36cf2a72-16f1-4b2c-9d20-9d1ad0af2ce6-images\") pod \"machine-api-operator-755bb95488-d29hz\" (UID: \"36cf2a72-16f1-4b2c-9d20-9d1ad0af2ce6\") " pod="openshift-machine-api/machine-api-operator-755bb95488-d29hz" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.449762 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/beb08b86-8593-4511-8bce-ea5f1d44f795-config\") pod \"apiserver-9ddfb9f55-j4gpx\" (UID: \"beb08b86-8593-4511-8bce-ea5f1d44f795\") " pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.449947 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a1d9df18-d5a1-447d-ad5a-fdef055a830a-client-ca\") pod \"route-controller-manager-776cdc94d6-x7z5d\" (UID: \"a1d9df18-d5a1-447d-ad5a-fdef055a830a\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-x7z5d" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.450338 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-config-operator\"/\"openshift-service-ca.crt\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.450410 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-mfddg"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.451301 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea99c433-2166-47f6-8c55-0787f78ff608-serving-cert\") pod \"authentication-operator-7f5c659b84-b2bj4\" (UID: \"ea99c433-2166-47f6-8c55-0787f78ff608\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-b2bj4" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.451359 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-5vk9f" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.453107 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/20de7db3-2a1d-49b2-a756-3ef5b88fbfcc-machine-approver-tls\") pod \"machine-approver-54c688565-jqlh7\" (UID: \"20de7db3-2a1d-49b2-a756-3ef5b88fbfcc\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-jqlh7" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.453902 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5db01dce-a574-4dbd-97a9-582f0f357bda-encryption-config\") pod \"apiserver-8596bd845d-ndv6q\" (UID: \"5db01dce-a574-4dbd-97a9-582f0f357bda\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-ndv6q" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.454207 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17eed63d-a9fc-414e-9c70-347b51893cfa-serving-cert\") pod \"controller-manager-65b6cccf98-bbnrv\" (UID: \"17eed63d-a9fc-414e-9c70-347b51893cfa\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-bbnrv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.454252 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-user-template-login\") pod \"oauth-openshift-66458b6674-qmxgv\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.456027 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-system-router-certs\") pod \"oauth-openshift-66458b6674-qmxgv\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.457144 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/027a90bb-52c1-43ed-a43d-6f9755019c9b-serving-cert\") pod \"openshift-apiserver-operator-846cbfc458-mpvq4\" (UID: \"027a90bb-52c1-43ed-a43d-6f9755019c9b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-mpvq4" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.458786 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-66458b6674-qmxgv\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.459353 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/beb08b86-8593-4511-8bce-ea5f1d44f795-etcd-client\") pod \"apiserver-9ddfb9f55-j4gpx\" (UID: \"beb08b86-8593-4511-8bce-ea5f1d44f795\") " pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 
16:52:48.466947 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66458b6674-qmxgv\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.468171 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-config-operator\"/\"config-operator-serving-cert\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.469922 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-f99dz"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.470186 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-mfddg" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.474677 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-jskx5"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.474912 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-f99dz" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.479293 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-74545575db-6qpvf"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.482555 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555565-ms5vz"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.482687 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-jskx5" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.482709 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-74545575db-6qpvf" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.487249 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-776cdc94d6-x7z5d"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.487350 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-866fcbc849-csf6b"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.487455 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-config-operator\"/\"openshift-config-operator-dockercfg-sjn6s\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.487370 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555565-ms5vz" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.495487 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-547dbd544d-dpld6"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.495650 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-866fcbc849-csf6b" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.499511 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-755bb95488-d29hz"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.499593 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-kv7dd"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.499819 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-547dbd544d-dpld6" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.506076 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-9ddfb9f55-j4gpx"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.506177 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-66458b6674-qmxgv"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.506266 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-799b87ffcd-rqzhf"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.506181 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-kv7dd" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.506434 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-8596bd845d-ndv6q"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.506517 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-fnw98"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.506597 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-64d44f6ddf-qxthf"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.507728 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-config-operator\"/\"kube-root-ca.crt\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.511268 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-6bgzr"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.511367 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-64d44f6ddf-qxthf" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.518762 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-86c45576b9-b6v2k"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.518791 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-p29gv"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.518806 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-65b6cccf98-bbnrv"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.518820 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-6b9cb4dbcf-5vk9f"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.518831 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-f99dz"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.518847 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-5777786469-n9g8v"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.518863 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-5zvch"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.519204 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-6bgzr" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.527798 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-kw8fx\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.528102 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-5zvch" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.528116 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66587d64c8-dlsx9"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.528179 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-5b9c976747-dzzxj"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.533445 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-d8dpv"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.533473 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-f9cdd68f7-w2ldh"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.533486 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-74545575db-6qpvf"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.533499 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-747b44746d-t6987"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.533509 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555565-ms5vz"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.533521 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-mpvq4"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.533532 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-px2bg"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.533547 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-mfddg"] 
Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.533561 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-5dxkx"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.533577 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-69b85846b6-5dwhg"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.533589 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-jskx5"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.533602 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-m7sz7"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.533615 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-69db94689b-l8cgq"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.533626 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-67c9d58cbb-wll7m"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.533638 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-67c89758df-ssqr6"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.533647 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-dzzxj" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.533652 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-6bgzr"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.533792 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-7f5c659b84-b2bj4"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.533810 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-cxwfx"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.533821 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-5zvch"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.533830 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-5b9c976747-dzzxj"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.533844 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-tr5c8"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.538669 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-f2fdq"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.538816 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tr5c8" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.545280 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-547dbd544d-dpld6"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.545322 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-866fcbc849-csf6b"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.545340 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64d44f6ddf-qxthf"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.545354 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-tr5c8"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.545366 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-f2fdq"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.545398 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-fm2vq"] Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.546263 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-f2fdq" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.547343 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.548045 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bbfdedba-967f-4b86-b7bd-a81854132b50-available-featuregates\") pod \"openshift-config-operator-5777786469-n9g8v\" (UID: \"bbfdedba-967f-4b86-b7bd-a81854132b50\") " pod="openshift-config-operator/openshift-config-operator-5777786469-n9g8v" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.548167 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/781710ac-8789-42cf-983e-f7de329e4e81-serving-cert\") pod \"kube-storage-version-migrator-operator-565b79b866-p29gv\" (UID: \"781710ac-8789-42cf-983e-f7de329e4e81\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-p29gv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.548196 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lrrqm\" (UniqueName: \"kubernetes.io/projected/781710ac-8789-42cf-983e-f7de329e4e81-kube-api-access-lrrqm\") pod \"kube-storage-version-migrator-operator-565b79b866-p29gv\" (UID: \"781710ac-8789-42cf-983e-f7de329e4e81\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-p29gv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.548232 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/70e9334c-b259-45e5-88a3-6909ce233bda-trusted-ca\") pod \"cluster-image-registry-operator-86c45576b9-b6v2k\" (UID: 
\"70e9334c-b259-45e5-88a3-6909ce233bda\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-b6v2k" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.548255 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbfdedba-967f-4b86-b7bd-a81854132b50-serving-cert\") pod \"openshift-config-operator-5777786469-n9g8v\" (UID: \"bbfdedba-967f-4b86-b7bd-a81854132b50\") " pod="openshift-config-operator/openshift-config-operator-5777786469-n9g8v" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.548283 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5p76p\" (UniqueName: \"kubernetes.io/projected/6f45ff33-e60b-4885-ac63-5ab182bf6320-kube-api-access-5p76p\") pod \"downloads-747b44746d-t6987\" (UID: \"6f45ff33-e60b-4885-ac63-5ab182bf6320\") " pod="openshift-console/downloads-747b44746d-t6987" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.548341 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/70e9334c-b259-45e5-88a3-6909ce233bda-bound-sa-token\") pod \"cluster-image-registry-operator-86c45576b9-b6v2k\" (UID: \"70e9334c-b259-45e5-88a3-6909ce233bda\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-b6v2k" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.548392 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/781710ac-8789-42cf-983e-f7de329e4e81-config\") pod \"kube-storage-version-migrator-operator-565b79b866-p29gv\" (UID: \"781710ac-8789-42cf-983e-f7de329e4e81\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-p29gv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.548422 5184 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"ca-trust-extracted-pem\" (UniqueName: \"kubernetes.io/empty-dir/70e9334c-b259-45e5-88a3-6909ce233bda-ca-trust-extracted-pem\") pod \"cluster-image-registry-operator-86c45576b9-b6v2k\" (UID: \"70e9334c-b259-45e5-88a3-6909ce233bda\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-b6v2k" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.548451 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/70e9334c-b259-45e5-88a3-6909ce233bda-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86c45576b9-b6v2k\" (UID: \"70e9334c-b259-45e5-88a3-6909ce233bda\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-b6v2k" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.548473 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5sb26\" (UniqueName: \"kubernetes.io/projected/bbfdedba-967f-4b86-b7bd-a81854132b50-kube-api-access-5sb26\") pod \"openshift-config-operator-5777786469-n9g8v\" (UID: \"bbfdedba-967f-4b86-b7bd-a81854132b50\") " pod="openshift-config-operator/openshift-config-operator-5777786469-n9g8v" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.548521 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/70e9334c-b259-45e5-88a3-6909ce233bda-tmp\") pod \"cluster-image-registry-operator-86c45576b9-b6v2k\" (UID: \"70e9334c-b259-45e5-88a3-6909ce233bda\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-b6v2k" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.548562 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-scx52\" (UniqueName: \"kubernetes.io/projected/70e9334c-b259-45e5-88a3-6909ce233bda-kube-api-access-scx52\") pod \"cluster-image-registry-operator-86c45576b9-b6v2k\" (UID: 
\"70e9334c-b259-45e5-88a3-6909ce233bda\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-b6v2k" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.550244 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-fm2vq" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.550292 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/70e9334c-b259-45e5-88a3-6909ce233bda-trusted-ca\") pod \"cluster-image-registry-operator-86c45576b9-b6v2k\" (UID: \"70e9334c-b259-45e5-88a3-6909ce233bda\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-b6v2k" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.551370 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted-pem\" (UniqueName: \"kubernetes.io/empty-dir/70e9334c-b259-45e5-88a3-6909ce233bda-ca-trust-extracted-pem\") pod \"cluster-image-registry-operator-86c45576b9-b6v2k\" (UID: \"70e9334c-b259-45e5-88a3-6909ce233bda\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-b6v2k" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.551454 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/70e9334c-b259-45e5-88a3-6909ce233bda-tmp\") pod \"cluster-image-registry-operator-86c45576b9-b6v2k\" (UID: \"70e9334c-b259-45e5-88a3-6909ce233bda\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-b6v2k" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.551897 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bbfdedba-967f-4b86-b7bd-a81854132b50-available-featuregates\") pod \"openshift-config-operator-5777786469-n9g8v\" (UID: \"bbfdedba-967f-4b86-b7bd-a81854132b50\") " 
pod="openshift-config-operator/openshift-config-operator-5777786469-n9g8v" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.552330 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/781710ac-8789-42cf-983e-f7de329e4e81-config\") pod \"kube-storage-version-migrator-operator-565b79b866-p29gv\" (UID: \"781710ac-8789-42cf-983e-f7de329e4e81\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-p29gv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.554737 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbfdedba-967f-4b86-b7bd-a81854132b50-serving-cert\") pod \"openshift-config-operator-5777786469-n9g8v\" (UID: \"bbfdedba-967f-4b86-b7bd-a81854132b50\") " pod="openshift-config-operator/openshift-config-operator-5777786469-n9g8v" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.559111 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/70e9334c-b259-45e5-88a3-6909ce233bda-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86c45576b9-b6v2k\" (UID: \"70e9334c-b259-45e5-88a3-6909ce233bda\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-b6v2k" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.559771 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/781710ac-8789-42cf-983e-f7de329e4e81-serving-cert\") pod \"kube-storage-version-migrator-operator-565b79b866-p29gv\" (UID: \"781710ac-8789-42cf-983e-f7de329e4e81\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-p29gv" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.568284 5184 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.588019 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.608905 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-certs-default\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.628790 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.647925 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.668393 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"mcc-proxy-tls\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.687018 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-controller-dockercfg-xnj77\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.727719 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-operator-images\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.748163 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"mco-proxy-tls\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.767332 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-operator-dockercfg-sw6nc\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.787678 5184 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-admission-controller-secret\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.808613 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ac-dockercfg-gj7jx\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.827658 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-api\"/\"control-plane-machine-set-operator-tls\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.848683 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-api\"/\"control-plane-machine-set-operator-dockercfg-gnx66\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.868548 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager-operator\"/\"openshift-service-ca.crt\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.888590 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-controller-manager-operator\"/\"openshift-controller-manager-operator-dockercfg-jcmfj\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.908692 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-controller-manager-operator\"/\"openshift-controller-manager-operator-serving-cert\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.927665 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager-operator\"/\"openshift-controller-manager-operator-config\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.947650 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager-operator\"/\"kube-root-ca.crt\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.967858 5184 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Mar 12 16:52:48 crc kubenswrapper[5184]: I0312 16:52:48.988229 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.007973 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.029001 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.048104 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.068227 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-t8n29\"" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.088130 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-apiserver-operator\"/\"kube-apiserver-operator-dockercfg-bf7fj\"" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.108206 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-apiserver-operator\"/\"kube-apiserver-operator-config\"" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.128758 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-apiserver-operator\"/\"kube-root-ca.crt\"" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.148857 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-apiserver-operator\"/\"kube-apiserver-operator-serving-cert\"" Mar 12 16:52:49 crc kubenswrapper[5184]: 
I0312 16:52:49.169160 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-controller-manager-operator\"/\"kube-root-ca.crt\"" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.188828 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-controller-manager-operator\"/\"kube-controller-manager-operator-dockercfg-tnfx9\"" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.208091 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-controller-manager-operator\"/\"kube-controller-manager-operator-serving-cert\"" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.227314 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-controller-manager-operator\"/\"kube-controller-manager-operator-config\"" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.295147 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mj6b\" (UniqueName: \"kubernetes.io/projected/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-kube-api-access-5mj6b\") pod \"oauth-openshift-66458b6674-qmxgv\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.303068 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw42l\" (UniqueName: \"kubernetes.io/projected/027a90bb-52c1-43ed-a43d-6f9755019c9b-kube-api-access-qw42l\") pod \"openshift-apiserver-operator-846cbfc458-mpvq4\" (UID: \"027a90bb-52c1-43ed-a43d-6f9755019c9b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-mpvq4" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.308624 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-etcd-operator\"/\"etcd-ca-bundle\"" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 
16:52:49.328022 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-etcd-operator\"/\"etcd-operator-dockercfg-4vdnc\"" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.348538 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-etcd-operator\"/\"etcd-operator-serving-cert\"" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.367646 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-etcd-operator\"/\"etcd-client\"" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.388179 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-etcd-operator\"/\"etcd-service-ca-bundle\"" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.408920 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-etcd-operator\"/\"etcd-operator-config\"" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.428320 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-etcd-operator\"/\"kube-root-ca.crt\"" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.446226 5184 request.go:752] "Waited before sending request" delay="1.011285921s" reason="client-side throttling, not priority and fairness" verb="GET" URL="https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.448338 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-etcd-operator\"/\"openshift-service-ca.crt\"" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.492633 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mfvq\" (UniqueName: \"kubernetes.io/projected/17eed63d-a9fc-414e-9c70-347b51893cfa-kube-api-access-7mfvq\") pod 
\"controller-manager-65b6cccf98-bbnrv\" (UID: \"17eed63d-a9fc-414e-9c70-347b51893cfa\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-bbnrv" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.512035 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsmgq\" (UniqueName: \"kubernetes.io/projected/5db01dce-a574-4dbd-97a9-582f0f357bda-kube-api-access-fsmgq\") pod \"apiserver-8596bd845d-ndv6q\" (UID: \"5db01dce-a574-4dbd-97a9-582f0f357bda\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-ndv6q" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.531339 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-8596bd845d-ndv6q" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.535097 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdp9g\" (UniqueName: \"kubernetes.io/projected/beb08b86-8593-4511-8bce-ea5f1d44f795-kube-api-access-pdp9g\") pod \"apiserver-9ddfb9f55-j4gpx\" (UID: \"beb08b86-8593-4511-8bce-ea5f1d44f795\") " pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.558154 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqpvw\" (UniqueName: \"kubernetes.io/projected/20de7db3-2a1d-49b2-a756-3ef5b88fbfcc-kube-api-access-xqpvw\") pod \"machine-approver-54c688565-jqlh7\" (UID: \"20de7db3-2a1d-49b2-a756-3ef5b88fbfcc\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-jqlh7" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.570032 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-mpvq4" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.575960 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhtqd\" (UniqueName: \"kubernetes.io/projected/1af1595b-1a79-438d-99a0-dd34b32cfcda-kube-api-access-vhtqd\") pod \"dns-operator-799b87ffcd-rqzhf\" (UID: \"1af1595b-1a79-438d-99a0-dd34b32cfcda\") " pod="openshift-dns-operator/dns-operator-799b87ffcd-rqzhf" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.577493 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.583058 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-799b87ffcd-rqzhf" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.592091 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sbfx\" (UniqueName: \"kubernetes.io/projected/36cf2a72-16f1-4b2c-9d20-9d1ad0af2ce6-kube-api-access-4sbfx\") pod \"machine-api-operator-755bb95488-d29hz\" (UID: \"36cf2a72-16f1-4b2c-9d20-9d1ad0af2ce6\") " pod="openshift-machine-api/machine-api-operator-755bb95488-d29hz" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.610596 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrbv2\" (UniqueName: \"kubernetes.io/projected/a1d9df18-d5a1-447d-ad5a-fdef055a830a-kube-api-access-zrbv2\") pod \"route-controller-manager-776cdc94d6-x7z5d\" (UID: \"a1d9df18-d5a1-447d-ad5a-fdef055a830a\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-x7z5d" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.631934 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-operator\"/\"ingress-operator-dockercfg-74nwh\"" 
Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.631964 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8s92\" (UniqueName: \"kubernetes.io/projected/ea99c433-2166-47f6-8c55-0787f78ff608-kube-api-access-h8s92\") pod \"authentication-operator-7f5c659b84-b2bj4\" (UID: \"ea99c433-2166-47f6-8c55-0787f78ff608\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-b2bj4" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.648333 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-operator\"/\"kube-root-ca.crt\"" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.669780 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-operator\"/\"openshift-service-ca.crt\"" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.703676 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-operator\"/\"trusted-ca\"" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.710826 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-operator\"/\"metrics-tls\"" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.722270 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.729454 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operator-lifecycle-manager\"/\"openshift-service-ca.crt\"" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.731620 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-65b6cccf98-bbnrv" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.747403 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"olm-operator-serving-cert\"" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.755652 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-755bb95488-d29hz" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.769895 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"pprof-cert\"" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.773171 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-x7z5d" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.788312 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"olm-operator-serviceaccount-dockercfg-4gqzj\"" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.791647 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-54c688565-jqlh7" Mar 12 16:52:49 crc kubenswrapper[5184]: W0312 16:52:49.804919 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20de7db3_2a1d_49b2_a756_3ef5b88fbfcc.slice/crio-960405fafdeaf8e5bf45c0e1472b10037e178fb0341a8e853d19f57aa3f353d9 WatchSource:0}: Error finding container 960405fafdeaf8e5bf45c0e1472b10037e178fb0341a8e853d19f57aa3f353d9: Status 404 returned error can't find the container with id 960405fafdeaf8e5bf45c0e1472b10037e178fb0341a8e853d19f57aa3f353d9 Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.807610 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operator-lifecycle-manager\"/\"kube-root-ca.crt\"" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.826954 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"package-server-manager-serving-cert\"" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.848101 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"catalog-operator-serving-cert\"" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.863601 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-7f5c659b84-b2bj4" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.868754 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.894369 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-bgxvm\"" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.909635 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.928538 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.950990 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.970011 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-799b87ffcd-rqzhf"] Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.972251 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-config\"" Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.975460 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-mpvq4"] Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.983310 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-8596bd845d-ndv6q"] Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.984265 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-authentication/oauth-openshift-66458b6674-qmxgv"] Mar 12 16:52:49 crc kubenswrapper[5184]: I0312 16:52:49.987727 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-dockercfg-vfqp6\"" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.010694 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-kknhg\"" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.012175 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-776cdc94d6-x7z5d"] Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.027665 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.040891 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-65b6cccf98-bbnrv"] Mar 12 16:52:50 crc kubenswrapper[5184]: W0312 16:52:50.050211 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17eed63d_a9fc_414e_9c70_347b51893cfa.slice/crio-d987a68c425d83fc9e616e2b1a4161702b0b724114eb82206c902e349af33d3c WatchSource:0}: Error finding container d987a68c425d83fc9e616e2b1a4161702b0b724114eb82206c902e349af33d3c: Status 404 returned error can't find the container with id d987a68c425d83fc9e616e2b1a4161702b0b724114eb82206c902e349af33d3c Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.053256 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.067970 5184 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"marketplace-operator-dockercfg-2cfkp\"" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.068147 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-9ddfb9f55-j4gpx"] Mar 12 16:52:50 crc kubenswrapper[5184]: W0312 16:52:50.075298 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbeb08b86_8593_4511_8bce_ea5f1d44f795.slice/crio-45967288ab0d87f0a0c852d96518dc93553dff7abd27d9d17bce2fc2c1cdbbe6 WatchSource:0}: Error finding container 45967288ab0d87f0a0c852d96518dc93553dff7abd27d9d17bce2fc2c1cdbbe6: Status 404 returned error can't find the container with id 45967288ab0d87f0a0c852d96518dc93553dff7abd27d9d17bce2fc2c1cdbbe6 Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.093287 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"marketplace-trusted-ca\"" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.099240 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-7f5c659b84-b2bj4"] Mar 12 16:52:50 crc kubenswrapper[5184]: W0312 16:52:50.103632 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea99c433_2166_47f6_8c55_0787f78ff608.slice/crio-92a91abcae5603ad2814fa4f7d773243408c8304dee9624262dbe86a80eec80b WatchSource:0}: Error finding container 92a91abcae5603ad2814fa4f7d773243408c8304dee9624262dbe86a80eec80b: Status 404 returned error can't find the container with id 92a91abcae5603ad2814fa4f7d773243408c8304dee9624262dbe86a80eec80b Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.107629 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"marketplace-operator-metrics\"" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 
16:52:50.127500 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.147345 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.168958 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-server-dockercfg-dzw6b\"" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.188114 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-server-tls\"" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.207653 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"node-bootstrapper-token\"" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.228260 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-8dkm8\"" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.245460 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-755bb95488-d29hz"] Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.249694 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Mar 12 16:52:50 crc kubenswrapper[5184]: W0312 16:52:50.255324 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36cf2a72_16f1_4b2c_9d20_9d1ad0af2ce6.slice/crio-7404dc356f5a0e448fd22a331e04fd21fe94ac8615869bf73e124db099a0341f WatchSource:0}: Error finding container 7404dc356f5a0e448fd22a331e04fd21fe94ac8615869bf73e124db099a0341f: Status 404 returned error can't find 
the container with id 7404dc356f5a0e448fd22a331e04fd21fe94ac8615869bf73e124db099a0341f Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.268326 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.288235 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.308347 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.335246 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.348544 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.368820 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"packageserver-service-cert\"" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.382833 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-54c688565-jqlh7" event={"ID":"20de7db3-2a1d-49b2-a756-3ef5b88fbfcc","Type":"ContainerStarted","Data":"960405fafdeaf8e5bf45c0e1472b10037e178fb0341a8e853d19f57aa3f353d9"} Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.384320 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-755bb95488-d29hz" event={"ID":"36cf2a72-16f1-4b2c-9d20-9d1ad0af2ce6","Type":"ContainerStarted","Data":"7404dc356f5a0e448fd22a331e04fd21fe94ac8615869bf73e124db099a0341f"} Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.387711 5184 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-65b6cccf98-bbnrv" event={"ID":"17eed63d-a9fc-414e-9c70-347b51893cfa","Type":"ContainerStarted","Data":"d987a68c425d83fc9e616e2b1a4161702b0b724114eb82206c902e349af33d3c"} Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.387807 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"hostpath-provisioner\"/\"kube-root-ca.crt\"" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.390428 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7f5c659b84-b2bj4" event={"ID":"ea99c433-2166-47f6-8c55-0787f78ff608","Type":"ContainerStarted","Data":"92a91abcae5603ad2814fa4f7d773243408c8304dee9624262dbe86a80eec80b"} Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.391306 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-x7z5d" event={"ID":"a1d9df18-d5a1-447d-ad5a-fdef055a830a","Type":"ContainerStarted","Data":"9a2061b94300167946e6762a8f804b3b2116ab934a8f7d396ffc2831b382917a"} Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.395017 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" event={"ID":"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc","Type":"ContainerStarted","Data":"b51b95881e509972968ee6a2ac0ba7b59179d74e6db578c1f976730bcb85b110"} Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.396815 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-mpvq4" event={"ID":"027a90bb-52c1-43ed-a43d-6f9755019c9b","Type":"ContainerStarted","Data":"5e899fd5f477fe5ed021c63b521874f755a5ce2c482f383116a3977564b9f0af"} Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.398484 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" event={"ID":"beb08b86-8593-4511-8bce-ea5f1d44f795","Type":"ContainerStarted","Data":"45967288ab0d87f0a0c852d96518dc93553dff7abd27d9d17bce2fc2c1cdbbe6"} Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.407963 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"hostpath-provisioner\"/\"csi-hostpath-provisioner-sa-dockercfg-7dcws\"" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.411958 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-799b87ffcd-rqzhf" event={"ID":"1af1595b-1a79-438d-99a0-dd34b32cfcda","Type":"ContainerStarted","Data":"09263464f9061b31cb46636ed107a422ce3b469f6a56c7487a17408dec94a18d"} Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.412019 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-8596bd845d-ndv6q" event={"ID":"5db01dce-a574-4dbd-97a9-582f0f357bda","Type":"ContainerStarted","Data":"efe6fbe96109dbf68454417afda7670a1c3bd0ba6b523c66f767347141a699d1"} Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.427668 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"hostpath-provisioner\"/\"openshift-service-ca.crt\"" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.447452 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.467521 5184 request.go:752] "Waited before sending request" delay="1.933582891s" reason="client-side throttling, not priority and fairness" verb="GET" URL="https://api-int.crc.testing:6443/api/v1/namespaces/openshift-service-ca-operator/secrets?fieldSelector=metadata.name%3Dserving-cert&limit=500&resourceVersion=0" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.470116 5184 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.487714 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.513106 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.527971 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-bjqfd\"" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.548016 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.567678 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.590402 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.607838 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9pgs7\"" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.627625 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.648482 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-kpvmz\"" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.667517 5184 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.704778 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-scx52\" (UniqueName: \"kubernetes.io/projected/70e9334c-b259-45e5-88a3-6909ce233bda-kube-api-access-scx52\") pod \"cluster-image-registry-operator-86c45576b9-b6v2k\" (UID: \"70e9334c-b259-45e5-88a3-6909ce233bda\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-b6v2k" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.720031 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrrqm\" (UniqueName: \"kubernetes.io/projected/781710ac-8789-42cf-983e-f7de329e4e81-kube-api-access-lrrqm\") pod \"kube-storage-version-migrator-operator-565b79b866-p29gv\" (UID: \"781710ac-8789-42cf-983e-f7de329e4e81\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-p29gv" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.727912 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-sysctl-allowlist\"" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.760336 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/70e9334c-b259-45e5-88a3-6909ce233bda-bound-sa-token\") pod \"cluster-image-registry-operator-86c45576b9-b6v2k\" (UID: \"70e9334c-b259-45e5-88a3-6909ce233bda\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-b6v2k" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.781421 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p76p\" (UniqueName: \"kubernetes.io/projected/6f45ff33-e60b-4885-ac63-5ab182bf6320-kube-api-access-5p76p\") pod \"downloads-747b44746d-t6987\" (UID: \"6f45ff33-e60b-4885-ac63-5ab182bf6320\") " 
pod="openshift-console/downloads-747b44746d-t6987" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.789264 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-747b44746d-t6987" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.803049 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sb26\" (UniqueName: \"kubernetes.io/projected/bbfdedba-967f-4b86-b7bd-a81854132b50-kube-api-access-5sb26\") pod \"openshift-config-operator-5777786469-n9g8v\" (UID: \"bbfdedba-967f-4b86-b7bd-a81854132b50\") " pod="openshift-config-operator/openshift-config-operator-5777786469-n9g8v" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.810283 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-b6v2k" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.828564 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-p29gv" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.882086 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9e13a3e9-eeee-4c55-a87a-11959e9f7497-tmp\") pod \"openshift-kube-scheduler-operator-54f497555d-fnw98\" (UID: \"9e13a3e9-eeee-4c55-a87a-11959e9f7497\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-fnw98" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.882351 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/82e2099d-a6d8-488e-8144-b2ed728725e2-trusted-ca\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.882368 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt7xx\" (UniqueName: \"kubernetes.io/projected/82e2099d-a6d8-488e-8144-b2ed728725e2-kube-api-access-tt7xx\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.882409 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1483fd4-8f3f-4326-874c-19e9c796d809-service-ca-bundle\") pod \"router-default-68cf44c8b8-7pgjs\" (UID: \"e1483fd4-8f3f-4326-874c-19e9c796d809\") " pod="openshift-ingress/router-default-68cf44c8b8-7pgjs" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.882424 5184 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9e13a3e9-eeee-4c55-a87a-11959e9f7497-kube-api-access\") pod \"openshift-kube-scheduler-operator-54f497555d-fnw98\" (UID: \"9e13a3e9-eeee-4c55-a87a-11959e9f7497\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-fnw98" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.882465 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e8c42229-7663-48d0-a009-893c96840034-mcc-auth-proxy-config\") pod \"machine-config-controller-f9cdd68f7-w2ldh\" (UID: \"e8c42229-7663-48d0-a009-893c96840034\") " pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-w2ldh" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.882493 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxlwb\" (UniqueName: \"kubernetes.io/projected/e8c42229-7663-48d0-a009-893c96840034-kube-api-access-qxlwb\") pod \"machine-config-controller-f9cdd68f7-w2ldh\" (UID: \"e8c42229-7663-48d0-a009-893c96840034\") " pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-w2ldh" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.882508 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8d1cd7e2-2062-46cc-8550-b41afd9716f4-trusted-ca\") pod \"console-operator-67c89758df-ssqr6\" (UID: \"8d1cd7e2-2062-46cc-8550-b41afd9716f4\") " pod="openshift-console-operator/console-operator-67c89758df-ssqr6" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.882523 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/e8c42229-7663-48d0-a009-893c96840034-proxy-tls\") pod \"machine-config-controller-f9cdd68f7-w2ldh\" (UID: \"e8c42229-7663-48d0-a009-893c96840034\") " pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-w2ldh" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.882541 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d1cd7e2-2062-46cc-8550-b41afd9716f4-config\") pod \"console-operator-67c89758df-ssqr6\" (UID: \"8d1cd7e2-2062-46cc-8550-b41afd9716f4\") " pod="openshift-console-operator/console-operator-67c89758df-ssqr6" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.882556 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e1483fd4-8f3f-4326-874c-19e9c796d809-stats-auth\") pod \"router-default-68cf44c8b8-7pgjs\" (UID: \"e1483fd4-8f3f-4326-874c-19e9c796d809\") " pod="openshift-ingress/router-default-68cf44c8b8-7pgjs" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.882574 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cb76\" (UniqueName: \"kubernetes.io/projected/e1483fd4-8f3f-4326-874c-19e9c796d809-kube-api-access-7cb76\") pod \"router-default-68cf44c8b8-7pgjs\" (UID: \"e1483fd4-8f3f-4326-874c-19e9c796d809\") " pod="openshift-ingress/router-default-68cf44c8b8-7pgjs" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.882591 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/82e2099d-a6d8-488e-8144-b2ed728725e2-installation-pull-secrets\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:52:50 crc 
kubenswrapper[5184]: I0312 16:52:50.882608 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e1483fd4-8f3f-4326-874c-19e9c796d809-default-certificate\") pod \"router-default-68cf44c8b8-7pgjs\" (UID: \"e1483fd4-8f3f-4326-874c-19e9c796d809\") " pod="openshift-ingress/router-default-68cf44c8b8-7pgjs" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.882635 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/82e2099d-a6d8-488e-8144-b2ed728725e2-ca-trust-extracted\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.882656 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/82e2099d-a6d8-488e-8144-b2ed728725e2-registry-tls\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.882672 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/82e2099d-a6d8-488e-8144-b2ed728725e2-bound-sa-token\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.882689 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5sfn\" (UniqueName: \"kubernetes.io/projected/640f2f33-9bd1-4378-97fb-61f78501c171-kube-api-access-j5sfn\") pod 
\"cluster-samples-operator-6b564684c8-5dxkx\" (UID: \"640f2f33-9bd1-4378-97fb-61f78501c171\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-5dxkx" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.882717 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d1cd7e2-2062-46cc-8550-b41afd9716f4-serving-cert\") pod \"console-operator-67c89758df-ssqr6\" (UID: \"8d1cd7e2-2062-46cc-8550-b41afd9716f4\") " pod="openshift-console-operator/console-operator-67c89758df-ssqr6" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.882732 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1483fd4-8f3f-4326-874c-19e9c796d809-metrics-certs\") pod \"router-default-68cf44c8b8-7pgjs\" (UID: \"e1483fd4-8f3f-4326-874c-19e9c796d809\") " pod="openshift-ingress/router-default-68cf44c8b8-7pgjs" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.882750 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e13a3e9-eeee-4c55-a87a-11959e9f7497-serving-cert\") pod \"openshift-kube-scheduler-operator-54f497555d-fnw98\" (UID: \"9e13a3e9-eeee-4c55-a87a-11959e9f7497\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-fnw98" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.882786 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/82e2099d-a6d8-488e-8144-b2ed728725e2-registry-certificates\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.882800 
5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e13a3e9-eeee-4c55-a87a-11959e9f7497-config\") pod \"openshift-kube-scheduler-operator-54f497555d-fnw98\" (UID: \"9e13a3e9-eeee-4c55-a87a-11959e9f7497\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-fnw98" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.882837 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.882854 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwmf8\" (UniqueName: \"kubernetes.io/projected/8d1cd7e2-2062-46cc-8550-b41afd9716f4-kube-api-access-kwmf8\") pod \"console-operator-67c89758df-ssqr6\" (UID: \"8d1cd7e2-2062-46cc-8550-b41afd9716f4\") " pod="openshift-console-operator/console-operator-67c89758df-ssqr6" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.882868 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/640f2f33-9bd1-4378-97fb-61f78501c171-samples-operator-tls\") pod \"cluster-samples-operator-6b564684c8-5dxkx\" (UID: \"640f2f33-9bd1-4378-97fb-61f78501c171\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-5dxkx" Mar 12 16:52:50 crc kubenswrapper[5184]: E0312 16:52:50.883205 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: 
nodeName:}" failed. No retries permitted until 2026-03-12 16:52:51.383193329 +0000 UTC m=+113.924504668 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.892913 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-5777786469-n9g8v" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.983776 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.984071 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qxlwb\" (UniqueName: \"kubernetes.io/projected/e8c42229-7663-48d0-a009-893c96840034-kube-api-access-qxlwb\") pod \"machine-config-controller-f9cdd68f7-w2ldh\" (UID: \"e8c42229-7663-48d0-a009-893c96840034\") " pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-w2ldh" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.984117 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d980efcf-1159-448a-ac6c-4ee5ddff2b66-package-server-manager-serving-cert\") pod 
\"package-server-manager-77f986bd66-f99dz\" (UID: \"d980efcf-1159-448a-ac6c-4ee5ddff2b66\") " pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-f99dz" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.984149 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0414ebc4-50d1-4b9e-966a-693c0957a5a5-webhook-certs\") pod \"multus-admission-controller-69db94689b-l8cgq\" (UID: \"0414ebc4-50d1-4b9e-966a-693c0957a5a5\") " pod="openshift-multus/multus-admission-controller-69db94689b-l8cgq" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.984173 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/935f19f6-af87-48ad-bd81-641676250fdd-config\") pod \"etcd-operator-69b85846b6-5dwhg\" (UID: \"935f19f6-af87-48ad-bd81-641676250fdd\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-5dwhg" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.984195 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcac2a22-f863-4ebe-8e3c-b88664a6c14d-config\") pod \"service-ca-operator-5b9c976747-dzzxj\" (UID: \"dcac2a22-f863-4ebe-8e3c-b88664a6c14d\") " pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-dzzxj" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.984227 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/38127f6e-375c-4fe3-9070-1a9da91aa12f-srv-cert\") pod \"catalog-operator-75ff9f647d-jskx5\" (UID: \"38127f6e-375c-4fe3-9070-1a9da91aa12f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-jskx5" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.984250 5184 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jczbt\" (UniqueName: \"kubernetes.io/projected/dcac2a22-f863-4ebe-8e3c-b88664a6c14d-kube-api-access-jczbt\") pod \"service-ca-operator-5b9c976747-dzzxj\" (UID: \"dcac2a22-f863-4ebe-8e3c-b88664a6c14d\") " pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-dzzxj" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.984269 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c42a2703-d32e-41a7-accf-68b6e5d8c000-service-ca\") pod \"console-64d44f6ddf-qxthf\" (UID: \"c42a2703-d32e-41a7-accf-68b6e5d8c000\") " pod="openshift-console/console-64d44f6ddf-qxthf" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.984288 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfpzx\" (UniqueName: \"kubernetes.io/projected/2da0aa85-6bbd-4fc9-b76a-00f1e51f8327-kube-api-access-gfpzx\") pod \"migrator-866fcbc849-csf6b\" (UID: \"2da0aa85-6bbd-4fc9-b76a-00f1e51f8327\") " pod="openshift-kube-storage-version-migrator/migrator-866fcbc849-csf6b" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.984308 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll65m\" (UniqueName: \"kubernetes.io/projected/caec81e5-958d-4139-aba7-2a5df11c25b1-kube-api-access-ll65m\") pod \"machine-config-server-kv7dd\" (UID: \"caec81e5-958d-4139-aba7-2a5df11c25b1\") " pod="openshift-machine-config-operator/machine-config-server-kv7dd" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.984327 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c94tq\" (UniqueName: \"kubernetes.io/projected/d980efcf-1159-448a-ac6c-4ee5ddff2b66-kube-api-access-c94tq\") pod \"package-server-manager-77f986bd66-f99dz\" (UID: 
\"d980efcf-1159-448a-ac6c-4ee5ddff2b66\") " pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-f99dz"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.984349 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5ad036a8-381e-4761-a20f-8d8b9a3e9408-tmp\") pod \"marketplace-operator-547dbd544d-dpld6\" (UID: \"5ad036a8-381e-4761-a20f-8d8b9a3e9408\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-dpld6"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.984387 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e1483fd4-8f3f-4326-874c-19e9c796d809-stats-auth\") pod \"router-default-68cf44c8b8-7pgjs\" (UID: \"e1483fd4-8f3f-4326-874c-19e9c796d809\") " pod="openshift-ingress/router-default-68cf44c8b8-7pgjs"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.984409 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/38127f6e-375c-4fe3-9070-1a9da91aa12f-profile-collector-cert\") pod \"catalog-operator-75ff9f647d-jskx5\" (UID: \"38127f6e-375c-4fe3-9070-1a9da91aa12f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-jskx5"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.984437 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e947b0fa-3e07-4965-b693-8857cd4b98fd-tmp-dir\") pod \"dns-default-f2fdq\" (UID: \"e947b0fa-3e07-4965-b693-8857cd4b98fd\") " pod="openshift-dns/dns-default-f2fdq"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.984459 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/82e2099d-a6d8-488e-8144-b2ed728725e2-installation-pull-secrets\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.984478 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e1483fd4-8f3f-4326-874c-19e9c796d809-default-certificate\") pod \"router-default-68cf44c8b8-7pgjs\" (UID: \"e1483fd4-8f3f-4326-874c-19e9c796d809\") " pod="openshift-ingress/router-default-68cf44c8b8-7pgjs"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.984497 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c42a2703-d32e-41a7-accf-68b6e5d8c000-console-config\") pod \"console-64d44f6ddf-qxthf\" (UID: \"c42a2703-d32e-41a7-accf-68b6e5d8c000\") " pod="openshift-console/console-64d44f6ddf-qxthf"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.984514 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c42a2703-d32e-41a7-accf-68b6e5d8c000-oauth-serving-cert\") pod \"console-64d44f6ddf-qxthf\" (UID: \"c42a2703-d32e-41a7-accf-68b6e5d8c000\") " pod="openshift-console/console-64d44f6ddf-qxthf"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.984537 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqbg2\" (UniqueName: \"kubernetes.io/projected/935f19f6-af87-48ad-bd81-641676250fdd-kube-api-access-mqbg2\") pod \"etcd-operator-69b85846b6-5dwhg\" (UID: \"935f19f6-af87-48ad-bd81-641676250fdd\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-5dwhg"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.984559 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b9d7d9b-02cb-4871-aaff-673af3457aa4-config\") pod \"kube-apiserver-operator-575994946d-d8dpv\" (UID: \"8b9d7d9b-02cb-4871-aaff-673af3457aa4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-d8dpv"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.984588 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0266bd06-b813-4f14-b94a-24d31805b311-config\") pod \"kube-controller-manager-operator-69d5f845f8-cxwfx\" (UID: \"0266bd06-b813-4f14-b94a-24d31805b311\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-cxwfx"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.984613 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4acc340c-81c7-4011-b17f-9f83eadd540e-config\") pod \"openshift-controller-manager-operator-686468bdd5-px2bg\" (UID: \"4acc340c-81c7-4011-b17f-9f83eadd540e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-px2bg"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.984639 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsfb2\" (UniqueName: \"kubernetes.io/projected/4840f833-3dce-444b-8cad-3a7374af30e7-kube-api-access-nsfb2\") pod \"cni-sysctl-allowlist-ds-fm2vq\" (UID: \"4840f833-3dce-444b-8cad-3a7374af30e7\") " pod="openshift-multus/cni-sysctl-allowlist-ds-fm2vq"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.984668 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/82e2099d-a6d8-488e-8144-b2ed728725e2-ca-trust-extracted\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.984691 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/935f19f6-af87-48ad-bd81-641676250fdd-tmp-dir\") pod \"etcd-operator-69b85846b6-5dwhg\" (UID: \"935f19f6-af87-48ad-bd81-641676250fdd\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-5dwhg"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.984715 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1fa45036-e34b-4f40-9e02-838ed397f42e-srv-cert\") pod \"olm-operator-5cdf44d969-mfddg\" (UID: \"1fa45036-e34b-4f40-9e02-838ed397f42e\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-mfddg"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.984736 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/af87b4e5-15c0-48dc-9bc3-df39fcc24a53-plugins-dir\") pod \"csi-hostpathplugin-5zvch\" (UID: \"af87b4e5-15c0-48dc-9bc3-df39fcc24a53\") " pod="hostpath-provisioner/csi-hostpathplugin-5zvch"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.984756 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/935f19f6-af87-48ad-bd81-641676250fdd-etcd-client\") pod \"etcd-operator-69b85846b6-5dwhg\" (UID: \"935f19f6-af87-48ad-bd81-641676250fdd\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-5dwhg"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.984798 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1483fd4-8f3f-4326-874c-19e9c796d809-metrics-certs\") pod \"router-default-68cf44c8b8-7pgjs\" (UID: \"e1483fd4-8f3f-4326-874c-19e9c796d809\") " pod="openshift-ingress/router-default-68cf44c8b8-7pgjs"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.984821 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c42a2703-d32e-41a7-accf-68b6e5d8c000-console-oauth-config\") pod \"console-64d44f6ddf-qxthf\" (UID: \"c42a2703-d32e-41a7-accf-68b6e5d8c000\") " pod="openshift-console/console-64d44f6ddf-qxthf"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.984844 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b297812d-1aca-496c-a83e-72f4d8b54415-apiservice-cert\") pod \"packageserver-7d4fc7d867-6bgzr\" (UID: \"b297812d-1aca-496c-a83e-72f4d8b54415\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-6bgzr"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.984899 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d1cd7e2-2062-46cc-8550-b41afd9716f4-serving-cert\") pod \"console-operator-67c89758df-ssqr6\" (UID: \"8d1cd7e2-2062-46cc-8550-b41afd9716f4\") " pod="openshift-console-operator/console-operator-67c89758df-ssqr6"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.984924 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4840f833-3dce-444b-8cad-3a7374af30e7-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-fm2vq\" (UID: \"4840f833-3dce-444b-8cad-3a7374af30e7\") " pod="openshift-multus/cni-sysctl-allowlist-ds-fm2vq"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.984948 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/1fa45036-e34b-4f40-9e02-838ed397f42e-tmpfs\") pod \"olm-operator-5cdf44d969-mfddg\" (UID: \"1fa45036-e34b-4f40-9e02-838ed397f42e\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-mfddg"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.984969 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c42a2703-d32e-41a7-accf-68b6e5d8c000-console-serving-cert\") pod \"console-64d44f6ddf-qxthf\" (UID: \"c42a2703-d32e-41a7-accf-68b6e5d8c000\") " pod="openshift-console/console-64d44f6ddf-qxthf"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.984990 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/38127f6e-375c-4fe3-9070-1a9da91aa12f-tmpfs\") pod \"catalog-operator-75ff9f647d-jskx5\" (UID: \"38127f6e-375c-4fe3-9070-1a9da91aa12f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-jskx5"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.985014 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/b6b85787-d6d2-48df-9830-ca4532adee38-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-75ffdb6fcd-m7sz7\" (UID: \"b6b85787-d6d2-48df-9830-ca4532adee38\") " pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-m7sz7"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.985051 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xwwk\" (UniqueName: \"kubernetes.io/projected/c42a2703-d32e-41a7-accf-68b6e5d8c000-kube-api-access-8xwwk\") pod \"console-64d44f6ddf-qxthf\" (UID: \"c42a2703-d32e-41a7-accf-68b6e5d8c000\") " pod="openshift-console/console-64d44f6ddf-qxthf"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.985071 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6c1f5c98-0cd5-40ba-8de2-cc45b81e196b-trusted-ca\") pod \"ingress-operator-6b9cb4dbcf-5vk9f\" (UID: \"6c1f5c98-0cd5-40ba-8de2-cc45b81e196b\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-5vk9f"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.985093 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkkfq\" (UniqueName: \"kubernetes.io/projected/6c1f5c98-0cd5-40ba-8de2-cc45b81e196b-kube-api-access-dkkfq\") pod \"ingress-operator-6b9cb4dbcf-5vk9f\" (UID: \"6c1f5c98-0cd5-40ba-8de2-cc45b81e196b\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-5vk9f"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.985112 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqfjp\" (UniqueName: \"kubernetes.io/projected/73142984-30ee-40e5-8fdd-d024df118964-kube-api-access-vqfjp\") pod \"machine-config-operator-67c9d58cbb-wll7m\" (UID: \"73142984-30ee-40e5-8fdd-d024df118964\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-wll7m"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.985145 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e947b0fa-3e07-4965-b693-8857cd4b98fd-metrics-tls\") pod \"dns-default-f2fdq\" (UID: \"e947b0fa-3e07-4965-b693-8857cd4b98fd\") " pod="openshift-dns/dns-default-f2fdq"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.985215 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/22a20570-e6d3-4f7b-b8fd-bc3cf5716448-signing-key\") pod \"service-ca-74545575db-6qpvf\" (UID: \"22a20570-e6d3-4f7b-b8fd-bc3cf5716448\") " pod="openshift-service-ca/service-ca-74545575db-6qpvf"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.985253 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9rvr\" (UniqueName: \"kubernetes.io/projected/b297812d-1aca-496c-a83e-72f4d8b54415-kube-api-access-b9rvr\") pod \"packageserver-7d4fc7d867-6bgzr\" (UID: \"b297812d-1aca-496c-a83e-72f4d8b54415\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-6bgzr"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.985289 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/82e2099d-a6d8-488e-8144-b2ed728725e2-registry-certificates\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.985308 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/22a20570-e6d3-4f7b-b8fd-bc3cf5716448-signing-cabundle\") pod \"service-ca-74545575db-6qpvf\" (UID: \"22a20570-e6d3-4f7b-b8fd-bc3cf5716448\") " pod="openshift-service-ca/service-ca-74545575db-6qpvf"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.985329 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7shxv\" (UniqueName: \"kubernetes.io/projected/4acc340c-81c7-4011-b17f-9f83eadd540e-kube-api-access-7shxv\") pod \"openshift-controller-manager-operator-686468bdd5-px2bg\" (UID: \"4acc340c-81c7-4011-b17f-9f83eadd540e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-px2bg"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.985355 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/640f2f33-9bd1-4378-97fb-61f78501c171-samples-operator-tls\") pod \"cluster-samples-operator-6b564684c8-5dxkx\" (UID: \"640f2f33-9bd1-4378-97fb-61f78501c171\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-5dxkx"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.985419 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/935f19f6-af87-48ad-bd81-641676250fdd-etcd-service-ca\") pod \"etcd-operator-69b85846b6-5dwhg\" (UID: \"935f19f6-af87-48ad-bd81-641676250fdd\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-5dwhg"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.985564 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrzv6\" (UniqueName: \"kubernetes.io/projected/af87b4e5-15c0-48dc-9bc3-df39fcc24a53-kube-api-access-wrzv6\") pod \"csi-hostpathplugin-5zvch\" (UID: \"af87b4e5-15c0-48dc-9bc3-df39fcc24a53\") " pod="hostpath-provisioner/csi-hostpathplugin-5zvch"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.985581 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcac2a22-f863-4ebe-8e3c-b88664a6c14d-serving-cert\") pod \"service-ca-operator-5b9c976747-dzzxj\" (UID: \"dcac2a22-f863-4ebe-8e3c-b88664a6c14d\") " pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-dzzxj"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.985602 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t644v\" (UniqueName: \"kubernetes.io/projected/22a20570-e6d3-4f7b-b8fd-bc3cf5716448-kube-api-access-t644v\") pod \"service-ca-74545575db-6qpvf\" (UID: \"22a20570-e6d3-4f7b-b8fd-bc3cf5716448\") " pod="openshift-service-ca/service-ca-74545575db-6qpvf"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.985653 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/82e2099d-a6d8-488e-8144-b2ed728725e2-trusted-ca\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.985929 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/82e2099d-a6d8-488e-8144-b2ed728725e2-ca-trust-extracted\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.988715 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/82e2099d-a6d8-488e-8144-b2ed728725e2-registry-certificates\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.988947 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/82e2099d-a6d8-488e-8144-b2ed728725e2-trusted-ca\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.989011 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c42a2703-d32e-41a7-accf-68b6e5d8c000-trusted-ca-bundle\") pod \"console-64d44f6ddf-qxthf\" (UID: \"c42a2703-d32e-41a7-accf-68b6e5d8c000\") " pod="openshift-console/console-64d44f6ddf-qxthf"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.989047 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1483fd4-8f3f-4326-874c-19e9c796d809-service-ca-bundle\") pod \"router-default-68cf44c8b8-7pgjs\" (UID: \"e1483fd4-8f3f-4326-874c-19e9c796d809\") " pod="openshift-ingress/router-default-68cf44c8b8-7pgjs"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.989115 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/73142984-30ee-40e5-8fdd-d024df118964-auth-proxy-config\") pod \"machine-config-operator-67c9d58cbb-wll7m\" (UID: \"73142984-30ee-40e5-8fdd-d024df118964\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-wll7m"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.989754 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1483fd4-8f3f-4326-874c-19e9c796d809-service-ca-bundle\") pod \"router-default-68cf44c8b8-7pgjs\" (UID: \"e1483fd4-8f3f-4326-874c-19e9c796d809\") " pod="openshift-ingress/router-default-68cf44c8b8-7pgjs"
Mar 12 16:52:50 crc kubenswrapper[5184]: E0312 16:52:50.989797 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:51.489782256 +0000 UTC m=+114.031093595 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.989820 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e947b0fa-3e07-4965-b693-8857cd4b98fd-config-volume\") pod \"dns-default-f2fdq\" (UID: \"e947b0fa-3e07-4965-b693-8857cd4b98fd\") " pod="openshift-dns/dns-default-f2fdq"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.989847 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6c1f5c98-0cd5-40ba-8de2-cc45b81e196b-metrics-tls\") pod \"ingress-operator-6b9cb4dbcf-5vk9f\" (UID: \"6c1f5c98-0cd5-40ba-8de2-cc45b81e196b\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-5vk9f"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.989868 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5ad036a8-381e-4761-a20f-8d8b9a3e9408-marketplace-operator-metrics\") pod \"marketplace-operator-547dbd544d-dpld6\" (UID: \"5ad036a8-381e-4761-a20f-8d8b9a3e9408\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-dpld6"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.989896 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8d1cd7e2-2062-46cc-8550-b41afd9716f4-trusted-ca\") pod \"console-operator-67c89758df-ssqr6\" (UID: \"8d1cd7e2-2062-46cc-8550-b41afd9716f4\") " pod="openshift-console-operator/console-operator-67c89758df-ssqr6"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.989915 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8ea329fb-a095-4645-b64f-a5769efa6364-secret-volume\") pod \"collect-profiles-29555565-ms5vz\" (UID: \"8ea329fb-a095-4645-b64f-a5769efa6364\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555565-ms5vz"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.990649 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8d1cd7e2-2062-46cc-8550-b41afd9716f4-trusted-ca\") pod \"console-operator-67c89758df-ssqr6\" (UID: \"8d1cd7e2-2062-46cc-8550-b41afd9716f4\") " pod="openshift-console-operator/console-operator-67c89758df-ssqr6"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.990689 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e8c42229-7663-48d0-a009-893c96840034-proxy-tls\") pod \"machine-config-controller-f9cdd68f7-w2ldh\" (UID: \"e8c42229-7663-48d0-a009-893c96840034\") " pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-w2ldh"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.990731 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d1cd7e2-2062-46cc-8550-b41afd9716f4-serving-cert\") pod \"console-operator-67c89758df-ssqr6\" (UID: \"8d1cd7e2-2062-46cc-8550-b41afd9716f4\") " pod="openshift-console-operator/console-operator-67c89758df-ssqr6"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.990782 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/af87b4e5-15c0-48dc-9bc3-df39fcc24a53-registration-dir\") pod \"csi-hostpathplugin-5zvch\" (UID: \"af87b4e5-15c0-48dc-9bc3-df39fcc24a53\") " pod="hostpath-provisioner/csi-hostpathplugin-5zvch"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.990824 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d1cd7e2-2062-46cc-8550-b41afd9716f4-config\") pod \"console-operator-67c89758df-ssqr6\" (UID: \"8d1cd7e2-2062-46cc-8550-b41afd9716f4\") " pod="openshift-console-operator/console-operator-67c89758df-ssqr6"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.990857 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0266bd06-b813-4f14-b94a-24d31805b311-tmp-dir\") pod \"kube-controller-manager-operator-69d5f845f8-cxwfx\" (UID: \"0266bd06-b813-4f14-b94a-24d31805b311\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-cxwfx"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.990877 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjph9\" (UniqueName: \"kubernetes.io/projected/e947b0fa-3e07-4965-b693-8857cd4b98fd-kube-api-access-xjph9\") pod \"dns-default-f2fdq\" (UID: \"e947b0fa-3e07-4965-b693-8857cd4b98fd\") " pod="openshift-dns/dns-default-f2fdq"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.990903 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b297812d-1aca-496c-a83e-72f4d8b54415-webhook-cert\") pod \"packageserver-7d4fc7d867-6bgzr\" (UID: \"b297812d-1aca-496c-a83e-72f4d8b54415\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-6bgzr"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.990922 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/73142984-30ee-40e5-8fdd-d024df118964-proxy-tls\") pod \"machine-config-operator-67c9d58cbb-wll7m\" (UID: \"73142984-30ee-40e5-8fdd-d024df118964\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-wll7m"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.991305 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7cb76\" (UniqueName: \"kubernetes.io/projected/e1483fd4-8f3f-4326-874c-19e9c796d809-kube-api-access-7cb76\") pod \"router-default-68cf44c8b8-7pgjs\" (UID: \"e1483fd4-8f3f-4326-874c-19e9c796d809\") " pod="openshift-ingress/router-default-68cf44c8b8-7pgjs"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.991351 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e1483fd4-8f3f-4326-874c-19e9c796d809-stats-auth\") pod \"router-default-68cf44c8b8-7pgjs\" (UID: \"e1483fd4-8f3f-4326-874c-19e9c796d809\") " pod="openshift-ingress/router-default-68cf44c8b8-7pgjs"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.991462 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/935f19f6-af87-48ad-bd81-641676250fdd-etcd-ca\") pod \"etcd-operator-69b85846b6-5dwhg\" (UID: \"935f19f6-af87-48ad-bd81-641676250fdd\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-5dwhg"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.991483 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d1cd7e2-2062-46cc-8550-b41afd9716f4-config\") pod \"console-operator-67c89758df-ssqr6\" (UID: \"8d1cd7e2-2062-46cc-8550-b41afd9716f4\") " pod="openshift-console-operator/console-operator-67c89758df-ssqr6"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.991531 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b297812d-1aca-496c-a83e-72f4d8b54415-tmpfs\") pod \"packageserver-7d4fc7d867-6bgzr\" (UID: \"b297812d-1aca-496c-a83e-72f4d8b54415\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-6bgzr"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.991557 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5ad036a8-381e-4761-a20f-8d8b9a3e9408-marketplace-trusted-ca\") pod \"marketplace-operator-547dbd544d-dpld6\" (UID: \"5ad036a8-381e-4761-a20f-8d8b9a3e9408\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-dpld6"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.991574 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ea329fb-a095-4645-b64f-a5769efa6364-config-volume\") pod \"collect-profiles-29555565-ms5vz\" (UID: \"8ea329fb-a095-4645-b64f-a5769efa6364\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555565-ms5vz"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.991597 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4840f833-3dce-444b-8cad-3a7374af30e7-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-fm2vq\" (UID: \"4840f833-3dce-444b-8cad-3a7374af30e7\") " pod="openshift-multus/cni-sysctl-allowlist-ds-fm2vq"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.991644 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/caec81e5-958d-4139-aba7-2a5df11c25b1-node-bootstrap-token\") pod \"machine-config-server-kv7dd\" (UID: \"caec81e5-958d-4139-aba7-2a5df11c25b1\") " pod="openshift-machine-config-operator/machine-config-server-kv7dd"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.991702 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/af87b4e5-15c0-48dc-9bc3-df39fcc24a53-mountpoint-dir\") pod \"csi-hostpathplugin-5zvch\" (UID: \"af87b4e5-15c0-48dc-9bc3-df39fcc24a53\") " pod="hostpath-provisioner/csi-hostpathplugin-5zvch"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.991737 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/82e2099d-a6d8-488e-8144-b2ed728725e2-registry-tls\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.991764 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/82e2099d-a6d8-488e-8144-b2ed728725e2-bound-sa-token\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.991780 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j5sfn\" (UniqueName: \"kubernetes.io/projected/640f2f33-9bd1-4378-97fb-61f78501c171-kube-api-access-j5sfn\") pod \"cluster-samples-operator-6b564684c8-5dxkx\" (UID: \"640f2f33-9bd1-4378-97fb-61f78501c171\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-5dxkx"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.991808 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0266bd06-b813-4f14-b94a-24d31805b311-serving-cert\") pod \"kube-controller-manager-operator-69d5f845f8-cxwfx\" (UID: \"0266bd06-b813-4f14-b94a-24d31805b311\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-cxwfx"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.991824 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f06f8d9b-0e08-4f09-af38-cd987ab002f9-cert\") pod \"ingress-canary-tr5c8\" (UID: \"f06f8d9b-0e08-4f09-af38-cd987ab002f9\") " pod="openshift-ingress-canary/ingress-canary-tr5c8"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.992170 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0266bd06-b813-4f14-b94a-24d31805b311-kube-api-access\") pod \"kube-controller-manager-operator-69d5f845f8-cxwfx\" (UID: \"0266bd06-b813-4f14-b94a-24d31805b311\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-cxwfx"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.992190 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfz9t\" (UniqueName: \"kubernetes.io/projected/5ad036a8-381e-4761-a20f-8d8b9a3e9408-kube-api-access-kfz9t\") pod \"marketplace-operator-547dbd544d-dpld6\" (UID: \"5ad036a8-381e-4761-a20f-8d8b9a3e9408\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-dpld6"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.992230 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e13a3e9-eeee-4c55-a87a-11959e9f7497-serving-cert\") pod \"openshift-kube-scheduler-operator-54f497555d-fnw98\" (UID: \"9e13a3e9-eeee-4c55-a87a-11959e9f7497\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-fnw98"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.992387 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqpt6\" (UniqueName: \"kubernetes.io/projected/f06f8d9b-0e08-4f09-af38-cd987ab002f9-kube-api-access-zqpt6\") pod \"ingress-canary-tr5c8\" (UID: \"f06f8d9b-0e08-4f09-af38-cd987ab002f9\") " pod="openshift-ingress-canary/ingress-canary-tr5c8"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.992737 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b9d7d9b-02cb-4871-aaff-673af3457aa4-serving-cert\") pod \"kube-apiserver-operator-575994946d-d8dpv\" (UID: \"8b9d7d9b-02cb-4871-aaff-673af3457aa4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-d8dpv"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.993045 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e13a3e9-eeee-4c55-a87a-11959e9f7497-config\") pod \"openshift-kube-scheduler-operator-54f497555d-fnw98\" (UID: \"9e13a3e9-eeee-4c55-a87a-11959e9f7497\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-fnw98"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.993114 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxb7j\" (UniqueName: \"kubernetes.io/projected/1fa45036-e34b-4f40-9e02-838ed397f42e-kube-api-access-jxb7j\") pod \"olm-operator-5cdf44d969-mfddg\" (UID: \"1fa45036-e34b-4f40-9e02-838ed397f42e\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-mfddg"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.993142 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/935f19f6-af87-48ad-bd81-641676250fdd-serving-cert\") pod \"etcd-operator-69b85846b6-5dwhg\" (UID: \"935f19f6-af87-48ad-bd81-641676250fdd\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-5dwhg"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.993255 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/640f2f33-9bd1-4378-97fb-61f78501c171-samples-operator-tls\") pod \"cluster-samples-operator-6b564684c8-5dxkx\" (UID: \"640f2f33-9bd1-4378-97fb-61f78501c171\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-5dxkx"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.993458 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zglg7\" (UniqueName: \"kubernetes.io/projected/8ea329fb-a095-4645-b64f-a5769efa6364-kube-api-access-zglg7\") pod \"collect-profiles-29555565-ms5vz\" (UID: \"8ea329fb-a095-4645-b64f-a5769efa6364\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555565-ms5vz"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.993583 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kwmf8\" (UniqueName: \"kubernetes.io/projected/8d1cd7e2-2062-46cc-8550-b41afd9716f4-kube-api-access-kwmf8\") pod \"console-operator-67c89758df-ssqr6\" (UID: \"8d1cd7e2-2062-46cc-8550-b41afd9716f4\") " pod="openshift-console-operator/console-operator-67c89758df-ssqr6"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.993705 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7bnc\" (UniqueName: \"kubernetes.io/projected/0414ebc4-50d1-4b9e-966a-693c0957a5a5-kube-api-access-f7bnc\") pod \"multus-admission-controller-69db94689b-l8cgq\" (UID: \"0414ebc4-50d1-4b9e-966a-693c0957a5a5\") " pod="openshift-multus/multus-admission-controller-69db94689b-l8cgq"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.993986 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4acc340c-81c7-4011-b17f-9f83eadd540e-tmp\") pod \"openshift-controller-manager-operator-686468bdd5-px2bg\" (UID: \"4acc340c-81c7-4011-b17f-9f83eadd540e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-px2bg"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.994085 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8b9d7d9b-02cb-4871-aaff-673af3457aa4-tmp-dir\") pod \"kube-apiserver-operator-575994946d-d8dpv\" (UID: \"8b9d7d9b-02cb-4871-aaff-673af3457aa4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-d8dpv"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.994137 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts5rb\" (UniqueName: \"kubernetes.io/projected/38127f6e-375c-4fe3-9070-1a9da91aa12f-kube-api-access-ts5rb\") pod \"catalog-operator-75ff9f647d-jskx5\" (UID: \"38127f6e-375c-4fe3-9070-1a9da91aa12f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-jskx5"
Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.994173 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e13a3e9-eeee-4c55-a87a-11959e9f7497-config\") pod 
\"openshift-kube-scheduler-operator-54f497555d-fnw98\" (UID: \"9e13a3e9-eeee-4c55-a87a-11959e9f7497\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-fnw98" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.995915 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/82e2099d-a6d8-488e-8144-b2ed728725e2-installation-pull-secrets\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.995983 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/73142984-30ee-40e5-8fdd-d024df118964-images\") pod \"machine-config-operator-67c9d58cbb-wll7m\" (UID: \"73142984-30ee-40e5-8fdd-d024df118964\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-wll7m" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.996014 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6c1f5c98-0cd5-40ba-8de2-cc45b81e196b-bound-sa-token\") pod \"ingress-operator-6b9cb4dbcf-5vk9f\" (UID: \"6c1f5c98-0cd5-40ba-8de2-cc45b81e196b\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-5vk9f" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.996053 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9e13a3e9-eeee-4c55-a87a-11959e9f7497-tmp\") pod \"openshift-kube-scheduler-operator-54f497555d-fnw98\" (UID: \"9e13a3e9-eeee-4c55-a87a-11959e9f7497\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-fnw98" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.996077 5184 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1fa45036-e34b-4f40-9e02-838ed397f42e-profile-collector-cert\") pod \"olm-operator-5cdf44d969-mfddg\" (UID: \"1fa45036-e34b-4f40-9e02-838ed397f42e\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-mfddg" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.996098 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b9d7d9b-02cb-4871-aaff-673af3457aa4-kube-api-access\") pod \"kube-apiserver-operator-575994946d-d8dpv\" (UID: \"8b9d7d9b-02cb-4871-aaff-673af3457aa4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-d8dpv" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.996124 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/caec81e5-958d-4139-aba7-2a5df11c25b1-certs\") pod \"machine-config-server-kv7dd\" (UID: \"caec81e5-958d-4139-aba7-2a5df11c25b1\") " pod="openshift-machine-config-operator/machine-config-server-kv7dd" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.996159 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tt7xx\" (UniqueName: \"kubernetes.io/projected/82e2099d-a6d8-488e-8144-b2ed728725e2-kube-api-access-tt7xx\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.996184 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/af87b4e5-15c0-48dc-9bc3-df39fcc24a53-csi-data-dir\") pod \"csi-hostpathplugin-5zvch\" (UID: 
\"af87b4e5-15c0-48dc-9bc3-df39fcc24a53\") " pod="hostpath-provisioner/csi-hostpathplugin-5zvch" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.996211 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgt2p\" (UniqueName: \"kubernetes.io/projected/b6b85787-d6d2-48df-9830-ca4532adee38-kube-api-access-tgt2p\") pod \"control-plane-machine-set-operator-75ffdb6fcd-m7sz7\" (UID: \"b6b85787-d6d2-48df-9830-ca4532adee38\") " pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-m7sz7" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.996240 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/4840f833-3dce-444b-8cad-3a7374af30e7-ready\") pod \"cni-sysctl-allowlist-ds-fm2vq\" (UID: \"4840f833-3dce-444b-8cad-3a7374af30e7\") " pod="openshift-multus/cni-sysctl-allowlist-ds-fm2vq" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.996269 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/af87b4e5-15c0-48dc-9bc3-df39fcc24a53-socket-dir\") pod \"csi-hostpathplugin-5zvch\" (UID: \"af87b4e5-15c0-48dc-9bc3-df39fcc24a53\") " pod="hostpath-provisioner/csi-hostpathplugin-5zvch" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.996298 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9e13a3e9-eeee-4c55-a87a-11959e9f7497-kube-api-access\") pod \"openshift-kube-scheduler-operator-54f497555d-fnw98\" (UID: \"9e13a3e9-eeee-4c55-a87a-11959e9f7497\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-fnw98" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.996324 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/4acc340c-81c7-4011-b17f-9f83eadd540e-serving-cert\") pod \"openshift-controller-manager-operator-686468bdd5-px2bg\" (UID: \"4acc340c-81c7-4011-b17f-9f83eadd540e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-px2bg" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.996416 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e8c42229-7663-48d0-a009-893c96840034-mcc-auth-proxy-config\") pod \"machine-config-controller-f9cdd68f7-w2ldh\" (UID: \"e8c42229-7663-48d0-a009-893c96840034\") " pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-w2ldh" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.996539 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1483fd4-8f3f-4326-874c-19e9c796d809-metrics-certs\") pod \"router-default-68cf44c8b8-7pgjs\" (UID: \"e1483fd4-8f3f-4326-874c-19e9c796d809\") " pod="openshift-ingress/router-default-68cf44c8b8-7pgjs" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.996577 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e1483fd4-8f3f-4326-874c-19e9c796d809-default-certificate\") pod \"router-default-68cf44c8b8-7pgjs\" (UID: \"e1483fd4-8f3f-4326-874c-19e9c796d809\") " pod="openshift-ingress/router-default-68cf44c8b8-7pgjs" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.997430 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9e13a3e9-eeee-4c55-a87a-11959e9f7497-tmp\") pod \"openshift-kube-scheduler-operator-54f497555d-fnw98\" (UID: \"9e13a3e9-eeee-4c55-a87a-11959e9f7497\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-fnw98" Mar 12 
16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.997677 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/82e2099d-a6d8-488e-8144-b2ed728725e2-registry-tls\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.998013 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-747b44746d-t6987"] Mar 12 16:52:50 crc kubenswrapper[5184]: I0312 16:52:50.999139 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e8c42229-7663-48d0-a009-893c96840034-proxy-tls\") pod \"machine-config-controller-f9cdd68f7-w2ldh\" (UID: \"e8c42229-7663-48d0-a009-893c96840034\") " pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-w2ldh" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.002792 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e13a3e9-eeee-4c55-a87a-11959e9f7497-serving-cert\") pod \"openshift-kube-scheduler-operator-54f497555d-fnw98\" (UID: \"9e13a3e9-eeee-4c55-a87a-11959e9f7497\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-fnw98" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.027970 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxlwb\" (UniqueName: \"kubernetes.io/projected/e8c42229-7663-48d0-a009-893c96840034-kube-api-access-qxlwb\") pod \"machine-config-controller-f9cdd68f7-w2ldh\" (UID: \"e8c42229-7663-48d0-a009-893c96840034\") " pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-w2ldh" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.028309 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/cluster-image-registry-operator-86c45576b9-b6v2k"] Mar 12 16:52:51 crc kubenswrapper[5184]: W0312 16:52:51.049343 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70e9334c_b259_45e5_88a3_6909ce233bda.slice/crio-5174e920dc8c67587e4d15592b32d823c44ca0711dc608fcbabdbbdc442221ee WatchSource:0}: Error finding container 5174e920dc8c67587e4d15592b32d823c44ca0711dc608fcbabdbbdc442221ee: Status 404 returned error can't find the container with id 5174e920dc8c67587e4d15592b32d823c44ca0711dc608fcbabdbbdc442221ee Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.077704 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-p29gv"] Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.079828 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cb76\" (UniqueName: \"kubernetes.io/projected/e1483fd4-8f3f-4326-874c-19e9c796d809-kube-api-access-7cb76\") pod \"router-default-68cf44c8b8-7pgjs\" (UID: \"e1483fd4-8f3f-4326-874c-19e9c796d809\") " pod="openshift-ingress/router-default-68cf44c8b8-7pgjs" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.089695 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5sfn\" (UniqueName: \"kubernetes.io/projected/640f2f33-9bd1-4378-97fb-61f78501c171-kube-api-access-j5sfn\") pod \"cluster-samples-operator-6b564684c8-5dxkx\" (UID: \"640f2f33-9bd1-4378-97fb-61f78501c171\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-5dxkx" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.097961 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/caec81e5-958d-4139-aba7-2a5df11c25b1-certs\") pod \"machine-config-server-kv7dd\" (UID: 
\"caec81e5-958d-4139-aba7-2a5df11c25b1\") " pod="openshift-machine-config-operator/machine-config-server-kv7dd" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.097995 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/af87b4e5-15c0-48dc-9bc3-df39fcc24a53-csi-data-dir\") pod \"csi-hostpathplugin-5zvch\" (UID: \"af87b4e5-15c0-48dc-9bc3-df39fcc24a53\") " pod="hostpath-provisioner/csi-hostpathplugin-5zvch" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098014 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tgt2p\" (UniqueName: \"kubernetes.io/projected/b6b85787-d6d2-48df-9830-ca4532adee38-kube-api-access-tgt2p\") pod \"control-plane-machine-set-operator-75ffdb6fcd-m7sz7\" (UID: \"b6b85787-d6d2-48df-9830-ca4532adee38\") " pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-m7sz7" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098031 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/4840f833-3dce-444b-8cad-3a7374af30e7-ready\") pod \"cni-sysctl-allowlist-ds-fm2vq\" (UID: \"4840f833-3dce-444b-8cad-3a7374af30e7\") " pod="openshift-multus/cni-sysctl-allowlist-ds-fm2vq" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098047 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/af87b4e5-15c0-48dc-9bc3-df39fcc24a53-socket-dir\") pod \"csi-hostpathplugin-5zvch\" (UID: \"af87b4e5-15c0-48dc-9bc3-df39fcc24a53\") " pod="hostpath-provisioner/csi-hostpathplugin-5zvch" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098064 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4acc340c-81c7-4011-b17f-9f83eadd540e-serving-cert\") pod 
\"openshift-controller-manager-operator-686468bdd5-px2bg\" (UID: \"4acc340c-81c7-4011-b17f-9f83eadd540e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-px2bg" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098090 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d980efcf-1159-448a-ac6c-4ee5ddff2b66-package-server-manager-serving-cert\") pod \"package-server-manager-77f986bd66-f99dz\" (UID: \"d980efcf-1159-448a-ac6c-4ee5ddff2b66\") " pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-f99dz" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098110 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0414ebc4-50d1-4b9e-966a-693c0957a5a5-webhook-certs\") pod \"multus-admission-controller-69db94689b-l8cgq\" (UID: \"0414ebc4-50d1-4b9e-966a-693c0957a5a5\") " pod="openshift-multus/multus-admission-controller-69db94689b-l8cgq" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098125 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/935f19f6-af87-48ad-bd81-641676250fdd-config\") pod \"etcd-operator-69b85846b6-5dwhg\" (UID: \"935f19f6-af87-48ad-bd81-641676250fdd\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-5dwhg" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098141 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcac2a22-f863-4ebe-8e3c-b88664a6c14d-config\") pod \"service-ca-operator-5b9c976747-dzzxj\" (UID: \"dcac2a22-f863-4ebe-8e3c-b88664a6c14d\") " pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-dzzxj" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098159 5184 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/38127f6e-375c-4fe3-9070-1a9da91aa12f-srv-cert\") pod \"catalog-operator-75ff9f647d-jskx5\" (UID: \"38127f6e-375c-4fe3-9070-1a9da91aa12f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-jskx5" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098174 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jczbt\" (UniqueName: \"kubernetes.io/projected/dcac2a22-f863-4ebe-8e3c-b88664a6c14d-kube-api-access-jczbt\") pod \"service-ca-operator-5b9c976747-dzzxj\" (UID: \"dcac2a22-f863-4ebe-8e3c-b88664a6c14d\") " pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-dzzxj" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098192 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c42a2703-d32e-41a7-accf-68b6e5d8c000-service-ca\") pod \"console-64d44f6ddf-qxthf\" (UID: \"c42a2703-d32e-41a7-accf-68b6e5d8c000\") " pod="openshift-console/console-64d44f6ddf-qxthf" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098210 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gfpzx\" (UniqueName: \"kubernetes.io/projected/2da0aa85-6bbd-4fc9-b76a-00f1e51f8327-kube-api-access-gfpzx\") pod \"migrator-866fcbc849-csf6b\" (UID: \"2da0aa85-6bbd-4fc9-b76a-00f1e51f8327\") " pod="openshift-kube-storage-version-migrator/migrator-866fcbc849-csf6b" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098226 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ll65m\" (UniqueName: \"kubernetes.io/projected/caec81e5-958d-4139-aba7-2a5df11c25b1-kube-api-access-ll65m\") pod \"machine-config-server-kv7dd\" (UID: \"caec81e5-958d-4139-aba7-2a5df11c25b1\") " 
pod="openshift-machine-config-operator/machine-config-server-kv7dd" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098242 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c94tq\" (UniqueName: \"kubernetes.io/projected/d980efcf-1159-448a-ac6c-4ee5ddff2b66-kube-api-access-c94tq\") pod \"package-server-manager-77f986bd66-f99dz\" (UID: \"d980efcf-1159-448a-ac6c-4ee5ddff2b66\") " pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-f99dz" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098257 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5ad036a8-381e-4761-a20f-8d8b9a3e9408-tmp\") pod \"marketplace-operator-547dbd544d-dpld6\" (UID: \"5ad036a8-381e-4761-a20f-8d8b9a3e9408\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-dpld6" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098274 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/38127f6e-375c-4fe3-9070-1a9da91aa12f-profile-collector-cert\") pod \"catalog-operator-75ff9f647d-jskx5\" (UID: \"38127f6e-375c-4fe3-9070-1a9da91aa12f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-jskx5" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098302 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e947b0fa-3e07-4965-b693-8857cd4b98fd-tmp-dir\") pod \"dns-default-f2fdq\" (UID: \"e947b0fa-3e07-4965-b693-8857cd4b98fd\") " pod="openshift-dns/dns-default-f2fdq" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098319 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c42a2703-d32e-41a7-accf-68b6e5d8c000-console-config\") pod 
\"console-64d44f6ddf-qxthf\" (UID: \"c42a2703-d32e-41a7-accf-68b6e5d8c000\") " pod="openshift-console/console-64d44f6ddf-qxthf" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098333 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c42a2703-d32e-41a7-accf-68b6e5d8c000-oauth-serving-cert\") pod \"console-64d44f6ddf-qxthf\" (UID: \"c42a2703-d32e-41a7-accf-68b6e5d8c000\") " pod="openshift-console/console-64d44f6ddf-qxthf" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098352 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mqbg2\" (UniqueName: \"kubernetes.io/projected/935f19f6-af87-48ad-bd81-641676250fdd-kube-api-access-mqbg2\") pod \"etcd-operator-69b85846b6-5dwhg\" (UID: \"935f19f6-af87-48ad-bd81-641676250fdd\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-5dwhg" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098368 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b9d7d9b-02cb-4871-aaff-673af3457aa4-config\") pod \"kube-apiserver-operator-575994946d-d8dpv\" (UID: \"8b9d7d9b-02cb-4871-aaff-673af3457aa4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-d8dpv" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098398 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0266bd06-b813-4f14-b94a-24d31805b311-config\") pod \"kube-controller-manager-operator-69d5f845f8-cxwfx\" (UID: \"0266bd06-b813-4f14-b94a-24d31805b311\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-cxwfx" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098414 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4acc340c-81c7-4011-b17f-9f83eadd540e-config\") pod \"openshift-controller-manager-operator-686468bdd5-px2bg\" (UID: \"4acc340c-81c7-4011-b17f-9f83eadd540e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-px2bg" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098428 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nsfb2\" (UniqueName: \"kubernetes.io/projected/4840f833-3dce-444b-8cad-3a7374af30e7-kube-api-access-nsfb2\") pod \"cni-sysctl-allowlist-ds-fm2vq\" (UID: \"4840f833-3dce-444b-8cad-3a7374af30e7\") " pod="openshift-multus/cni-sysctl-allowlist-ds-fm2vq" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098446 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/935f19f6-af87-48ad-bd81-641676250fdd-tmp-dir\") pod \"etcd-operator-69b85846b6-5dwhg\" (UID: \"935f19f6-af87-48ad-bd81-641676250fdd\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-5dwhg" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098463 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1fa45036-e34b-4f40-9e02-838ed397f42e-srv-cert\") pod \"olm-operator-5cdf44d969-mfddg\" (UID: \"1fa45036-e34b-4f40-9e02-838ed397f42e\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-mfddg" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098479 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/af87b4e5-15c0-48dc-9bc3-df39fcc24a53-plugins-dir\") pod \"csi-hostpathplugin-5zvch\" (UID: \"af87b4e5-15c0-48dc-9bc3-df39fcc24a53\") " pod="hostpath-provisioner/csi-hostpathplugin-5zvch" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098494 5184 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/935f19f6-af87-48ad-bd81-641676250fdd-etcd-client\") pod \"etcd-operator-69b85846b6-5dwhg\" (UID: \"935f19f6-af87-48ad-bd81-641676250fdd\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-5dwhg" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098513 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c42a2703-d32e-41a7-accf-68b6e5d8c000-console-oauth-config\") pod \"console-64d44f6ddf-qxthf\" (UID: \"c42a2703-d32e-41a7-accf-68b6e5d8c000\") " pod="openshift-console/console-64d44f6ddf-qxthf" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098531 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b297812d-1aca-496c-a83e-72f4d8b54415-apiservice-cert\") pod \"packageserver-7d4fc7d867-6bgzr\" (UID: \"b297812d-1aca-496c-a83e-72f4d8b54415\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-6bgzr" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098557 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4840f833-3dce-444b-8cad-3a7374af30e7-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-fm2vq\" (UID: \"4840f833-3dce-444b-8cad-3a7374af30e7\") " pod="openshift-multus/cni-sysctl-allowlist-ds-fm2vq" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098573 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/1fa45036-e34b-4f40-9e02-838ed397f42e-tmpfs\") pod \"olm-operator-5cdf44d969-mfddg\" (UID: \"1fa45036-e34b-4f40-9e02-838ed397f42e\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-mfddg" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098586 5184 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c42a2703-d32e-41a7-accf-68b6e5d8c000-console-serving-cert\") pod \"console-64d44f6ddf-qxthf\" (UID: \"c42a2703-d32e-41a7-accf-68b6e5d8c000\") " pod="openshift-console/console-64d44f6ddf-qxthf"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098600 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/38127f6e-375c-4fe3-9070-1a9da91aa12f-tmpfs\") pod \"catalog-operator-75ff9f647d-jskx5\" (UID: \"38127f6e-375c-4fe3-9070-1a9da91aa12f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-jskx5"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098617 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/b6b85787-d6d2-48df-9830-ca4532adee38-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-75ffdb6fcd-m7sz7\" (UID: \"b6b85787-d6d2-48df-9830-ca4532adee38\") " pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-m7sz7"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098638 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8xwwk\" (UniqueName: \"kubernetes.io/projected/c42a2703-d32e-41a7-accf-68b6e5d8c000-kube-api-access-8xwwk\") pod \"console-64d44f6ddf-qxthf\" (UID: \"c42a2703-d32e-41a7-accf-68b6e5d8c000\") " pod="openshift-console/console-64d44f6ddf-qxthf"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098654 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6c1f5c98-0cd5-40ba-8de2-cc45b81e196b-trusted-ca\") pod \"ingress-operator-6b9cb4dbcf-5vk9f\" (UID: \"6c1f5c98-0cd5-40ba-8de2-cc45b81e196b\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-5vk9f"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098669 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dkkfq\" (UniqueName: \"kubernetes.io/projected/6c1f5c98-0cd5-40ba-8de2-cc45b81e196b-kube-api-access-dkkfq\") pod \"ingress-operator-6b9cb4dbcf-5vk9f\" (UID: \"6c1f5c98-0cd5-40ba-8de2-cc45b81e196b\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-5vk9f"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098685 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vqfjp\" (UniqueName: \"kubernetes.io/projected/73142984-30ee-40e5-8fdd-d024df118964-kube-api-access-vqfjp\") pod \"machine-config-operator-67c9d58cbb-wll7m\" (UID: \"73142984-30ee-40e5-8fdd-d024df118964\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-wll7m"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098701 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e947b0fa-3e07-4965-b693-8857cd4b98fd-metrics-tls\") pod \"dns-default-f2fdq\" (UID: \"e947b0fa-3e07-4965-b693-8857cd4b98fd\") " pod="openshift-dns/dns-default-f2fdq"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098719 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/22a20570-e6d3-4f7b-b8fd-bc3cf5716448-signing-key\") pod \"service-ca-74545575db-6qpvf\" (UID: \"22a20570-e6d3-4f7b-b8fd-bc3cf5716448\") " pod="openshift-service-ca/service-ca-74545575db-6qpvf"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098734 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b9rvr\" (UniqueName: \"kubernetes.io/projected/b297812d-1aca-496c-a83e-72f4d8b54415-kube-api-access-b9rvr\") pod \"packageserver-7d4fc7d867-6bgzr\" (UID: \"b297812d-1aca-496c-a83e-72f4d8b54415\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-6bgzr"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098749 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/22a20570-e6d3-4f7b-b8fd-bc3cf5716448-signing-cabundle\") pod \"service-ca-74545575db-6qpvf\" (UID: \"22a20570-e6d3-4f7b-b8fd-bc3cf5716448\") " pod="openshift-service-ca/service-ca-74545575db-6qpvf"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098765 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7shxv\" (UniqueName: \"kubernetes.io/projected/4acc340c-81c7-4011-b17f-9f83eadd540e-kube-api-access-7shxv\") pod \"openshift-controller-manager-operator-686468bdd5-px2bg\" (UID: \"4acc340c-81c7-4011-b17f-9f83eadd540e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-px2bg"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098787 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098806 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/935f19f6-af87-48ad-bd81-641676250fdd-etcd-service-ca\") pod \"etcd-operator-69b85846b6-5dwhg\" (UID: \"935f19f6-af87-48ad-bd81-641676250fdd\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-5dwhg"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098827 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wrzv6\" (UniqueName: \"kubernetes.io/projected/af87b4e5-15c0-48dc-9bc3-df39fcc24a53-kube-api-access-wrzv6\") pod \"csi-hostpathplugin-5zvch\" (UID: \"af87b4e5-15c0-48dc-9bc3-df39fcc24a53\") " pod="hostpath-provisioner/csi-hostpathplugin-5zvch"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098843 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcac2a22-f863-4ebe-8e3c-b88664a6c14d-serving-cert\") pod \"service-ca-operator-5b9c976747-dzzxj\" (UID: \"dcac2a22-f863-4ebe-8e3c-b88664a6c14d\") " pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-dzzxj"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098861 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t644v\" (UniqueName: \"kubernetes.io/projected/22a20570-e6d3-4f7b-b8fd-bc3cf5716448-kube-api-access-t644v\") pod \"service-ca-74545575db-6qpvf\" (UID: \"22a20570-e6d3-4f7b-b8fd-bc3cf5716448\") " pod="openshift-service-ca/service-ca-74545575db-6qpvf"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098888 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c42a2703-d32e-41a7-accf-68b6e5d8c000-trusted-ca-bundle\") pod \"console-64d44f6ddf-qxthf\" (UID: \"c42a2703-d32e-41a7-accf-68b6e5d8c000\") " pod="openshift-console/console-64d44f6ddf-qxthf"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098908 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/73142984-30ee-40e5-8fdd-d024df118964-auth-proxy-config\") pod \"machine-config-operator-67c9d58cbb-wll7m\" (UID: \"73142984-30ee-40e5-8fdd-d024df118964\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-wll7m"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098927 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e947b0fa-3e07-4965-b693-8857cd4b98fd-config-volume\") pod \"dns-default-f2fdq\" (UID: \"e947b0fa-3e07-4965-b693-8857cd4b98fd\") " pod="openshift-dns/dns-default-f2fdq"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098943 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6c1f5c98-0cd5-40ba-8de2-cc45b81e196b-metrics-tls\") pod \"ingress-operator-6b9cb4dbcf-5vk9f\" (UID: \"6c1f5c98-0cd5-40ba-8de2-cc45b81e196b\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-5vk9f"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098961 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5ad036a8-381e-4761-a20f-8d8b9a3e9408-marketplace-operator-metrics\") pod \"marketplace-operator-547dbd544d-dpld6\" (UID: \"5ad036a8-381e-4761-a20f-8d8b9a3e9408\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-dpld6"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098978 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8ea329fb-a095-4645-b64f-a5769efa6364-secret-volume\") pod \"collect-profiles-29555565-ms5vz\" (UID: \"8ea329fb-a095-4645-b64f-a5769efa6364\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555565-ms5vz"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.098996 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/af87b4e5-15c0-48dc-9bc3-df39fcc24a53-registration-dir\") pod \"csi-hostpathplugin-5zvch\" (UID: \"af87b4e5-15c0-48dc-9bc3-df39fcc24a53\") " pod="hostpath-provisioner/csi-hostpathplugin-5zvch"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.099014 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0266bd06-b813-4f14-b94a-24d31805b311-tmp-dir\") pod \"kube-controller-manager-operator-69d5f845f8-cxwfx\" (UID: \"0266bd06-b813-4f14-b94a-24d31805b311\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-cxwfx"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.099029 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xjph9\" (UniqueName: \"kubernetes.io/projected/e947b0fa-3e07-4965-b693-8857cd4b98fd-kube-api-access-xjph9\") pod \"dns-default-f2fdq\" (UID: \"e947b0fa-3e07-4965-b693-8857cd4b98fd\") " pod="openshift-dns/dns-default-f2fdq"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.099043 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b297812d-1aca-496c-a83e-72f4d8b54415-webhook-cert\") pod \"packageserver-7d4fc7d867-6bgzr\" (UID: \"b297812d-1aca-496c-a83e-72f4d8b54415\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-6bgzr"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.099058 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/73142984-30ee-40e5-8fdd-d024df118964-proxy-tls\") pod \"machine-config-operator-67c9d58cbb-wll7m\" (UID: \"73142984-30ee-40e5-8fdd-d024df118964\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-wll7m"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.099077 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/935f19f6-af87-48ad-bd81-641676250fdd-etcd-ca\") pod \"etcd-operator-69b85846b6-5dwhg\" (UID: \"935f19f6-af87-48ad-bd81-641676250fdd\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-5dwhg"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.099093 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b297812d-1aca-496c-a83e-72f4d8b54415-tmpfs\") pod \"packageserver-7d4fc7d867-6bgzr\" (UID: \"b297812d-1aca-496c-a83e-72f4d8b54415\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-6bgzr"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.099110 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5ad036a8-381e-4761-a20f-8d8b9a3e9408-marketplace-trusted-ca\") pod \"marketplace-operator-547dbd544d-dpld6\" (UID: \"5ad036a8-381e-4761-a20f-8d8b9a3e9408\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-dpld6"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.099124 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ea329fb-a095-4645-b64f-a5769efa6364-config-volume\") pod \"collect-profiles-29555565-ms5vz\" (UID: \"8ea329fb-a095-4645-b64f-a5769efa6364\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555565-ms5vz"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.099139 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4840f833-3dce-444b-8cad-3a7374af30e7-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-fm2vq\" (UID: \"4840f833-3dce-444b-8cad-3a7374af30e7\") " pod="openshift-multus/cni-sysctl-allowlist-ds-fm2vq"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.099165 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/caec81e5-958d-4139-aba7-2a5df11c25b1-node-bootstrap-token\") pod \"machine-config-server-kv7dd\" (UID: \"caec81e5-958d-4139-aba7-2a5df11c25b1\") " pod="openshift-machine-config-operator/machine-config-server-kv7dd"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.099187 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/af87b4e5-15c0-48dc-9bc3-df39fcc24a53-mountpoint-dir\") pod \"csi-hostpathplugin-5zvch\" (UID: \"af87b4e5-15c0-48dc-9bc3-df39fcc24a53\") " pod="hostpath-provisioner/csi-hostpathplugin-5zvch"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.099208 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0266bd06-b813-4f14-b94a-24d31805b311-serving-cert\") pod \"kube-controller-manager-operator-69d5f845f8-cxwfx\" (UID: \"0266bd06-b813-4f14-b94a-24d31805b311\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-cxwfx"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.099223 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f06f8d9b-0e08-4f09-af38-cd987ab002f9-cert\") pod \"ingress-canary-tr5c8\" (UID: \"f06f8d9b-0e08-4f09-af38-cd987ab002f9\") " pod="openshift-ingress-canary/ingress-canary-tr5c8"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.099242 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0266bd06-b813-4f14-b94a-24d31805b311-kube-api-access\") pod \"kube-controller-manager-operator-69d5f845f8-cxwfx\" (UID: \"0266bd06-b813-4f14-b94a-24d31805b311\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-cxwfx"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.099258 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kfz9t\" (UniqueName: \"kubernetes.io/projected/5ad036a8-381e-4761-a20f-8d8b9a3e9408-kube-api-access-kfz9t\") pod \"marketplace-operator-547dbd544d-dpld6\" (UID: \"5ad036a8-381e-4761-a20f-8d8b9a3e9408\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-dpld6"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.099280 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zqpt6\" (UniqueName: \"kubernetes.io/projected/f06f8d9b-0e08-4f09-af38-cd987ab002f9-kube-api-access-zqpt6\") pod \"ingress-canary-tr5c8\" (UID: \"f06f8d9b-0e08-4f09-af38-cd987ab002f9\") " pod="openshift-ingress-canary/ingress-canary-tr5c8"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.099308 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b9d7d9b-02cb-4871-aaff-673af3457aa4-serving-cert\") pod \"kube-apiserver-operator-575994946d-d8dpv\" (UID: \"8b9d7d9b-02cb-4871-aaff-673af3457aa4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-d8dpv"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.099337 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jxb7j\" (UniqueName: \"kubernetes.io/projected/1fa45036-e34b-4f40-9e02-838ed397f42e-kube-api-access-jxb7j\") pod \"olm-operator-5cdf44d969-mfddg\" (UID: \"1fa45036-e34b-4f40-9e02-838ed397f42e\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-mfddg"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.099352 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/935f19f6-af87-48ad-bd81-641676250fdd-serving-cert\") pod \"etcd-operator-69b85846b6-5dwhg\" (UID: \"935f19f6-af87-48ad-bd81-641676250fdd\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-5dwhg"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.099368 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zglg7\" (UniqueName: \"kubernetes.io/projected/8ea329fb-a095-4645-b64f-a5769efa6364-kube-api-access-zglg7\") pod \"collect-profiles-29555565-ms5vz\" (UID: \"8ea329fb-a095-4645-b64f-a5769efa6364\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555565-ms5vz"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.099400 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7bnc\" (UniqueName: \"kubernetes.io/projected/0414ebc4-50d1-4b9e-966a-693c0957a5a5-kube-api-access-f7bnc\") pod \"multus-admission-controller-69db94689b-l8cgq\" (UID: \"0414ebc4-50d1-4b9e-966a-693c0957a5a5\") " pod="openshift-multus/multus-admission-controller-69db94689b-l8cgq"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.099426 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4acc340c-81c7-4011-b17f-9f83eadd540e-tmp\") pod \"openshift-controller-manager-operator-686468bdd5-px2bg\" (UID: \"4acc340c-81c7-4011-b17f-9f83eadd540e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-px2bg"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.099441 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8b9d7d9b-02cb-4871-aaff-673af3457aa4-tmp-dir\") pod \"kube-apiserver-operator-575994946d-d8dpv\" (UID: \"8b9d7d9b-02cb-4871-aaff-673af3457aa4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-d8dpv"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.099458 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ts5rb\" (UniqueName: \"kubernetes.io/projected/38127f6e-375c-4fe3-9070-1a9da91aa12f-kube-api-access-ts5rb\") pod \"catalog-operator-75ff9f647d-jskx5\" (UID: \"38127f6e-375c-4fe3-9070-1a9da91aa12f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-jskx5"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.099473 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/73142984-30ee-40e5-8fdd-d024df118964-images\") pod \"machine-config-operator-67c9d58cbb-wll7m\" (UID: \"73142984-30ee-40e5-8fdd-d024df118964\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-wll7m"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.099492 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6c1f5c98-0cd5-40ba-8de2-cc45b81e196b-bound-sa-token\") pod \"ingress-operator-6b9cb4dbcf-5vk9f\" (UID: \"6c1f5c98-0cd5-40ba-8de2-cc45b81e196b\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-5vk9f"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.099509 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1fa45036-e34b-4f40-9e02-838ed397f42e-profile-collector-cert\") pod \"olm-operator-5cdf44d969-mfddg\" (UID: \"1fa45036-e34b-4f40-9e02-838ed397f42e\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-mfddg"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.099527 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b9d7d9b-02cb-4871-aaff-673af3457aa4-kube-api-access\") pod \"kube-apiserver-operator-575994946d-d8dpv\" (UID: \"8b9d7d9b-02cb-4871-aaff-673af3457aa4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-d8dpv"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.100524 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/af87b4e5-15c0-48dc-9bc3-df39fcc24a53-csi-data-dir\") pod \"csi-hostpathplugin-5zvch\" (UID: \"af87b4e5-15c0-48dc-9bc3-df39fcc24a53\") " pod="hostpath-provisioner/csi-hostpathplugin-5zvch"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.100953 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/4840f833-3dce-444b-8cad-3a7374af30e7-ready\") pod \"cni-sysctl-allowlist-ds-fm2vq\" (UID: \"4840f833-3dce-444b-8cad-3a7374af30e7\") " pod="openshift-multus/cni-sysctl-allowlist-ds-fm2vq"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.101075 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b297812d-1aca-496c-a83e-72f4d8b54415-tmpfs\") pod \"packageserver-7d4fc7d867-6bgzr\" (UID: \"b297812d-1aca-496c-a83e-72f4d8b54415\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-6bgzr"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.101285 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/af87b4e5-15c0-48dc-9bc3-df39fcc24a53-socket-dir\") pod \"csi-hostpathplugin-5zvch\" (UID: \"af87b4e5-15c0-48dc-9bc3-df39fcc24a53\") " pod="hostpath-provisioner/csi-hostpathplugin-5zvch"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.101919 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/22a20570-e6d3-4f7b-b8fd-bc3cf5716448-signing-cabundle\") pod \"service-ca-74545575db-6qpvf\" (UID: \"22a20570-e6d3-4f7b-b8fd-bc3cf5716448\") " pod="openshift-service-ca/service-ca-74545575db-6qpvf"
Mar 12 16:52:51 crc kubenswrapper[5184]: E0312 16:52:51.102278 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:51.602185193 +0000 UTC m=+114.143496532 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.102690 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-5dxkx"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.102931 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcac2a22-f863-4ebe-8e3c-b88664a6c14d-config\") pod \"service-ca-operator-5b9c976747-dzzxj\" (UID: \"dcac2a22-f863-4ebe-8e3c-b88664a6c14d\") " pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-dzzxj"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.103037 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5ad036a8-381e-4761-a20f-8d8b9a3e9408-tmp\") pod \"marketplace-operator-547dbd544d-dpld6\" (UID: \"5ad036a8-381e-4761-a20f-8d8b9a3e9408\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-dpld6"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.103141 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/935f19f6-af87-48ad-bd81-641676250fdd-etcd-service-ca\") pod \"etcd-operator-69b85846b6-5dwhg\" (UID: \"935f19f6-af87-48ad-bd81-641676250fdd\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-5dwhg"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.103781 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4acc340c-81c7-4011-b17f-9f83eadd540e-serving-cert\") pod \"openshift-controller-manager-operator-686468bdd5-px2bg\" (UID: \"4acc340c-81c7-4011-b17f-9f83eadd540e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-px2bg"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.103927 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/af87b4e5-15c0-48dc-9bc3-df39fcc24a53-plugins-dir\") pod \"csi-hostpathplugin-5zvch\" (UID: \"af87b4e5-15c0-48dc-9bc3-df39fcc24a53\") " pod="hostpath-provisioner/csi-hostpathplugin-5zvch"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.104354 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b9d7d9b-02cb-4871-aaff-673af3457aa4-config\") pod \"kube-apiserver-operator-575994946d-d8dpv\" (UID: \"8b9d7d9b-02cb-4871-aaff-673af3457aa4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-d8dpv"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.104605 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5ad036a8-381e-4761-a20f-8d8b9a3e9408-marketplace-trusted-ca\") pod \"marketplace-operator-547dbd544d-dpld6\" (UID: \"5ad036a8-381e-4761-a20f-8d8b9a3e9408\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-dpld6"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.104831 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c42a2703-d32e-41a7-accf-68b6e5d8c000-trusted-ca-bundle\") pod \"console-64d44f6ddf-qxthf\" (UID: \"c42a2703-d32e-41a7-accf-68b6e5d8c000\") " pod="openshift-console/console-64d44f6ddf-qxthf"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.104974 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ea329fb-a095-4645-b64f-a5769efa6364-config-volume\") pod \"collect-profiles-29555565-ms5vz\" (UID: \"8ea329fb-a095-4645-b64f-a5769efa6364\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555565-ms5vz"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.105032 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4840f833-3dce-444b-8cad-3a7374af30e7-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-fm2vq\" (UID: \"4840f833-3dce-444b-8cad-3a7374af30e7\") " pod="openshift-multus/cni-sysctl-allowlist-ds-fm2vq"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.105960 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/af87b4e5-15c0-48dc-9bc3-df39fcc24a53-mountpoint-dir\") pod \"csi-hostpathplugin-5zvch\" (UID: \"af87b4e5-15c0-48dc-9bc3-df39fcc24a53\") " pod="hostpath-provisioner/csi-hostpathplugin-5zvch"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.106324 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/22a20570-e6d3-4f7b-b8fd-bc3cf5716448-signing-key\") pod \"service-ca-74545575db-6qpvf\" (UID: \"22a20570-e6d3-4f7b-b8fd-bc3cf5716448\") " pod="openshift-service-ca/service-ca-74545575db-6qpvf"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.106981 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/82e2099d-a6d8-488e-8144-b2ed728725e2-bound-sa-token\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.107497 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4acc340c-81c7-4011-b17f-9f83eadd540e-config\") pod \"openshift-controller-manager-operator-686468bdd5-px2bg\" (UID: \"4acc340c-81c7-4011-b17f-9f83eadd540e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-px2bg"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.107502 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcac2a22-f863-4ebe-8e3c-b88664a6c14d-serving-cert\") pod \"service-ca-operator-5b9c976747-dzzxj\" (UID: \"dcac2a22-f863-4ebe-8e3c-b88664a6c14d\") " pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-dzzxj"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.107734 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b297812d-1aca-496c-a83e-72f4d8b54415-apiservice-cert\") pod \"packageserver-7d4fc7d867-6bgzr\" (UID: \"b297812d-1aca-496c-a83e-72f4d8b54415\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-6bgzr"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.107742 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e947b0fa-3e07-4965-b693-8857cd4b98fd-config-volume\") pod \"dns-default-f2fdq\" (UID: \"e947b0fa-3e07-4965-b693-8857cd4b98fd\") " pod="openshift-dns/dns-default-f2fdq"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.107941 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e947b0fa-3e07-4965-b693-8857cd4b98fd-tmp-dir\") pod \"dns-default-f2fdq\" (UID: \"e947b0fa-3e07-4965-b693-8857cd4b98fd\") " pod="openshift-dns/dns-default-f2fdq"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.108196 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d980efcf-1159-448a-ac6c-4ee5ddff2b66-package-server-manager-serving-cert\") pod \"package-server-manager-77f986bd66-f99dz\" (UID: \"d980efcf-1159-448a-ac6c-4ee5ddff2b66\") " pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-f99dz"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.108426 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0414ebc4-50d1-4b9e-966a-693c0957a5a5-webhook-certs\") pod \"multus-admission-controller-69db94689b-l8cgq\" (UID: \"0414ebc4-50d1-4b9e-966a-693c0957a5a5\") " pod="openshift-multus/multus-admission-controller-69db94689b-l8cgq"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.108617 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4840f833-3dce-444b-8cad-3a7374af30e7-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-fm2vq\" (UID: \"4840f833-3dce-444b-8cad-3a7374af30e7\") " pod="openshift-multus/cni-sysctl-allowlist-ds-fm2vq"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.108702 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/af87b4e5-15c0-48dc-9bc3-df39fcc24a53-registration-dir\") pod \"csi-hostpathplugin-5zvch\" (UID: \"af87b4e5-15c0-48dc-9bc3-df39fcc24a53\") " pod="hostpath-provisioner/csi-hostpathplugin-5zvch"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.109144 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c42a2703-d32e-41a7-accf-68b6e5d8c000-service-ca\") pod \"console-64d44f6ddf-qxthf\" (UID: \"c42a2703-d32e-41a7-accf-68b6e5d8c000\") " pod="openshift-console/console-64d44f6ddf-qxthf"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.109336 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0266bd06-b813-4f14-b94a-24d31805b311-config\") pod \"kube-controller-manager-operator-69d5f845f8-cxwfx\" (UID: \"0266bd06-b813-4f14-b94a-24d31805b311\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-cxwfx"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.109603 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/38127f6e-375c-4fe3-9070-1a9da91aa12f-tmpfs\") pod \"catalog-operator-75ff9f647d-jskx5\" (UID: \"38127f6e-375c-4fe3-9070-1a9da91aa12f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-jskx5"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.109672 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c42a2703-d32e-41a7-accf-68b6e5d8c000-console-config\") pod \"console-64d44f6ddf-qxthf\" (UID: \"c42a2703-d32e-41a7-accf-68b6e5d8c000\") " pod="openshift-console/console-64d44f6ddf-qxthf"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.109929 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6c1f5c98-0cd5-40ba-8de2-cc45b81e196b-trusted-ca\") pod \"ingress-operator-6b9cb4dbcf-5vk9f\" (UID: \"6c1f5c98-0cd5-40ba-8de2-cc45b81e196b\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-5vk9f"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.109956 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c42a2703-d32e-41a7-accf-68b6e5d8c000-oauth-serving-cert\") pod \"console-64d44f6ddf-qxthf\" (UID: \"c42a2703-d32e-41a7-accf-68b6e5d8c000\") " pod="openshift-console/console-64d44f6ddf-qxthf"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.110564 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/1fa45036-e34b-4f40-9e02-838ed397f42e-tmpfs\") pod \"olm-operator-5cdf44d969-mfddg\" (UID: \"1fa45036-e34b-4f40-9e02-838ed397f42e\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-mfddg"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.110666 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5ad036a8-381e-4761-a20f-8d8b9a3e9408-marketplace-operator-metrics\") pod \"marketplace-operator-547dbd544d-dpld6\" (UID: \"5ad036a8-381e-4761-a20f-8d8b9a3e9408\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-dpld6"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.111230 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6c1f5c98-0cd5-40ba-8de2-cc45b81e196b-metrics-tls\") pod \"ingress-operator-6b9cb4dbcf-5vk9f\" (UID: \"6c1f5c98-0cd5-40ba-8de2-cc45b81e196b\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-5vk9f"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.111416 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c42a2703-d32e-41a7-accf-68b6e5d8c000-console-serving-cert\") pod \"console-64d44f6ddf-qxthf\" (UID: \"c42a2703-d32e-41a7-accf-68b6e5d8c000\") " pod="openshift-console/console-64d44f6ddf-qxthf"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.111804 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8b9d7d9b-02cb-4871-aaff-673af3457aa4-tmp-dir\") pod \"kube-apiserver-operator-575994946d-d8dpv\" (UID: \"8b9d7d9b-02cb-4871-aaff-673af3457aa4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-d8dpv"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.111892 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4acc340c-81c7-4011-b17f-9f83eadd540e-tmp\") pod \"openshift-controller-manager-operator-686468bdd5-px2bg\" (UID: \"4acc340c-81c7-4011-b17f-9f83eadd540e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-px2bg"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.112318 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/73142984-30ee-40e5-8fdd-d024df118964-images\") pod \"machine-config-operator-67c9d58cbb-wll7m\" (UID: \"73142984-30ee-40e5-8fdd-d024df118964\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-wll7m"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.112419 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e947b0fa-3e07-4965-b693-8857cd4b98fd-metrics-tls\") pod \"dns-default-f2fdq\" (UID: \"e947b0fa-3e07-4965-b693-8857cd4b98fd\") " pod="openshift-dns/dns-default-f2fdq"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.112630 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c42a2703-d32e-41a7-accf-68b6e5d8c000-console-oauth-config\") pod \"console-64d44f6ddf-qxthf\" (UID: \"c42a2703-d32e-41a7-accf-68b6e5d8c000\") " pod="openshift-console/console-64d44f6ddf-qxthf"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.113075 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/b6b85787-d6d2-48df-9830-ca4532adee38-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-75ffdb6fcd-m7sz7\" (UID: \"b6b85787-d6d2-48df-9830-ca4532adee38\") " pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-m7sz7"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.113527 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/73142984-30ee-40e5-8fdd-d024df118964-proxy-tls\") pod \"machine-config-operator-67c9d58cbb-wll7m\" (UID: \"73142984-30ee-40e5-8fdd-d024df118964\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-wll7m"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.113583 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8ea329fb-a095-4645-b64f-a5769efa6364-secret-volume\") pod \"collect-profiles-29555565-ms5vz\" (UID: \"8ea329fb-a095-4645-b64f-a5769efa6364\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555565-ms5vz"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.113745 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b9d7d9b-02cb-4871-aaff-673af3457aa4-serving-cert\") pod \"kube-apiserver-operator-575994946d-d8dpv\" (UID: \"8b9d7d9b-02cb-4871-aaff-673af3457aa4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-d8dpv"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.114257 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f06f8d9b-0e08-4f09-af38-cd987ab002f9-cert\") pod \"ingress-canary-tr5c8\" (UID: \"f06f8d9b-0e08-4f09-af38-cd987ab002f9\") " 
pod="openshift-ingress-canary/ingress-canary-tr5c8" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.114544 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/caec81e5-958d-4139-aba7-2a5df11c25b1-node-bootstrap-token\") pod \"machine-config-server-kv7dd\" (UID: \"caec81e5-958d-4139-aba7-2a5df11c25b1\") " pod="openshift-machine-config-operator/machine-config-server-kv7dd" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.114983 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/38127f6e-375c-4fe3-9070-1a9da91aa12f-srv-cert\") pod \"catalog-operator-75ff9f647d-jskx5\" (UID: \"38127f6e-375c-4fe3-9070-1a9da91aa12f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-jskx5" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.115249 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/38127f6e-375c-4fe3-9070-1a9da91aa12f-profile-collector-cert\") pod \"catalog-operator-75ff9f647d-jskx5\" (UID: \"38127f6e-375c-4fe3-9070-1a9da91aa12f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-jskx5" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.115297 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1fa45036-e34b-4f40-9e02-838ed397f42e-profile-collector-cert\") pod \"olm-operator-5cdf44d969-mfddg\" (UID: \"1fa45036-e34b-4f40-9e02-838ed397f42e\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-mfddg" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.118216 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0266bd06-b813-4f14-b94a-24d31805b311-serving-cert\") pod 
\"kube-controller-manager-operator-69d5f845f8-cxwfx\" (UID: \"0266bd06-b813-4f14-b94a-24d31805b311\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-cxwfx" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.118309 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b297812d-1aca-496c-a83e-72f4d8b54415-webhook-cert\") pod \"packageserver-7d4fc7d867-6bgzr\" (UID: \"b297812d-1aca-496c-a83e-72f4d8b54415\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-6bgzr" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.122369 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwmf8\" (UniqueName: \"kubernetes.io/projected/8d1cd7e2-2062-46cc-8550-b41afd9716f4-kube-api-access-kwmf8\") pod \"console-operator-67c89758df-ssqr6\" (UID: \"8d1cd7e2-2062-46cc-8550-b41afd9716f4\") " pod="openshift-console-operator/console-operator-67c89758df-ssqr6" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.134009 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-5777786469-n9g8v"] Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.143464 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt7xx\" (UniqueName: \"kubernetes.io/projected/82e2099d-a6d8-488e-8144-b2ed728725e2-kube-api-access-tt7xx\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.160103 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9e13a3e9-eeee-4c55-a87a-11959e9f7497-kube-api-access\") pod \"openshift-kube-scheduler-operator-54f497555d-fnw98\" (UID: \"9e13a3e9-eeee-4c55-a87a-11959e9f7497\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-fnw98" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.185432 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-fnw98" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.198807 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-68cf44c8b8-7pgjs" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.200333 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:51 crc kubenswrapper[5184]: E0312 16:52:51.200688 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:51.700660338 +0000 UTC m=+114.241971677 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.200743 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b9d7d9b-02cb-4871-aaff-673af3457aa4-kube-api-access\") pod \"kube-apiserver-operator-575994946d-d8dpv\" (UID: \"8b9d7d9b-02cb-4871-aaff-673af3457aa4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-d8dpv" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.200977 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:52:51 crc kubenswrapper[5184]: E0312 16:52:51.201774 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:51.701765492 +0000 UTC m=+114.243076831 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.202320 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0266bd06-b813-4f14-b94a-24d31805b311-tmp-dir\") pod \"kube-controller-manager-operator-69d5f845f8-cxwfx\" (UID: \"0266bd06-b813-4f14-b94a-24d31805b311\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-cxwfx" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.202579 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/caec81e5-958d-4139-aba7-2a5df11c25b1-certs\") pod \"machine-config-server-kv7dd\" (UID: \"caec81e5-958d-4139-aba7-2a5df11c25b1\") " pod="openshift-machine-config-operator/machine-config-server-kv7dd" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.202729 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/73142984-30ee-40e5-8fdd-d024df118964-auth-proxy-config\") pod \"machine-config-operator-67c9d58cbb-wll7m\" (UID: \"73142984-30ee-40e5-8fdd-d024df118964\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-wll7m" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.203834 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/935f19f6-af87-48ad-bd81-641676250fdd-tmp-dir\") pod \"etcd-operator-69b85846b6-5dwhg\" (UID: 
\"935f19f6-af87-48ad-bd81-641676250fdd\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-5dwhg" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.203997 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1fa45036-e34b-4f40-9e02-838ed397f42e-srv-cert\") pod \"olm-operator-5cdf44d969-mfddg\" (UID: \"1fa45036-e34b-4f40-9e02-838ed397f42e\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-mfddg" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.204082 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e8c42229-7663-48d0-a009-893c96840034-mcc-auth-proxy-config\") pod \"machine-config-controller-f9cdd68f7-w2ldh\" (UID: \"e8c42229-7663-48d0-a009-893c96840034\") " pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-w2ldh" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.204473 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/935f19f6-af87-48ad-bd81-641676250fdd-serving-cert\") pod \"etcd-operator-69b85846b6-5dwhg\" (UID: \"935f19f6-af87-48ad-bd81-641676250fdd\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-5dwhg" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.204609 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-w2ldh" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.206514 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/935f19f6-af87-48ad-bd81-641676250fdd-config\") pod \"etcd-operator-69b85846b6-5dwhg\" (UID: \"935f19f6-af87-48ad-bd81-641676250fdd\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-5dwhg" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.207088 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/935f19f6-af87-48ad-bd81-641676250fdd-etcd-ca\") pod \"etcd-operator-69b85846b6-5dwhg\" (UID: \"935f19f6-af87-48ad-bd81-641676250fdd\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-5dwhg" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.207211 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/935f19f6-af87-48ad-bd81-641676250fdd-etcd-client\") pod \"etcd-operator-69b85846b6-5dwhg\" (UID: \"935f19f6-af87-48ad-bd81-641676250fdd\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-5dwhg" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.221117 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqbg2\" (UniqueName: \"kubernetes.io/projected/935f19f6-af87-48ad-bd81-641676250fdd-kube-api-access-mqbg2\") pod \"etcd-operator-69b85846b6-5dwhg\" (UID: \"935f19f6-af87-48ad-bd81-641676250fdd\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-5dwhg" Mar 12 16:52:51 crc kubenswrapper[5184]: W0312 16:52:51.233157 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1483fd4_8f3f_4326_874c_19e9c796d809.slice/crio-a92abe48f21b239a60e4a01566d22d1eb9e15fb474a7e459518e0bc45df961dd 
WatchSource:0}: Error finding container a92abe48f21b239a60e4a01566d22d1eb9e15fb474a7e459518e0bc45df961dd: Status 404 returned error can't find the container with id a92abe48f21b239a60e4a01566d22d1eb9e15fb474a7e459518e0bc45df961dd Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.245741 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgt2p\" (UniqueName: \"kubernetes.io/projected/b6b85787-d6d2-48df-9830-ca4532adee38-kube-api-access-tgt2p\") pod \"control-plane-machine-set-operator-75ffdb6fcd-m7sz7\" (UID: \"b6b85787-d6d2-48df-9830-ca4532adee38\") " pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-m7sz7" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.262025 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9rvr\" (UniqueName: \"kubernetes.io/projected/b297812d-1aca-496c-a83e-72f4d8b54415-kube-api-access-b9rvr\") pod \"packageserver-7d4fc7d867-6bgzr\" (UID: \"b297812d-1aca-496c-a83e-72f4d8b54415\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-6bgzr" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.284190 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfpzx\" (UniqueName: \"kubernetes.io/projected/2da0aa85-6bbd-4fc9-b76a-00f1e51f8327-kube-api-access-gfpzx\") pod \"migrator-866fcbc849-csf6b\" (UID: \"2da0aa85-6bbd-4fc9-b76a-00f1e51f8327\") " pod="openshift-kube-storage-version-migrator/migrator-866fcbc849-csf6b" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.294692 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-d8dpv" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.302062 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:51 crc kubenswrapper[5184]: E0312 16:52:51.302533 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:51.802517718 +0000 UTC m=+114.343829057 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.303697 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7shxv\" (UniqueName: \"kubernetes.io/projected/4acc340c-81c7-4011-b17f-9f83eadd540e-kube-api-access-7shxv\") pod \"openshift-controller-manager-operator-686468bdd5-px2bg\" (UID: \"4acc340c-81c7-4011-b17f-9f83eadd540e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-px2bg" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.307852 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-69b85846b6-5dwhg" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.330800 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll65m\" (UniqueName: \"kubernetes.io/projected/caec81e5-958d-4139-aba7-2a5df11c25b1-kube-api-access-ll65m\") pod \"machine-config-server-kv7dd\" (UID: \"caec81e5-958d-4139-aba7-2a5df11c25b1\") " pod="openshift-machine-config-operator/machine-config-server-kv7dd" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.346535 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c94tq\" (UniqueName: \"kubernetes.io/projected/d980efcf-1159-448a-ac6c-4ee5ddff2b66-kube-api-access-c94tq\") pod \"package-server-manager-77f986bd66-f99dz\" (UID: \"d980efcf-1159-448a-ac6c-4ee5ddff2b66\") " pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-f99dz" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.361103 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrzv6\" (UniqueName: \"kubernetes.io/projected/af87b4e5-15c0-48dc-9bc3-df39fcc24a53-kube-api-access-wrzv6\") pod \"csi-hostpathplugin-5zvch\" (UID: \"af87b4e5-15c0-48dc-9bc3-df39fcc24a53\") " pod="hostpath-provisioner/csi-hostpathplugin-5zvch" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.385432 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-866fcbc849-csf6b" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.403122 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-kv7dd" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.403848 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:52:51 crc kubenswrapper[5184]: E0312 16:52:51.404180 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:51.904168 +0000 UTC m=+114.445479339 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.404184 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t644v\" (UniqueName: \"kubernetes.io/projected/22a20570-e6d3-4f7b-b8fd-bc3cf5716448-kube-api-access-t644v\") pod \"service-ca-74545575db-6qpvf\" (UID: \"22a20570-e6d3-4f7b-b8fd-bc3cf5716448\") " pod="openshift-service-ca/service-ca-74545575db-6qpvf" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.414283 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxb7j\" (UniqueName: 
\"kubernetes.io/projected/1fa45036-e34b-4f40-9e02-838ed397f42e-kube-api-access-jxb7j\") pod \"olm-operator-5cdf44d969-mfddg\" (UID: \"1fa45036-e34b-4f40-9e02-838ed397f42e\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-mfddg" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.424196 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-67c89758df-ssqr6" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.425605 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-6bgzr" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.429904 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jczbt\" (UniqueName: \"kubernetes.io/projected/dcac2a22-f863-4ebe-8e3c-b88664a6c14d-kube-api-access-jczbt\") pod \"service-ca-operator-5b9c976747-dzzxj\" (UID: \"dcac2a22-f863-4ebe-8e3c-b88664a6c14d\") " pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-dzzxj" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.430258 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-f9cdd68f7-w2ldh"] Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.437179 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-x7z5d" event={"ID":"a1d9df18-d5a1-447d-ad5a-fdef055a830a","Type":"ContainerStarted","Data":"818f2fbc5e6792c0f16efebf57cf83caa76a17d46ab3de37a9b0a5f5ff767e83"} Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.478343 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-dzzxj" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.480902 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-mpvq4" event={"ID":"027a90bb-52c1-43ed-a43d-6f9755019c9b","Type":"ContainerStarted","Data":"cc6ba26c639a1704b18379ddbc8726cb4832b7b24ea98863713071081c39033e"} Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.481472 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-fnw98"] Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.481594 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-5zvch" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.483251 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-p29gv" event={"ID":"781710ac-8789-42cf-983e-f7de329e4e81","Type":"ContainerStarted","Data":"345f282c798fe842fad8f23322a7055599ab865a9feeafff29093e09c085b964"} Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.484183 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-747b44746d-t6987" event={"ID":"6f45ff33-e60b-4885-ac63-5ab182bf6320","Type":"ContainerStarted","Data":"58a0652a0f43898dd9100345c08f9e5ff250016d36a1037255f46c2985a1ac14"} Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.485834 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" event={"ID":"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc","Type":"ContainerStarted","Data":"8970f0896718cd710c2762d889d2c85baa1fec512eefac4216709518c4a467e3"} Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.487892 5184 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-dns-operator/dns-operator-799b87ffcd-rqzhf" event={"ID":"1af1595b-1a79-438d-99a0-dd34b32cfcda","Type":"ContainerStarted","Data":"41d6ab36bbbac865e36cacbaca5826aa7b2b245991ba40ecb0ebd04ebf802fe9"} Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.489738 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjph9\" (UniqueName: \"kubernetes.io/projected/e947b0fa-3e07-4965-b693-8857cd4b98fd-kube-api-access-xjph9\") pod \"dns-default-f2fdq\" (UID: \"e947b0fa-3e07-4965-b693-8857cd4b98fd\") " pod="openshift-dns/dns-default-f2fdq" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.498959 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-b6v2k" event={"ID":"70e9334c-b259-45e5-88a3-6909ce233bda","Type":"ContainerStarted","Data":"5174e920dc8c67587e4d15592b32d823c44ca0711dc608fcbabdbbdc442221ee"} Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.501962 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkkfq\" (UniqueName: \"kubernetes.io/projected/6c1f5c98-0cd5-40ba-8de2-cc45b81e196b-kube-api-access-dkkfq\") pod \"ingress-operator-6b9cb4dbcf-5vk9f\" (UID: \"6c1f5c98-0cd5-40ba-8de2-cc45b81e196b\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-5vk9f" Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.504963 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.505190 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsfb2\" (UniqueName: 
\"kubernetes.io/projected/4840f833-3dce-444b-8cad-3a7374af30e7-kube-api-access-nsfb2\") pod \"cni-sysctl-allowlist-ds-fm2vq\" (UID: \"4840f833-3dce-444b-8cad-3a7374af30e7\") " pod="openshift-multus/cni-sysctl-allowlist-ds-fm2vq"
Mar 12 16:52:51 crc kubenswrapper[5184]: E0312 16:52:51.505317 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:52.005287997 +0000 UTC m=+114.546599336 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.505572 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:52:51 crc kubenswrapper[5184]: E0312 16:52:51.506255 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:52.006243907 +0000 UTC m=+114.547555246 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.513537 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xwwk\" (UniqueName: \"kubernetes.io/projected/c42a2703-d32e-41a7-accf-68b6e5d8c000-kube-api-access-8xwwk\") pod \"console-64d44f6ddf-qxthf\" (UID: \"c42a2703-d32e-41a7-accf-68b6e5d8c000\") " pod="openshift-console/console-64d44f6ddf-qxthf"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.522958 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-68cf44c8b8-7pgjs" event={"ID":"e1483fd4-8f3f-4326-874c-19e9c796d809","Type":"ContainerStarted","Data":"a92abe48f21b239a60e4a01566d22d1eb9e15fb474a7e459518e0bc45df961dd"}
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.526605 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-m7sz7"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.528195 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-54c688565-jqlh7" event={"ID":"20de7db3-2a1d-49b2-a756-3ef5b88fbfcc","Type":"ContainerStarted","Data":"1ed3aa05ff69f3c8da65d83152852f8984fdc822fbd1bb317dd10e75472e95f1"}
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.529338 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfz9t\" (UniqueName: \"kubernetes.io/projected/5ad036a8-381e-4761-a20f-8d8b9a3e9408-kube-api-access-kfz9t\") pod \"marketplace-operator-547dbd544d-dpld6\" (UID: \"5ad036a8-381e-4761-a20f-8d8b9a3e9408\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-dpld6"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.535100 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-px2bg"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.548323 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0266bd06-b813-4f14-b94a-24d31805b311-kube-api-access\") pod \"kube-controller-manager-operator-69d5f845f8-cxwfx\" (UID: \"0266bd06-b813-4f14-b94a-24d31805b311\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-cxwfx"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.549507 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-5777786469-n9g8v" event={"ID":"bbfdedba-967f-4b86-b7bd-a81854132b50","Type":"ContainerStarted","Data":"788e14db979ee601f0f13463135e29feb7b92aa3add983954faa11ae828033d2"}
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.561266 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-69b85846b6-5dwhg"]
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.570963 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-d8dpv"]
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.579761 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7bnc\" (UniqueName: \"kubernetes.io/projected/0414ebc4-50d1-4b9e-966a-693c0957a5a5-kube-api-access-f7bnc\") pod \"multus-admission-controller-69db94689b-l8cgq\" (UID: \"0414ebc4-50d1-4b9e-966a-693c0957a5a5\") " pod="openshift-multus/multus-admission-controller-69db94689b-l8cgq"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.583570 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts5rb\" (UniqueName: \"kubernetes.io/projected/38127f6e-375c-4fe3-9070-1a9da91aa12f-kube-api-access-ts5rb\") pod \"catalog-operator-75ff9f647d-jskx5\" (UID: \"38127f6e-375c-4fe3-9070-1a9da91aa12f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-jskx5"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.601685 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-cxwfx"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.607431 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 12 16:52:51 crc kubenswrapper[5184]: E0312 16:52:51.607601 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:52.107571501 +0000 UTC m=+114.648882840 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.607991 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:52:51 crc kubenswrapper[5184]: E0312 16:52:51.608259 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:52.108245031 +0000 UTC m=+114.649556370 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.609394 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zglg7\" (UniqueName: \"kubernetes.io/projected/8ea329fb-a095-4645-b64f-a5769efa6364-kube-api-access-zglg7\") pod \"collect-profiles-29555565-ms5vz\" (UID: \"8ea329fb-a095-4645-b64f-a5769efa6364\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555565-ms5vz"
Mar 12 16:52:51 crc kubenswrapper[5184]: W0312 16:52:51.612534 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod935f19f6_af87_48ad_bd81_641676250fdd.slice/crio-ae0e40f6df034cf33a9a2dd7866f6b19cb21f6aad880f263647f3ec0039cb610 WatchSource:0}: Error finding container ae0e40f6df034cf33a9a2dd7866f6b19cb21f6aad880f263647f3ec0039cb610: Status 404 returned error can't find the container with id ae0e40f6df034cf33a9a2dd7866f6b19cb21f6aad880f263647f3ec0039cb610
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.626695 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-mfddg"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.627180 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6c1f5c98-0cd5-40ba-8de2-cc45b81e196b-bound-sa-token\") pod \"ingress-operator-6b9cb4dbcf-5vk9f\" (UID: \"6c1f5c98-0cd5-40ba-8de2-cc45b81e196b\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-5vk9f"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.636279 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-f99dz"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.638169 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-866fcbc849-csf6b"]
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.644784 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-74545575db-6qpvf"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.652340 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqpt6\" (UniqueName: \"kubernetes.io/projected/f06f8d9b-0e08-4f09-af38-cd987ab002f9-kube-api-access-zqpt6\") pod \"ingress-canary-tr5c8\" (UID: \"f06f8d9b-0e08-4f09-af38-cd987ab002f9\") " pod="openshift-ingress-canary/ingress-canary-tr5c8"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.665199 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-jskx5"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.678675 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555565-ms5vz"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.694038 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-547dbd544d-dpld6"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.709012 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 12 16:52:51 crc kubenswrapper[5184]: E0312 16:52:51.709163 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:52.209145531 +0000 UTC m=+114.750456870 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.717456 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64d44f6ddf-qxthf"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.768867 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tr5c8"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.774976 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-f2fdq"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.778110 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqfjp\" (UniqueName: \"kubernetes.io/projected/73142984-30ee-40e5-8fdd-d024df118964-kube-api-access-vqfjp\") pod \"machine-config-operator-67c9d58cbb-wll7m\" (UID: \"73142984-30ee-40e5-8fdd-d024df118964\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-wll7m"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.779635 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-5dxkx"]
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.781814 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-fm2vq"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.799091 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-px2bg"]
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.812689 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-wll7m"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.814392 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:52:51 crc kubenswrapper[5184]: E0312 16:52:51.814719 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:52.314703716 +0000 UTC m=+114.856015055 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.819270 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-69db94689b-l8cgq"
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.824827 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-6bgzr"]
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.850347 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-67c89758df-ssqr6"]
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.915736 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 12 16:52:51 crc kubenswrapper[5184]: E0312 16:52:51.915835 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:52.415813081 +0000 UTC m=+114.957124420 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.916133 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:52:51 crc kubenswrapper[5184]: E0312 16:52:51.916455 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:52.416443031 +0000 UTC m=+114.957754370 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:51 crc kubenswrapper[5184]: I0312 16:52:51.920941 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-5vk9f"
Mar 12 16:52:51 crc kubenswrapper[5184]: W0312 16:52:51.941808 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2da0aa85_6bbd_4fc9_b76a_00f1e51f8327.slice/crio-53c3d4284f8b8da334f31321cb2d1cb65a262bcd44d0103e911f6bfd4cadcdaf WatchSource:0}: Error finding container 53c3d4284f8b8da334f31321cb2d1cb65a262bcd44d0103e911f6bfd4cadcdaf: Status 404 returned error can't find the container with id 53c3d4284f8b8da334f31321cb2d1cb65a262bcd44d0103e911f6bfd4cadcdaf
Mar 12 16:52:52 crc kubenswrapper[5184]: W0312 16:52:52.013320 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4acc340c_81c7_4011_b17f_9f83eadd540e.slice/crio-d899f66010d98a9fb8b938b3cdada40e9a1c7dfd592d372ac0f9fc5f1bc7e338 WatchSource:0}: Error finding container d899f66010d98a9fb8b938b3cdada40e9a1c7dfd592d372ac0f9fc5f1bc7e338: Status 404 returned error can't find the container with id d899f66010d98a9fb8b938b3cdada40e9a1c7dfd592d372ac0f9fc5f1bc7e338
Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.017876 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 12 16:52:52 crc kubenswrapper[5184]: E0312 16:52:52.018388 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:52.518341293 +0000 UTC m=+115.059652632 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.092148 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-5zvch"]
Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.097066 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-m7sz7"]
Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.104964 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-5b9c976747-dzzxj"]
Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.129780 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-cxwfx"]
Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.134765 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:52:52 crc kubenswrapper[5184]: E0312 16:52:52.135221 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:52.635205277 +0000 UTC m=+115.176516616 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.210171 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-mfddg"]
Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.236845 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 12 16:52:52 crc kubenswrapper[5184]: E0312 16:52:52.237063 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:52.737030156 +0000 UTC m=+115.278341495 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.237352 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwt8b\" (UniqueName: \"kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b\") pod \"network-check-target-fhkjl\" (UID: \"17b87002-b798-480a-8e17-83053d698239\") " pod="openshift-network-diagnostics/network-check-target-fhkjl"
Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.237529 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.237627 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7w75\" (UniqueName: \"kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75\") pod \"network-check-source-5bb8f5cd97-xdvz5\" (UID: \"f863fff9-286a-45fa-b8f0-8a86994b8440\") " pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5"
Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.237687 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6"
Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.237722 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6"
Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.243390 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6"
Mar 12 16:52:52 crc kubenswrapper[5184]: E0312 16:52:52.243762 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:52.743742204 +0000 UTC m=+115.285053543 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.257038 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwt8b\" (UniqueName: \"kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b\") pod \"network-check-target-fhkjl\" (UID: \"17b87002-b798-480a-8e17-83053d698239\") " pod="openshift-network-diagnostics/network-check-target-fhkjl"
Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.257059 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6"
Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.258579 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7w75\" (UniqueName: \"kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75\") pod \"network-check-source-5bb8f5cd97-xdvz5\" (UID: \"f863fff9-286a-45fa-b8f0-8a86994b8440\") " pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5"
Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.270911 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-67c9d58cbb-wll7m"]
Mar 12 16:52:52 crc kubenswrapper[5184]: W0312 16:52:52.277394 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fa45036_e34b_4f40_9e02_838ed397f42e.slice/crio-eb2cb5d47dd8221ab9110c1ca2593ddd37669a6ecc4f7b94c66ecfdd9c7a092d WatchSource:0}: Error finding container eb2cb5d47dd8221ab9110c1ca2593ddd37669a6ecc4f7b94c66ecfdd9c7a092d: Status 404 returned error can't find the container with id eb2cb5d47dd8221ab9110c1ca2593ddd37669a6ecc4f7b94c66ecfdd9c7a092d
Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.339028 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.339537 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df-metrics-certs\") pod \"network-metrics-daemon-vxc4c\" (UID: \"024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df\") " pod="openshift-multus/network-metrics-daemon-vxc4c"
Mar 12 16:52:52 crc kubenswrapper[5184]: E0312 16:52:52.339582 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:52.839539786 +0000 UTC m=+115.380851125 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.339695 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:52:52 crc kubenswrapper[5184]: E0312 16:52:52.340072 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:52.840054172 +0000 UTC m=+115.381365511 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.343872 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df-metrics-certs\") pod \"network-metrics-daemon-vxc4c\" (UID: \"024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df\") " pod="openshift-multus/network-metrics-daemon-vxc4c"
Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.422816 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-74545575db-6qpvf"]
Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.442002 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6"
Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.442338 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 12 16:52:52 crc kubenswrapper[5184]: E0312 16:52:52.445817 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:52.945789442 +0000 UTC m=+115.487100781 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.449329 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5"
Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.471984 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-f99dz"]
Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.473837 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vxc4c"
Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.482315 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl"
Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.544752 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:52:52 crc kubenswrapper[5184]: E0312 16:52:52.545222 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:53.045210006 +0000 UTC m=+115.586521345 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.561284 5184 generic.go:358] "Generic (PLEG): container finished" podID="beb08b86-8593-4511-8bce-ea5f1d44f795" containerID="53b2f55fc98cad7a3fc91d28c52128aae072b528c873e79cf3cb6fa7933afc84" exitCode=0
Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.562029 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" event={"ID":"beb08b86-8593-4511-8bce-ea5f1d44f795","Type":"ContainerDied","Data":"53b2f55fc98cad7a3fc91d28c52128aae072b528c873e79cf3cb6fa7933afc84"}
Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.564846 5184
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-747b44746d-t6987" event={"ID":"6f45ff33-e60b-4885-ac63-5ab182bf6320","Type":"ContainerStarted","Data":"934f4f30f19d7990a40c822c80d1d803320772361ee9b512c171cf5536c09803"} Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.569446 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-747b44746d-t6987" Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.572772 5184 patch_prober.go:28] interesting pod/downloads-747b44746d-t6987 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.573383 5184 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-747b44746d-t6987" podUID="6f45ff33-e60b-4885-ac63-5ab182bf6320" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.579684 5184 generic.go:358] "Generic (PLEG): container finished" podID="5db01dce-a574-4dbd-97a9-582f0f357bda" containerID="3c3af475e4b32afd25c8bcbeed67ff30ca3a55f4492c6a3f85bc39768c24599d" exitCode=0 Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.581074 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-8596bd845d-ndv6q" event={"ID":"5db01dce-a574-4dbd-97a9-582f0f357bda","Type":"ContainerDied","Data":"3c3af475e4b32afd25c8bcbeed67ff30ca3a55f4492c6a3f85bc39768c24599d"} Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.597983 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-wll7m" 
event={"ID":"73142984-30ee-40e5-8fdd-d024df118964","Type":"ContainerStarted","Data":"8aa825c2b90d106b0128fe80b28a232e7a1cfbe7aaedc2aa1f1cb7353efe3c4b"} Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.603808 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-tr5c8"] Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.605251 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-67c89758df-ssqr6" event={"ID":"8d1cd7e2-2062-46cc-8550-b41afd9716f4","Type":"ContainerStarted","Data":"1e4b263ea1aa908168a45b99286e4947f55388df26366a72ad749efcc7015909"} Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.609856 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-65b6cccf98-bbnrv" event={"ID":"17eed63d-a9fc-414e-9c70-347b51893cfa","Type":"ContainerStarted","Data":"e24cc4e11b3d0234458860254c11bc775f91db3fdc483a595a9056e5efcf156e"} Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.612477 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-fm2vq" event={"ID":"4840f833-3dce-444b-8cad-3a7374af30e7","Type":"ContainerStarted","Data":"0b978b30c0e1acb5a82743bafd0074aa28b88e7d42a2293ca045f786f56d7865"} Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.620064 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5zvch" event={"ID":"af87b4e5-15c0-48dc-9bc3-df39fcc24a53","Type":"ContainerStarted","Data":"a934e5063dc1a6491a25c700091cc4c10da791da1de4ec4ec21f7bbd94994c39"} Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.641312 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-547dbd544d-dpld6"] Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.650964 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator/migrator-866fcbc849-csf6b" event={"ID":"2da0aa85-6bbd-4fc9-b76a-00f1e51f8327","Type":"ContainerStarted","Data":"53c3d4284f8b8da334f31321cb2d1cb65a262bcd44d0103e911f6bfd4cadcdaf"} Mar 12 16:52:52 crc kubenswrapper[5184]: E0312 16:52:52.652312 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:53.152288987 +0000 UTC m=+115.693600326 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.651353 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.653758 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.655955 5184 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64d44f6ddf-qxthf"] Mar 12 16:52:52 crc kubenswrapper[5184]: E0312 16:52:52.659209 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:53.159185482 +0000 UTC m=+115.700496821 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.672775 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-5dxkx" event={"ID":"640f2f33-9bd1-4378-97fb-61f78501c171","Type":"ContainerStarted","Data":"9fae62a1062b75d53cadd67190368a37e7d50b6c613dc9f15b64049a0e3b8300"} Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.687576 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7f5c659b84-b2bj4" event={"ID":"ea99c433-2166-47f6-8c55-0787f78ff608","Type":"ContainerStarted","Data":"d6d87a4a6cc5a9e1e01f01479b204675fb95f5ee9c18e717c6eefbcd3a2d4496"} Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.699733 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=33.699708118 podStartE2EDuration="33.699708118s" podCreationTimestamp="2026-03-12 16:52:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:52.680282536 +0000 UTC m=+115.221593885" watchObservedRunningTime="2026-03-12 16:52:52.699708118 +0000 UTC m=+115.241019457" Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.714933 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-69db94689b-l8cgq"] Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.717182 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-6b9cb4dbcf-5vk9f"] Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.720145 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555565-ms5vz"] Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.725205 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-fnw98" event={"ID":"9e13a3e9-eeee-4c55-a87a-11959e9f7497","Type":"ContainerStarted","Data":"41f50ad5be65e12968c4d7b10f45c19055d9057c54e3818037838e1acbd09ab1"} Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.727176 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-cxwfx" event={"ID":"0266bd06-b813-4f14-b94a-24d31805b311","Type":"ContainerStarted","Data":"c2f535c1df9e195e27129b822f656db7630925683bb6f6c7fced0ea43ff4deee"} Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.731671 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-px2bg" event={"ID":"4acc340c-81c7-4011-b17f-9f83eadd540e","Type":"ContainerStarted","Data":"d899f66010d98a9fb8b938b3cdada40e9a1c7dfd592d372ac0f9fc5f1bc7e338"} Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.746939 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd-operator/etcd-operator-69b85846b6-5dwhg" event={"ID":"935f19f6-af87-48ad-bd81-641676250fdd","Type":"ContainerStarted","Data":"ae0e40f6df034cf33a9a2dd7866f6b19cb21f6aad880f263647f3ec0039cb610"} Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.748034 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-jskx5"] Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.750969 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-d8dpv" event={"ID":"8b9d7d9b-02cb-4871-aaff-673af3457aa4","Type":"ContainerStarted","Data":"dcd708bed23af34bed3fc415e3b38ed68b652cf9382c65ed86445226def58ca3"} Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.757291 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-w2ldh" event={"ID":"e8c42229-7663-48d0-a009-893c96840034","Type":"ContainerStarted","Data":"994db4731878156a574d9b75174b79b24fc56d31b7ead6ddba6719d7046c23da"} Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.760039 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:52 crc kubenswrapper[5184]: E0312 16:52:52.760590 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:53.260551986 +0000 UTC m=+115.801863325 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.760701 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.760774 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-mfddg" event={"ID":"1fa45036-e34b-4f40-9e02-838ed397f42e","Type":"ContainerStarted","Data":"eb2cb5d47dd8221ab9110c1ca2593ddd37669a6ecc4f7b94c66ecfdd9c7a092d"} Mar 12 16:52:52 crc kubenswrapper[5184]: E0312 16:52:52.761783 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:53.261769674 +0000 UTC m=+115.803081013 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.765184 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-dzzxj" event={"ID":"dcac2a22-f863-4ebe-8e3c-b88664a6c14d","Type":"ContainerStarted","Data":"bbeb9a98dbde6a1cf31b67a95e6681705f54e749584fdd0d029b85c1395fea59"} Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.770729 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-m7sz7" event={"ID":"b6b85787-d6d2-48df-9830-ca4532adee38","Type":"ContainerStarted","Data":"efb03ac53ee9e76723c39552ea85b0d2cd3693cccb028c6c419243531f7c0df8"} Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.775277 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-6bgzr" event={"ID":"b297812d-1aca-496c-a83e-72f4d8b54415","Type":"ContainerStarted","Data":"bf088351a73298356b07b3a34dd4d4ef7004b179a77cdf5202f1caba91718992"} Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.776355 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-6bgzr" Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.777777 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-f2fdq"] Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.781464 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-server-kv7dd" event={"ID":"caec81e5-958d-4139-aba7-2a5df11c25b1","Type":"ContainerStarted","Data":"48e5e5d175723908899e572bce5204dd53742818b163ea3b15235dc767eed9d8"} Mar 12 16:52:52 crc kubenswrapper[5184]: W0312 16:52:52.783318 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ea329fb_a095_4645_b64f_a5769efa6364.slice/crio-391b81a450cd67b5be9aa3f9a77494b116252263267e2b626b6d0ae0e8dd9045 WatchSource:0}: Error finding container 391b81a450cd67b5be9aa3f9a77494b116252263267e2b626b6d0ae0e8dd9045: Status 404 returned error can't find the container with id 391b81a450cd67b5be9aa3f9a77494b116252263267e2b626b6d0ae0e8dd9045 Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.789817 5184 patch_prober.go:28] interesting pod/packageserver-7d4fc7d867-6bgzr container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:5443/healthz\": dial tcp 10.217.0.35:5443: connect: connection refused" start-of-body= Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.789860 5184 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-6bgzr" podUID="b297812d-1aca-496c-a83e-72f4d8b54415" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.35:5443/healthz\": dial tcp 10.217.0.35:5443: connect: connection refused" Mar 12 16:52:52 crc kubenswrapper[5184]: W0312 16:52:52.816853 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc42a2703_d32e_41a7_accf_68b6e5d8c000.slice/crio-6a5e6e0a6ae46b3e434b251459a3375973490cec39b8820189bfa74a95465a90 WatchSource:0}: Error finding container 6a5e6e0a6ae46b3e434b251459a3375973490cec39b8820189bfa74a95465a90: Status 404 returned error can't find the container with id 
6a5e6e0a6ae46b3e434b251459a3375973490cec39b8820189bfa74a95465a90 Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.861489 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:52 crc kubenswrapper[5184]: E0312 16:52:52.862409 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:53.362343663 +0000 UTC m=+115.903655002 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:52 crc kubenswrapper[5184]: W0312 16:52:52.869399 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c1f5c98_0cd5_40ba_8de2_cc45b81e196b.slice/crio-33b12c91b238c29fda4fa2ba6d487fa7c94997d9605d03106c1a67b86d92f731 WatchSource:0}: Error finding container 33b12c91b238c29fda4fa2ba6d487fa7c94997d9605d03106c1a67b86d92f731: Status 404 returned error can't find the container with id 33b12c91b238c29fda4fa2ba6d487fa7c94997d9605d03106c1a67b86d92f731 Mar 12 16:52:52 crc kubenswrapper[5184]: W0312 16:52:52.870840 5184 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0414ebc4_50d1_4b9e_966a_693c0957a5a5.slice/crio-311c3863b1482b3618cfecce01bd3537536ef8e1f1415a2aada2aee8bd645020 WatchSource:0}: Error finding container 311c3863b1482b3618cfecce01bd3537536ef8e1f1415a2aada2aee8bd645020: Status 404 returned error can't find the container with id 311c3863b1482b3618cfecce01bd3537536ef8e1f1415a2aada2aee8bd645020 Mar 12 16:52:52 crc kubenswrapper[5184]: I0312 16:52:52.963447 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:52:52 crc kubenswrapper[5184]: E0312 16:52:52.963934 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:53.463920744 +0000 UTC m=+116.005232073 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:53 crc kubenswrapper[5184]: I0312 16:52:53.064218 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:53 crc kubenswrapper[5184]: E0312 16:52:53.064386 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:53.564345 +0000 UTC m=+116.105656339 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:53 crc kubenswrapper[5184]: I0312 16:52:53.064586 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:52:53 crc kubenswrapper[5184]: E0312 16:52:53.064901 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:53.564889577 +0000 UTC m=+116.106200916 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:53 crc kubenswrapper[5184]: I0312 16:52:53.167435 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:53 crc kubenswrapper[5184]: E0312 16:52:53.167834 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:53.667808359 +0000 UTC m=+116.209119698 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:53 crc kubenswrapper[5184]: I0312 16:52:53.168586 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:52:53 crc kubenswrapper[5184]: E0312 16:52:53.169398 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:53.669388519 +0000 UTC m=+116.210699858 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:53 crc kubenswrapper[5184]: I0312 16:52:53.234835 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" podStartSLOduration=94.234818398 podStartE2EDuration="1m34.234818398s" podCreationTimestamp="2026-03-12 16:51:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:53.199628956 +0000 UTC m=+115.740940305" watchObservedRunningTime="2026-03-12 16:52:53.234818398 +0000 UTC m=+115.776129737"
Mar 12 16:52:53 crc kubenswrapper[5184]: I0312 16:52:53.235917 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vxc4c"]
Mar 12 16:52:53 crc kubenswrapper[5184]: I0312 16:52:53.249310 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-65b6cccf98-bbnrv" podStartSLOduration=94.249290617 podStartE2EDuration="1m34.249290617s" podCreationTimestamp="2026-03-12 16:51:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:53.248469581 +0000 UTC m=+115.789780920" watchObservedRunningTime="2026-03-12 16:52:53.249290617 +0000 UTC m=+115.790601956"
Mar 12 16:52:53 crc kubenswrapper[5184]: I0312 16:52:53.269693 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 12 16:52:53 crc kubenswrapper[5184]: E0312 16:52:53.269877 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:53.769849045 +0000 UTC m=+116.311160384 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:53 crc kubenswrapper[5184]: I0312 16:52:53.270174 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:52:53 crc kubenswrapper[5184]: E0312 16:52:53.270508 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:53.770501525 +0000 UTC m=+116.311812864 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:53 crc kubenswrapper[5184]: I0312 16:52:53.317392 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-x7z5d" podStartSLOduration=93.317362369 podStartE2EDuration="1m33.317362369s" podCreationTimestamp="2026-03-12 16:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:53.317311837 +0000 UTC m=+115.858623176" watchObservedRunningTime="2026-03-12 16:52:53.317362369 +0000 UTC m=+115.858673708"
Mar 12 16:52:53 crc kubenswrapper[5184]: I0312 16:52:53.319348 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-6bgzr" podStartSLOduration=93.31933995 podStartE2EDuration="1m33.31933995s" podCreationTimestamp="2026-03-12 16:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:53.284143968 +0000 UTC m=+115.825455307" watchObservedRunningTime="2026-03-12 16:52:53.31933995 +0000 UTC m=+115.860651289"
Mar 12 16:52:53 crc kubenswrapper[5184]: I0312 16:52:53.364601 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-mpvq4" podStartSLOduration=94.364578303 podStartE2EDuration="1m34.364578303s" podCreationTimestamp="2026-03-12 16:51:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:53.358698901 +0000 UTC m=+115.900010240" watchObservedRunningTime="2026-03-12 16:52:53.364578303 +0000 UTC m=+115.905889642"
Mar 12 16:52:53 crc kubenswrapper[5184]: I0312 16:52:53.370938 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 12 16:52:53 crc kubenswrapper[5184]: E0312 16:52:53.371954 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:53.871929811 +0000 UTC m=+116.413241160 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:53 crc kubenswrapper[5184]: I0312 16:52:53.403775 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-7f5c659b84-b2bj4" podStartSLOduration=94.403755608 podStartE2EDuration="1m34.403755608s" podCreationTimestamp="2026-03-12 16:51:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:53.400826557 +0000 UTC m=+115.942137896" watchObservedRunningTime="2026-03-12 16:52:53.403755608 +0000 UTC m=+115.945066947"
Mar 12 16:52:53 crc kubenswrapper[5184]: I0312 16:52:53.473076 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:52:53 crc kubenswrapper[5184]: E0312 16:52:53.473416 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:53.973403999 +0000 UTC m=+116.514715338 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:53 crc kubenswrapper[5184]: I0312 16:52:53.496930 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-747b44746d-t6987" podStartSLOduration=94.496915128 podStartE2EDuration="1m34.496915128s" podCreationTimestamp="2026-03-12 16:51:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:53.495557186 +0000 UTC m=+116.036868525" watchObservedRunningTime="2026-03-12 16:52:53.496915128 +0000 UTC m=+116.038226467"
Mar 12 16:52:53 crc kubenswrapper[5184]: I0312 16:52:53.573692 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 12 16:52:53 crc kubenswrapper[5184]: E0312 16:52:53.574038 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:54.07402237 +0000 UTC m=+116.615333709 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:53 crc kubenswrapper[5184]: W0312 16:52:53.575075 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf863fff9_286a_45fa_b8f0_8a86994b8440.slice/crio-d9af4e7b3f8d1d5a1031bcba86dc6c82484f8504f606ffc6f49e3ec809dd66fb WatchSource:0}: Error finding container d9af4e7b3f8d1d5a1031bcba86dc6c82484f8504f606ffc6f49e3ec809dd66fb: Status 404 returned error can't find the container with id d9af4e7b3f8d1d5a1031bcba86dc6c82484f8504f606ffc6f49e3ec809dd66fb
Mar 12 16:52:53 crc kubenswrapper[5184]: I0312 16:52:53.678046 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:52:53 crc kubenswrapper[5184]: E0312 16:52:53.678546 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:54.178526202 +0000 UTC m=+116.719837541 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:53 crc kubenswrapper[5184]: I0312 16:52:53.779103 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 12 16:52:53 crc kubenswrapper[5184]: E0312 16:52:53.779307 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:54.279272327 +0000 UTC m=+116.820583666 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:53 crc kubenswrapper[5184]: I0312 16:52:53.779613 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:52:53 crc kubenswrapper[5184]: E0312 16:52:53.779973 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:54.279950808 +0000 UTC m=+116.821262137 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:53 crc kubenswrapper[5184]: I0312 16:52:53.837612 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-68cf44c8b8-7pgjs" event={"ID":"e1483fd4-8f3f-4326-874c-19e9c796d809","Type":"ContainerStarted","Data":"e0e936ef3f58988d458f2d4da8ca7593759e6bf5d4c6e7b035cec60cace71525"}
Mar 12 16:52:53 crc kubenswrapper[5184]: I0312 16:52:53.846909 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-w2ldh" event={"ID":"e8c42229-7663-48d0-a009-893c96840034","Type":"ContainerStarted","Data":"6ee9ca3ece8388ce9ddc674976f36ece33598b5b44c4959621c25ff5a16cba66"}
Mar 12 16:52:53 crc kubenswrapper[5184]: I0312 16:52:53.866005 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-74545575db-6qpvf" event={"ID":"22a20570-e6d3-4f7b-b8fd-bc3cf5716448","Type":"ContainerStarted","Data":"e20fec01e85b148387f483aea12ff12d0d00764189eea6f2d269c755d0267126"}
Mar 12 16:52:53 crc kubenswrapper[5184]: I0312 16:52:53.874799 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-68cf44c8b8-7pgjs" podStartSLOduration=93.87478294 podStartE2EDuration="1m33.87478294s" podCreationTimestamp="2026-03-12 16:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:53.872606582 +0000 UTC m=+116.413917941" watchObservedRunningTime="2026-03-12 16:52:53.87478294 +0000 UTC m=+116.416094279"
Mar 12 16:52:53 crc kubenswrapper[5184]: I0312 16:52:53.885510 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 12 16:52:53 crc kubenswrapper[5184]: E0312 16:52:53.886664 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:54.386649627 +0000 UTC m=+116.927960966 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:53 crc kubenswrapper[5184]: I0312 16:52:53.920785 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-54c688565-jqlh7" event={"ID":"20de7db3-2a1d-49b2-a756-3ef5b88fbfcc","Type":"ContainerStarted","Data":"dfdba2ccf20f3b23b9f3cfda545f877db691ad95a16b608330951fa82e3c93e7"}
Mar 12 16:52:53 crc kubenswrapper[5184]: I0312 16:52:53.930309 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-755bb95488-d29hz" event={"ID":"36cf2a72-16f1-4b2c-9d20-9d1ad0af2ce6","Type":"ContainerStarted","Data":"758b78bdc318924bd1998420db7e1dea751d65f1b1ed8fa0b76c181fbabc49dc"}
Mar 12 16:52:53 crc kubenswrapper[5184]: I0312 16:52:53.930367 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-755bb95488-d29hz" event={"ID":"36cf2a72-16f1-4b2c-9d20-9d1ad0af2ce6","Type":"ContainerStarted","Data":"697ccaacf7953b24129fe42204e5365471606af113ec5b6e52c6f04bb1eb56cf"}
Mar 12 16:52:53 crc kubenswrapper[5184]: I0312 16:52:53.935718 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-jskx5" event={"ID":"38127f6e-375c-4fe3-9070-1a9da91aa12f","Type":"ContainerStarted","Data":"00d48c11e54662d9fc18e3d1518ea10e59497bed781aac0238b9141d9820467a"}
Mar 12 16:52:53 crc kubenswrapper[5184]: I0312 16:52:53.939429 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-54c688565-jqlh7" podStartSLOduration=94.939414235 podStartE2EDuration="1m34.939414235s" podCreationTimestamp="2026-03-12 16:51:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:53.936768053 +0000 UTC m=+116.478079392" watchObservedRunningTime="2026-03-12 16:52:53.939414235 +0000 UTC m=+116.480725574"
Mar 12 16:52:53 crc kubenswrapper[5184]: I0312 16:52:53.961991 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-755bb95488-d29hz" podStartSLOduration=93.961973255 podStartE2EDuration="1m33.961973255s" podCreationTimestamp="2026-03-12 16:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:53.959877219 +0000 UTC m=+116.501188558" watchObservedRunningTime="2026-03-12 16:52:53.961973255 +0000 UTC m=+116.503284594"
Mar 12 16:52:53 crc kubenswrapper[5184]: I0312 16:52:53.977507 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-m7sz7" event={"ID":"b6b85787-d6d2-48df-9830-ca4532adee38","Type":"ContainerStarted","Data":"a07985a5c65746324d5b0dd345917a216108d17b43b08b9b0a96c9ac87b616dc"}
Mar 12 16:52:53 crc kubenswrapper[5184]: I0312 16:52:53.990193 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:52:53 crc kubenswrapper[5184]: E0312 16:52:53.992453 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:54.49243465 +0000 UTC m=+117.033745989 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:53 crc kubenswrapper[5184]: I0312 16:52:53.993281 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-kv7dd" event={"ID":"caec81e5-958d-4139-aba7-2a5df11c25b1","Type":"ContainerStarted","Data":"e95cb5aea8c064e3740a0bc6ccec54e723d931b65d558fba30910556748c7f98"}
Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.002365 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-m7sz7" podStartSLOduration=94.002340506 podStartE2EDuration="1m34.002340506s" podCreationTimestamp="2026-03-12 16:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:53.998545089 +0000 UTC m=+116.539856428" watchObservedRunningTime="2026-03-12 16:52:54.002340506 +0000 UTC m=+116.543651845"
Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.021614 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-kv7dd" podStartSLOduration=7.021594764 podStartE2EDuration="7.021594764s" podCreationTimestamp="2026-03-12 16:52:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:54.020316474 +0000 UTC m=+116.561627813" watchObservedRunningTime="2026-03-12 16:52:54.021594764 +0000 UTC m=+116.562906103"
Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.038552 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-69db94689b-l8cgq" event={"ID":"0414ebc4-50d1-4b9e-966a-693c0957a5a5","Type":"ContainerStarted","Data":"311c3863b1482b3618cfecce01bd3537536ef8e1f1415a2aada2aee8bd645020"}
Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.058327 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-5dxkx" event={"ID":"640f2f33-9bd1-4378-97fb-61f78501c171","Type":"ContainerStarted","Data":"a640a40501e18211e133fd725df48906eb7227755ae83678278bb06ecc4fdbc4"}
Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.073816 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" event={"ID":"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141","Type":"ContainerStarted","Data":"ab5606d9df5a6f9ee165066ef9f2eeba08ee1878786088f7c550eef1f23f4ac6"}
Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.091574 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 12 16:52:54 crc kubenswrapper[5184]: E0312 16:52:54.092111 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:54.592091861 +0000 UTC m=+117.133403200 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.095615 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64d44f6ddf-qxthf" event={"ID":"c42a2703-d32e-41a7-accf-68b6e5d8c000","Type":"ContainerStarted","Data":"6a5e6e0a6ae46b3e434b251459a3375973490cec39b8820189bfa74a95465a90"}
Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.098856 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555565-ms5vz" event={"ID":"8ea329fb-a095-4645-b64f-a5769efa6364","Type":"ContainerStarted","Data":"391b81a450cd67b5be9aa3f9a77494b116252263267e2b626b6d0ae0e8dd9045"}
Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.130524 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-799b87ffcd-rqzhf" event={"ID":"1af1595b-1a79-438d-99a0-dd34b32cfcda","Type":"ContainerStarted","Data":"06390594defef07d05a73aa16c4c682052cf5fd825d2eaf787830891cfd50015"}
Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.149590 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-b6v2k" event={"ID":"70e9334c-b259-45e5-88a3-6909ce233bda","Type":"ContainerStarted","Data":"af71212a17e25972df7b22207887332371e637d8d09c1c151cd0ed1f4ecd4686"}
Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.175495 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-5vk9f" event={"ID":"6c1f5c98-0cd5-40ba-8de2-cc45b81e196b","Type":"ContainerStarted","Data":"33b12c91b238c29fda4fa2ba6d487fa7c94997d9605d03106c1a67b86d92f731"}
Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.186230 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-b6v2k" podStartSLOduration=95.18620827 podStartE2EDuration="1m35.18620827s" podCreationTimestamp="2026-03-12 16:51:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:54.185455467 +0000 UTC m=+116.726766806" watchObservedRunningTime="2026-03-12 16:52:54.18620827 +0000 UTC m=+116.727519609"
Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.187877 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-799b87ffcd-rqzhf" podStartSLOduration=95.187870402 podStartE2EDuration="1m35.187870402s" podCreationTimestamp="2026-03-12 16:51:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:54.153945029 +0000 UTC m=+116.695256368" watchObservedRunningTime="2026-03-12 16:52:54.187870402 +0000 UTC m=+116.729181741"
Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.193099 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:52:54 crc kubenswrapper[5184]: E0312 16:52:54.193832 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:54.693816046 +0000 UTC m=+117.235127385 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.198960 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-68cf44c8b8-7pgjs"
Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.204447 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-px2bg" event={"ID":"4acc340c-81c7-4011-b17f-9f83eadd540e","Type":"ContainerStarted","Data":"39a740365ac3cb04e7d386b8f8ac8cc7c196a927c2a427db57388687c917b2e1"}
Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.204941 5184 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-7pgjs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 12 16:52:54 crc kubenswrapper[5184]: [-]has-synced failed: reason withheld
Mar 12 16:52:54 crc kubenswrapper[5184]: [+]process-running ok
Mar 12 16:52:54 crc kubenswrapper[5184]: healthz check failed
Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.204991 5184 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-7pgjs" podUID="e1483fd4-8f3f-4326-874c-19e9c796d809" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.245132 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-px2bg" podStartSLOduration=95.245111207 podStartE2EDuration="1m35.245111207s" podCreationTimestamp="2026-03-12 16:51:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:54.23972125 +0000 UTC m=+116.781032599" watchObservedRunningTime="2026-03-12 16:52:54.245111207 +0000 UTC m=+116.786422546"
Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.295364 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 12 16:52:54 crc kubenswrapper[5184]: E0312 16:52:54.296506 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:54.796483371 +0000 UTC m=+117.337794710 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.373258 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-69b85846b6-5dwhg" event={"ID":"935f19f6-af87-48ad-bd81-641676250fdd","Type":"ContainerStarted","Data":"c2d37ce14d721c28cc01ab469bcedbf1fa9feb24f6e63bdd4d2fa4bf3cfe8532"}
Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.399396 5184 generic.go:358] "Generic (PLEG): container finished" podID="bbfdedba-967f-4b86-b7bd-a81854132b50" containerID="54542bed427ec3f1215c59142993ae0aecb333dbfc212ed9b7f9bb00e648626a" exitCode=0
Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.399506 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-5777786469-n9g8v" event={"ID":"bbfdedba-967f-4b86-b7bd-a81854132b50","Type":"ContainerDied","Data":"54542bed427ec3f1215c59142993ae0aecb333dbfc212ed9b7f9bb00e648626a"}
Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.400410 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:52:54 crc kubenswrapper[5184]: E0312 16:52:54.400838 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:54.900821168 +0000 UTC m=+117.442132507 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.426492 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-mfddg" event={"ID":"1fa45036-e34b-4f40-9e02-838ed397f42e","Type":"ContainerStarted","Data":"bc94ff7bce93cfc10ce3b7aec4e1ddbe55d8dd4a6b21b43627e7ed30ccab10de"}
Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.427275 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-mfddg"
Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.436585 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-69b85846b6-5dwhg" podStartSLOduration=95.436558206 podStartE2EDuration="1m35.436558206s" podCreationTimestamp="2026-03-12 16:51:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:54.39282975 +0000 UTC m=+116.934141099" watchObservedRunningTime="2026-03-12 16:52:54.436558206 +0000 UTC m=+116.977869555"
Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.439885 5184 patch_prober.go:28] interesting pod/olm-operator-5cdf44d969-mfddg container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body=
Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.439938 5184 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-mfddg" podUID="1fa45036-e34b-4f40-9e02-838ed397f42e" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused"
Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.462902 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-dzzxj" event={"ID":"dcac2a22-f863-4ebe-8e3c-b88664a6c14d","Type":"ContainerStarted","Data":"252eb5de1fb609c8428ee188e73c6af46f586b4146a3ceb4f89eac759965c89b"}
Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.463060 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-mfddg" podStartSLOduration=94.463037958 podStartE2EDuration="1m34.463037958s" podCreationTimestamp="2026-03-12 16:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:54.46151051 +0000 UTC m=+117.002821849" watchObservedRunningTime="2026-03-12 16:52:54.463037958 +0000 UTC m=+117.004349297"
Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.501407 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 12 16:52:54 crc kubenswrapper[5184]: E0312 16:52:54.502976 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:55.002942075 +0000 UTC m=+117.544253424 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.503202 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:52:54 crc kubenswrapper[5184]: E0312 16:52:54.504426 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:55.004413311 +0000 UTC m=+117.545724750 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.511619 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-fhkjl" event={"ID":"17b87002-b798-480a-8e17-83053d698239","Type":"ContainerStarted","Data":"0f571537ccac724044d9b01a62e155d05f8a42fdb3cc7351ce61980320c59ca2"} Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.512879 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-f99dz" event={"ID":"d980efcf-1159-448a-ac6c-4ee5ddff2b66","Type":"ContainerStarted","Data":"53c3bc1c3a6f15db38e3fb13a1d7b171e37f05cf33b15d80e4352cd2285af597"} Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.521710 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-p29gv" event={"ID":"781710ac-8789-42cf-983e-f7de329e4e81","Type":"ContainerStarted","Data":"bf77b230d5b44efea3415bdddda17d175e6a8de172b71ea90053045121e833bb"} Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.534517 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-6bgzr" event={"ID":"b297812d-1aca-496c-a83e-72f4d8b54415","Type":"ContainerStarted","Data":"64d2316a87dba523283c9ec0e8e4eda71d47fea7847bde01c04fef00a403d88c"} Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.550265 5184 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-dzzxj" podStartSLOduration=94.550246352 podStartE2EDuration="1m34.550246352s" podCreationTimestamp="2026-03-12 16:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:54.486855766 +0000 UTC m=+117.028167135" watchObservedRunningTime="2026-03-12 16:52:54.550246352 +0000 UTC m=+117.091557691" Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.554280 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vxc4c" event={"ID":"024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df","Type":"ContainerStarted","Data":"8336288c82e12c7f42a766fa8dd124f373b0fdecfb0019d6ce560236cfd3653b"} Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.557587 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-f2fdq" event={"ID":"e947b0fa-3e07-4965-b693-8857cd4b98fd","Type":"ContainerStarted","Data":"fb2cddbe96b41635b620eb832f50f16ecf7bc78b6c132912756736821b8537e0"} Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.582723 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-547dbd544d-dpld6" event={"ID":"5ad036a8-381e-4761-a20f-8d8b9a3e9408","Type":"ContainerStarted","Data":"73a2dcd0415c36c1f0542204ea1babe908c9e7a2189f4c874f45c0760ae543c4"} Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.583706 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/marketplace-operator-547dbd544d-dpld6" Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.588442 5184 patch_prober.go:28] interesting pod/marketplace-operator-547dbd544d-dpld6 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" 
start-of-body= Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.588753 5184 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-547dbd544d-dpld6" podUID="5ad036a8-381e-4761-a20f-8d8b9a3e9408" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.597998 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-67c89758df-ssqr6" event={"ID":"8d1cd7e2-2062-46cc-8550-b41afd9716f4","Type":"ContainerStarted","Data":"78f4185773f77525d3aa359c47dc5d5ce805921d2ce3712f5b656d44650447ad"} Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.599181 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-67c89758df-ssqr6" Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.600142 5184 patch_prober.go:28] interesting pod/console-operator-67c89758df-ssqr6 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/readyz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.601870 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-p29gv" podStartSLOduration=94.601853614 podStartE2EDuration="1m34.601853614s" podCreationTimestamp="2026-03-12 16:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:54.550961065 +0000 UTC m=+117.092272434" watchObservedRunningTime="2026-03-12 16:52:54.601853614 +0000 UTC m=+117.143164943" Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 
16:52:54.603030 5184 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-67c89758df-ssqr6" podUID="8d1cd7e2-2062-46cc-8550-b41afd9716f4" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/readyz\": dial tcp 10.217.0.34:8443: connect: connection refused" Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.603950 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:54 crc kubenswrapper[5184]: E0312 16:52:54.604010 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:55.10399119 +0000 UTC m=+117.645302529 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.604676 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:52:54 crc kubenswrapper[5184]: E0312 16:52:54.606605 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:55.10659262 +0000 UTC m=+117.647903959 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.614642 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-866fcbc849-csf6b" event={"ID":"2da0aa85-6bbd-4fc9-b76a-00f1e51f8327","Type":"ContainerStarted","Data":"a8a63187f8224f5c8ba936ad576c884bd097619eda4dba219c91969fed83d138"} Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.629421 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" event={"ID":"f863fff9-286a-45fa-b8f0-8a86994b8440","Type":"ContainerStarted","Data":"d9af4e7b3f8d1d5a1031bcba86dc6c82484f8504f606ffc6f49e3ec809dd66fb"} Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.633253 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-547dbd544d-dpld6" podStartSLOduration=94.633232867 podStartE2EDuration="1m34.633232867s" podCreationTimestamp="2026-03-12 16:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:54.603412252 +0000 UTC m=+117.144723591" watchObservedRunningTime="2026-03-12 16:52:54.633232867 +0000 UTC m=+117.174544206" Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.641672 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-fnw98" 
event={"ID":"9e13a3e9-eeee-4c55-a87a-11959e9f7497","Type":"ContainerStarted","Data":"2baa33d79aa3aa6d4b01f3f12a039fa1da82b4b0d973ae2d3f4ab6ded882dab6"} Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.670972 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-tr5c8" event={"ID":"f06f8d9b-0e08-4f09-af38-cd987ab002f9","Type":"ContainerStarted","Data":"0da9ad8a9329f9a55ddb3f0a0de9a92ad5c920fc86782c68c67b63cedfe0ff8f"} Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.671329 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-controller-manager/controller-manager-65b6cccf98-bbnrv" Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.671606 5184 patch_prober.go:28] interesting pod/downloads-747b44746d-t6987 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.671660 5184 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-747b44746d-t6987" podUID="6f45ff33-e60b-4885-ac63-5ab182bf6320" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.689864 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-65b6cccf98-bbnrv" Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.695956 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-67c89758df-ssqr6" podStartSLOduration=95.695931662 podStartE2EDuration="1m35.695931662s" podCreationTimestamp="2026-03-12 16:51:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:54.645055884 +0000 UTC m=+117.186367283" watchObservedRunningTime="2026-03-12 16:52:54.695931662 +0000 UTC m=+117.237243001" Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.706368 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:54 crc kubenswrapper[5184]: E0312 16:52:54.708271 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:55.208249184 +0000 UTC m=+117.749560523 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.732074 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-866fcbc849-csf6b" podStartSLOduration=94.732055022 podStartE2EDuration="1m34.732055022s" podCreationTimestamp="2026-03-12 16:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:54.698960026 +0000 UTC m=+117.240271375" watchObservedRunningTime="2026-03-12 16:52:54.732055022 +0000 UTC m=+117.273366361" Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.810319 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:52:54 crc kubenswrapper[5184]: E0312 16:52:54.810765 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:55.310752183 +0000 UTC m=+117.852063522 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.830016 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-fnw98" podStartSLOduration=94.829998331 podStartE2EDuration="1m34.829998331s" podCreationTimestamp="2026-03-12 16:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:54.784030625 +0000 UTC m=+117.325341984" watchObservedRunningTime="2026-03-12 16:52:54.829998331 +0000 UTC m=+117.371309670" Mar 12 16:52:54 crc kubenswrapper[5184]: I0312 16:52:54.912123 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:54 crc kubenswrapper[5184]: E0312 16:52:54.912366 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:55.412350045 +0000 UTC m=+117.953661384 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.013312 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:52:55 crc kubenswrapper[5184]: E0312 16:52:55.013825 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:55.513813803 +0000 UTC m=+118.055125142 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.114556 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:55 crc kubenswrapper[5184]: E0312 16:52:55.114682 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:55.614667291 +0000 UTC m=+118.155978630 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.114851 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:52:55 crc kubenswrapper[5184]: E0312 16:52:55.115182 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:55.615172977 +0000 UTC m=+118.156484316 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.206333 5184 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-7pgjs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 16:52:55 crc kubenswrapper[5184]: [-]has-synced failed: reason withheld Mar 12 16:52:55 crc kubenswrapper[5184]: [+]process-running ok Mar 12 16:52:55 crc kubenswrapper[5184]: healthz check failed Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.206611 5184 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-7pgjs" podUID="e1483fd4-8f3f-4326-874c-19e9c796d809" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.215507 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:55 crc kubenswrapper[5184]: E0312 16:52:55.215831 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. 
No retries permitted until 2026-03-12 16:52:55.715813019 +0000 UTC m=+118.257124358 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:55 crc kubenswrapper[5184]: E0312 16:52:55.320914 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:55.820900488 +0000 UTC m=+118.362211827 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.320644 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.421983 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:55 crc kubenswrapper[5184]: E0312 16:52:55.422101 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:55.922082137 +0000 UTC m=+118.463393486 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.422233 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:52:55 crc kubenswrapper[5184]: E0312 16:52:55.422586 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:55.922577363 +0000 UTC m=+118.463888702 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.523290 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 12 16:52:55 crc kubenswrapper[5184]: E0312 16:52:55.523486 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:56.023460622 +0000 UTC m=+118.564771961 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.523943 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:52:55 crc kubenswrapper[5184]: E0312 16:52:55.524268 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:56.024252047 +0000 UTC m=+118.565563386 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.542243 5184 patch_prober.go:28] interesting pod/packageserver-7d4fc7d867-6bgzr container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:5443/healthz\": context deadline exceeded" start-of-body=
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.542307 5184 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-6bgzr" podUID="b297812d-1aca-496c-a83e-72f4d8b54415" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.35:5443/healthz\": context deadline exceeded"
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.625551 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 12 16:52:55 crc kubenswrapper[5184]: E0312 16:52:55.625723 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:56.125696963 +0000 UTC m=+118.667008302 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.625960 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:52:55 crc kubenswrapper[5184]: E0312 16:52:55.626284 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:56.126271821 +0000 UTC m=+118.667583160 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.704077 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-5777786469-n9g8v" event={"ID":"bbfdedba-967f-4b86-b7bd-a81854132b50","Type":"ContainerStarted","Data":"b5ac14c29e7d62637176f980f28e443b28ec01b3dff313b00b0398cf28debfd8"}
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.704240 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-config-operator/openshift-config-operator-5777786469-n9g8v"
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.713893 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-fhkjl" event={"ID":"17b87002-b798-480a-8e17-83053d698239","Type":"ContainerStarted","Data":"0cc4dbe924d34d328d5afd48bf7faafbb18913b639e13549d3a25e9350ac9d7a"}
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.714053 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-fhkjl"
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.726646 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 12 16:52:55 crc kubenswrapper[5184]: E0312 16:52:55.726811 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:56.226794509 +0000 UTC m=+118.768105848 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.726910 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:52:55 crc kubenswrapper[5184]: E0312 16:52:55.727201 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:56.227191192 +0000 UTC m=+118.768502531 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.728169 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-f99dz" event={"ID":"d980efcf-1159-448a-ac6c-4ee5ddff2b66","Type":"ContainerStarted","Data":"b44815626501e0c45791df94a2a43b65133bb5f2b0da545bb0ce67b642282111"}
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.728218 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-f99dz" event={"ID":"d980efcf-1159-448a-ac6c-4ee5ddff2b66","Type":"ContainerStarted","Data":"d1f8ce5951c7f405a83d115566af9025f7e97fb39941397c605870e108a2f339"}
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.728438 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-f99dz"
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.746400 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vxc4c" event={"ID":"024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df","Type":"ContainerStarted","Data":"ac818802aa9866af7c128e9a5e0929d9228712faae20256c0c9e7fea1ec9f689"}
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.758089 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-f2fdq" event={"ID":"e947b0fa-3e07-4965-b693-8857cd4b98fd","Type":"ContainerStarted","Data":"c64f02fb048a918784ff334fee4fd289b7cb025a467e129fe3ea798cb1e5d2cb"}
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.759830 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-5777786469-n9g8v" podStartSLOduration=96.759818814 podStartE2EDuration="1m36.759818814s" podCreationTimestamp="2026-03-12 16:51:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:55.758061249 +0000 UTC m=+118.299372588" watchObservedRunningTime="2026-03-12 16:52:55.759818814 +0000 UTC m=+118.301130153"
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.762990 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-tr5c8" podStartSLOduration=7.7629790419999996 podStartE2EDuration="7.762979042s" podCreationTimestamp="2026-03-12 16:52:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:54.830343491 +0000 UTC m=+117.371654830" watchObservedRunningTime="2026-03-12 16:52:55.762979042 +0000 UTC m=+118.304290381"
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.763612 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-f2fdq"
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.763650 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-f2fdq" event={"ID":"e947b0fa-3e07-4965-b693-8857cd4b98fd","Type":"ContainerStarted","Data":"0c7ce1b74424fd673e1fb9f3ff5c4802f076a43b3b60dd4d6e386889b68593af"}
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.768760 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-547dbd544d-dpld6" event={"ID":"5ad036a8-381e-4761-a20f-8d8b9a3e9408","Type":"ContainerStarted","Data":"babc3dcdcf33f05dd5b318ec86df41d4bb1dcc045f74fbf083f61c9eb7d044e2"}
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.769839 5184 patch_prober.go:28] interesting pod/marketplace-operator-547dbd544d-dpld6 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body=
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.769879 5184 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-547dbd544d-dpld6" podUID="5ad036a8-381e-4761-a20f-8d8b9a3e9408" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused"
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.772103 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-8596bd845d-ndv6q" event={"ID":"5db01dce-a574-4dbd-97a9-582f0f357bda","Type":"ContainerStarted","Data":"e9c3addf9a44b73207088f467e07415b92725a3d4177ecc79040b7174724d9f8"}
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.797024 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-866fcbc849-csf6b" event={"ID":"2da0aa85-6bbd-4fc9-b76a-00f1e51f8327","Type":"ContainerStarted","Data":"892a18c83fdc856edf13ce81e9fa4129524bd2e5bf788924b2ca6e643b40778c"}
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.823085 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" event={"ID":"f863fff9-286a-45fa-b8f0-8a86994b8440","Type":"ContainerStarted","Data":"26e9aa000a1eb77a3b3375a85cdc9beb23e3ef0f5fe8fb295b0a3e451a57a79d"}
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.831658 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 12 16:52:55 crc kubenswrapper[5184]: E0312 16:52:55.831815 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:56.331785647 +0000 UTC m=+118.873096986 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.831898 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:52:55 crc kubenswrapper[5184]: E0312 16:52:55.834235 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:56.334217672 +0000 UTC m=+118.875529091 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.846867 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-tr5c8" event={"ID":"f06f8d9b-0e08-4f09-af38-cd987ab002f9","Type":"ContainerStarted","Data":"9b57ec6596c08e5c2007a306bfb157901d9f0c799a11663d81e9a3706d9dddbd"}
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.865703 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-cxwfx" event={"ID":"0266bd06-b813-4f14-b94a-24d31805b311","Type":"ContainerStarted","Data":"27e754d9aebe37c857ad9304f048b23bcce06a5e9682ba9d1151e264e8791389"}
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.888945 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-d8dpv" event={"ID":"8b9d7d9b-02cb-4871-aaff-673af3457aa4","Type":"ContainerStarted","Data":"126b5fb0faf6621c7da01160985e6bb4e7b98abc2e7b937baee190e223f97a03"}
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.897917 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-f2fdq" podStartSLOduration=7.897903287 podStartE2EDuration="7.897903287s" podCreationTimestamp="2026-03-12 16:52:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:55.896853984 +0000 UTC m=+118.438165323" watchObservedRunningTime="2026-03-12 16:52:55.897903287 +0000 UTC m=+118.439214626"
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.899674 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-f99dz" podStartSLOduration=95.899653561 podStartE2EDuration="1m35.899653561s" podCreationTimestamp="2026-03-12 16:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:55.830750514 +0000 UTC m=+118.372061853" watchObservedRunningTime="2026-03-12 16:52:55.899653561 +0000 UTC m=+118.440964890"
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.910633 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-w2ldh" event={"ID":"e8c42229-7663-48d0-a009-893c96840034","Type":"ContainerStarted","Data":"56a279d54859933f55493df7de6894c91b5b0d65d599dda12ac0cb9bdd56b6e3"}
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.933361 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 12 16:52:55 crc kubenswrapper[5184]: E0312 16:52:55.934390 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:56.434360778 +0000 UTC m=+118.975672117 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.944734 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-74545575db-6qpvf" event={"ID":"22a20570-e6d3-4f7b-b8fd-bc3cf5716448","Type":"ContainerStarted","Data":"ef17b3365797ebf1e40ad73322317156726e1ca7d32c3caac8f502b692ab8d99"}
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.964650 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-jskx5" event={"ID":"38127f6e-375c-4fe3-9070-1a9da91aa12f","Type":"ContainerStarted","Data":"c577c4a197036faaf8a204d94e0f3706babb00493f75b34132001253c63bc356"}
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.965535 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-jskx5"
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.969188 5184 patch_prober.go:28] interesting pod/catalog-operator-75ff9f647d-jskx5 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body=
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.969231 5184 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-jskx5" podUID="38127f6e-375c-4fe3-9070-1a9da91aa12f" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused"
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.991691 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-8596bd845d-ndv6q" podStartSLOduration=95.991674857 podStartE2EDuration="1m35.991674857s" podCreationTimestamp="2026-03-12 16:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:55.990133718 +0000 UTC m=+118.531445057" watchObservedRunningTime="2026-03-12 16:52:55.991674857 +0000 UTC m=+118.532986196"
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.999421 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" event={"ID":"beb08b86-8593-4511-8bce-ea5f1d44f795","Type":"ContainerStarted","Data":"cf954cf7005249ec891196080ddf9dab7e47d40e8b5f3ae467eab00dc2636781"}
Mar 12 16:52:55 crc kubenswrapper[5184]: I0312 16:52:55.999481 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" event={"ID":"beb08b86-8593-4511-8bce-ea5f1d44f795","Type":"ContainerStarted","Data":"c78a38b472259546a5eae1d9656ddf452a775a25b93cc384001344f9661bb99f"}
Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.014841 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-wll7m" event={"ID":"73142984-30ee-40e5-8fdd-d024df118964","Type":"ContainerStarted","Data":"b4164a44e285c0f1542840fd1d130d10fbde15cfc56fd51b23f50aae168ce845"}
Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.014901 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-wll7m" event={"ID":"73142984-30ee-40e5-8fdd-d024df118964","Type":"ContainerStarted","Data":"808f50e56582bf77fecee8a4d312bac0f1c72fa98f06226a4aa54f0aec52bb11"}
Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.021755 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-69db94689b-l8cgq" event={"ID":"0414ebc4-50d1-4b9e-966a-693c0957a5a5","Type":"ContainerStarted","Data":"edaf650a1656f5ffcc3b6377d619041912120e6d01036f565f40fd6abb9f83b3"}
Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.032932 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-fm2vq" event={"ID":"4840f833-3dce-444b-8cad-3a7374af30e7","Type":"ContainerStarted","Data":"33f536e52d195ff2fcade83b7dd83eab23beb4f297d841b0ca600526a0ae7614"}
Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.033656 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-multus/cni-sysctl-allowlist-ds-fm2vq"
Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.034345 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:52:56 crc kubenswrapper[5184]: E0312 16:52:56.035556 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:56.535539207 +0000 UTC m=+119.076850616 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.038566 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-jskx5" podStartSLOduration=96.038549601 podStartE2EDuration="1m36.038549601s" podCreationTimestamp="2026-03-12 16:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:56.014579566 +0000 UTC m=+118.555890905" watchObservedRunningTime="2026-03-12 16:52:56.038549601 +0000 UTC m=+118.579860940"
Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.038781 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-74545575db-6qpvf" podStartSLOduration=96.038777358 podStartE2EDuration="1m36.038777358s" podCreationTimestamp="2026-03-12 16:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:56.036533858 +0000 UTC m=+118.577845207" watchObservedRunningTime="2026-03-12 16:52:56.038777358 +0000 UTC m=+118.580088697"
Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.062745 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-5dxkx" event={"ID":"640f2f33-9bd1-4378-97fb-61f78501c171","Type":"ContainerStarted","Data":"0f1753600749bf37f6470954add68453ae1de31139f1e2c88f1c7f6027a726c2"}
Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.063927 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-d8dpv" podStartSLOduration=96.063915857 podStartE2EDuration="1m36.063915857s" podCreationTimestamp="2026-03-12 16:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:56.062529824 +0000 UTC m=+118.603841163" watchObservedRunningTime="2026-03-12 16:52:56.063915857 +0000 UTC m=+118.605227196"
Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.074761 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" event={"ID":"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141","Type":"ContainerStarted","Data":"d748b81469e8bff71035f4d66e5f8d077a865fba01e30168d0bcc3469e9d58fa"}
Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.096757 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64d44f6ddf-qxthf" event={"ID":"c42a2703-d32e-41a7-accf-68b6e5d8c000","Type":"ContainerStarted","Data":"86b97c4075df4c157b75f64f46962f38af79bd66cd6d9bd3ec8b76413cb4c932"}
Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.098453 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-cxwfx" podStartSLOduration=96.098442038 podStartE2EDuration="1m36.098442038s" podCreationTimestamp="2026-03-12 16:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:56.096687124 +0000 UTC m=+118.637998453" watchObservedRunningTime="2026-03-12 16:52:56.098442038 +0000 UTC m=+118.639753377"
Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.110175 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-fm2vq"
Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.117628 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555565-ms5vz" event={"ID":"8ea329fb-a095-4645-b64f-a5769efa6364","Type":"ContainerStarted","Data":"95aff8907da74210f0f5111c0984339b383948ad0229faf5ce3eabf1cafca580"}
Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.133564 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-5vk9f" event={"ID":"6c1f5c98-0cd5-40ba-8de2-cc45b81e196b","Type":"ContainerStarted","Data":"2f4a27f8f38abede6b4214928c778f7d0b2db498e462262896c36b818c24f086"}
Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.133601 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-5vk9f" event={"ID":"6c1f5c98-0cd5-40ba-8de2-cc45b81e196b","Type":"ContainerStarted","Data":"76739312ef90d29ddc74635cda8252d1dd27e0e0bd2f4a9cba9a5b3b607a9e68"}
Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.135645 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 12 16:52:56 crc kubenswrapper[5184]: E0312 16:52:56.135918 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:56.63588726 +0000 UTC m=+119.177198589 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.136652 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:52:56 crc kubenswrapper[5184]: E0312 16:52:56.148074 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:56.648054808 +0000 UTC m=+119.189366147 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.148453 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-mfddg"
Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.148568 5184 patch_prober.go:28] interesting pod/downloads-747b44746d-t6987 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body=
Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.148640 5184 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-747b44746d-t6987" podUID="6f45ff33-e60b-4885-ac63-5ab182bf6320" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused"
Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.159823 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-w2ldh" podStartSLOduration=96.159806472 podStartE2EDuration="1m36.159806472s" podCreationTimestamp="2026-03-12 16:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:56.129988857 +0000 UTC m=+118.671300196" watchObservedRunningTime="2026-03-12 16:52:56.159806472 +0000 UTC m=+118.701117811"
Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.160327 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-69db94689b-l8cgq" podStartSLOduration=96.160316867 podStartE2EDuration="1m36.160316867s" podCreationTimestamp="2026-03-12 16:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:56.158240263 +0000 UTC m=+118.699551602" watchObservedRunningTime="2026-03-12 16:52:56.160316867 +0000 UTC m=+118.701628206"
Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.202133 5184 ???:1] "http: TLS handshake error from 192.168.126.11:53864: no serving certificate available for the kubelet"
Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.205195 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-5dxkx" podStartSLOduration=97.205179359 podStartE2EDuration="1m37.205179359s" podCreationTimestamp="2026-03-12 16:51:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:56.202514576 +0000 UTC m=+118.743825915" watchObservedRunningTime="2026-03-12 16:52:56.205179359 +0000 UTC m=+118.746490698"
Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.207976 5184 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-7pgjs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 12 16:52:56 crc kubenswrapper[5184]: [-]has-synced failed: reason withheld
Mar 12 16:52:56 crc kubenswrapper[5184]: [+]process-running ok
Mar 12 16:52:56 crc kubenswrapper[5184]: healthz check failed
Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.208035 5184 prober.go:120] "Probe failed" probeType="Startup"
pod="openshift-ingress/router-default-68cf44c8b8-7pgjs" podUID="e1483fd4-8f3f-4326-874c-19e9c796d809" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.235247 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-wll7m" podStartSLOduration=96.235230172 podStartE2EDuration="1m36.235230172s" podCreationTimestamp="2026-03-12 16:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:56.23422801 +0000 UTC m=+118.775539349" watchObservedRunningTime="2026-03-12 16:52:56.235230172 +0000 UTC m=+118.776541511" Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.237651 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:56 crc kubenswrapper[5184]: E0312 16:52:56.243113 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:56.743082865 +0000 UTC m=+119.284394214 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.274344 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-67c89758df-ssqr6" Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.319805 5184 ???:1] "http: TLS handshake error from 192.168.126.11:53880: no serving certificate available for the kubelet" Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.339808 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" podStartSLOduration=97.339790685 podStartE2EDuration="1m37.339790685s" podCreationTimestamp="2026-03-12 16:51:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:56.29773474 +0000 UTC m=+118.839046089" watchObservedRunningTime="2026-03-12 16:52:56.339790685 +0000 UTC m=+118.881102024" Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.340663 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-64d44f6ddf-qxthf" podStartSLOduration=97.340657742 podStartE2EDuration="1m37.340657742s" podCreationTimestamp="2026-03-12 16:51:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:56.332692705 +0000 UTC m=+118.874004054" watchObservedRunningTime="2026-03-12 16:52:56.340657742 +0000 UTC m=+118.881969081" Mar 12 
16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.342913 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:52:56 crc kubenswrapper[5184]: E0312 16:52:56.343307 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:56.843291094 +0000 UTC m=+119.384602443 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.347611 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-6bgzr" Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.357978 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29555565-ms5vz" podStartSLOduration=97.357957099 podStartE2EDuration="1m37.357957099s" podCreationTimestamp="2026-03-12 16:51:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:56.354257724 +0000 UTC m=+118.895569073" 
watchObservedRunningTime="2026-03-12 16:52:56.357957099 +0000 UTC m=+118.899268438" Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.405325 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-fm2vq" podStartSLOduration=8.405304227 podStartE2EDuration="8.405304227s" podCreationTimestamp="2026-03-12 16:52:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:56.403706217 +0000 UTC m=+118.945017556" watchObservedRunningTime="2026-03-12 16:52:56.405304227 +0000 UTC m=+118.946615576" Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.444032 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:56 crc kubenswrapper[5184]: E0312 16:52:56.444417 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:56.94440125 +0000 UTC m=+119.485712589 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.453036 5184 ???:1] "http: TLS handshake error from 192.168.126.11:53882: no serving certificate available for the kubelet" Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.540177 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-5vk9f" podStartSLOduration=97.540154521 podStartE2EDuration="1m37.540154521s" podCreationTimestamp="2026-03-12 16:51:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:52:56.510198221 +0000 UTC m=+119.051509560" watchObservedRunningTime="2026-03-12 16:52:56.540154521 +0000 UTC m=+119.081465860" Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.545051 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:52:56 crc kubenswrapper[5184]: E0312 16:52:56.545416 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. 
No retries permitted until 2026-03-12 16:52:57.045400273 +0000 UTC m=+119.586711612 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.589351 5184 ???:1] "http: TLS handshake error from 192.168.126.11:53898: no serving certificate available for the kubelet" Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.646337 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:56 crc kubenswrapper[5184]: E0312 16:52:56.646518 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:57.146488169 +0000 UTC m=+119.687799508 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.646567 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:52:56 crc kubenswrapper[5184]: E0312 16:52:56.646881 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:57.146872051 +0000 UTC m=+119.688183390 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.703143 5184 ???:1] "http: TLS handshake error from 192.168.126.11:53900: no serving certificate available for the kubelet" Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.748027 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:56 crc kubenswrapper[5184]: E0312 16:52:56.748279 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:57.248248545 +0000 UTC m=+119.789559884 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.748437 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:52:56 crc kubenswrapper[5184]: E0312 16:52:56.748879 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:57.248857914 +0000 UTC m=+119.790169303 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.800477 5184 ???:1] "http: TLS handshake error from 192.168.126.11:53906: no serving certificate available for the kubelet" Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.849586 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:56 crc kubenswrapper[5184]: E0312 16:52:56.849920 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:57.349904209 +0000 UTC m=+119.891215548 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.924339 5184 ???:1] "http: TLS handshake error from 192.168.126.11:53918: no serving certificate available for the kubelet" Mar 12 16:52:56 crc kubenswrapper[5184]: I0312 16:52:56.951586 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:52:56 crc kubenswrapper[5184]: E0312 16:52:56.951922 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:57.451910914 +0000 UTC m=+119.993222253 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:57 crc kubenswrapper[5184]: I0312 16:52:57.053018 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:57 crc kubenswrapper[5184]: E0312 16:52:57.053339 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:57.553322739 +0000 UTC m=+120.094634068 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:57 crc kubenswrapper[5184]: I0312 16:52:57.133654 5184 ???:1] "http: TLS handshake error from 192.168.126.11:53930: no serving certificate available for the kubelet" Mar 12 16:52:57 crc kubenswrapper[5184]: I0312 16:52:57.141127 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-69db94689b-l8cgq" event={"ID":"0414ebc4-50d1-4b9e-966a-693c0957a5a5","Type":"ContainerStarted","Data":"bb7d0027f141058832b7461cfb273027a4023f28024e91088b97906d1e91b5c4"} Mar 12 16:52:57 crc kubenswrapper[5184]: I0312 16:52:57.147573 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vxc4c" event={"ID":"024d7c4d-0c1a-45cd-8d3f-fdb57d2af4df","Type":"ContainerStarted","Data":"625f85b3b06f6fa5a6e9ee7a108b72739d95f0c831910a28411c9c930add2abc"} Mar 12 16:52:57 crc kubenswrapper[5184]: I0312 16:52:57.153875 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:52:57 crc kubenswrapper[5184]: I0312 16:52:57.153939 5184 patch_prober.go:28] interesting pod/marketplace-operator-547dbd544d-dpld6 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get 
\"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Mar 12 16:52:57 crc kubenswrapper[5184]: I0312 16:52:57.153982 5184 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-547dbd544d-dpld6" podUID="5ad036a8-381e-4761-a20f-8d8b9a3e9408" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" Mar 12 16:52:57 crc kubenswrapper[5184]: E0312 16:52:57.154103 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:57.654092005 +0000 UTC m=+120.195403344 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:57 crc kubenswrapper[5184]: I0312 16:52:57.158077 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-jskx5" Mar 12 16:52:57 crc kubenswrapper[5184]: I0312 16:52:57.194470 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-vxc4c" podStartSLOduration=98.194452077 podStartE2EDuration="1m38.194452077s" podCreationTimestamp="2026-03-12 16:51:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 
16:52:57.187308495 +0000 UTC m=+119.728619834" watchObservedRunningTime="2026-03-12 16:52:57.194452077 +0000 UTC m=+119.735763406" Mar 12 16:52:57 crc kubenswrapper[5184]: I0312 16:52:57.205733 5184 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-7pgjs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 16:52:57 crc kubenswrapper[5184]: [-]has-synced failed: reason withheld Mar 12 16:52:57 crc kubenswrapper[5184]: [+]process-running ok Mar 12 16:52:57 crc kubenswrapper[5184]: healthz check failed Mar 12 16:52:57 crc kubenswrapper[5184]: I0312 16:52:57.206031 5184 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-7pgjs" podUID="e1483fd4-8f3f-4326-874c-19e9c796d809" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 16:52:57 crc kubenswrapper[5184]: I0312 16:52:57.254846 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:57 crc kubenswrapper[5184]: E0312 16:52:57.256597 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:57.756580555 +0000 UTC m=+120.297891894 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:57 crc kubenswrapper[5184]: I0312 16:52:57.358271 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:52:57 crc kubenswrapper[5184]: E0312 16:52:57.358561 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:57.858549098 +0000 UTC m=+120.399860437 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:57 crc kubenswrapper[5184]: I0312 16:52:57.362671 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-fm2vq"] Mar 12 16:52:57 crc kubenswrapper[5184]: I0312 16:52:57.458754 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:57 crc kubenswrapper[5184]: E0312 16:52:57.458870 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:57.958843009 +0000 UTC m=+120.500154398 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:57 crc kubenswrapper[5184]: I0312 16:52:57.459174 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:52:57 crc kubenswrapper[5184]: E0312 16:52:57.459512 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:57.9595003 +0000 UTC m=+120.500811639 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:57 crc kubenswrapper[5184]: I0312 16:52:57.560636 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 12 16:52:57 crc kubenswrapper[5184]: E0312 16:52:57.560858 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:58.060805143 +0000 UTC m=+120.602116482 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:57 crc kubenswrapper[5184]: I0312 16:52:57.561077 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:52:57 crc kubenswrapper[5184]: E0312 16:52:57.561649 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:58.061633239 +0000 UTC m=+120.602944578 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:57 crc kubenswrapper[5184]: I0312 16:52:57.662267 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 12 16:52:57 crc kubenswrapper[5184]: E0312 16:52:57.662468 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:58.162435705 +0000 UTC m=+120.703747054 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:57 crc kubenswrapper[5184]: I0312 16:52:57.662801 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:52:57 crc kubenswrapper[5184]: E0312 16:52:57.663133 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:58.163120897 +0000 UTC m=+120.704432236 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:57 crc kubenswrapper[5184]: I0312 16:52:57.763668 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 12 16:52:57 crc kubenswrapper[5184]: E0312 16:52:57.763998 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:58.263977016 +0000 UTC m=+120.805288365 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:57 crc kubenswrapper[5184]: I0312 16:52:57.852871 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nk4rr"]
Mar 12 16:52:57 crc kubenswrapper[5184]: I0312 16:52:57.865049 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:52:57 crc kubenswrapper[5184]: E0312 16:52:57.865578 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:58.365556466 +0000 UTC m=+120.906867805 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:57 crc kubenswrapper[5184]: I0312 16:52:57.868363 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nk4rr"]
Mar 12 16:52:57 crc kubenswrapper[5184]: I0312 16:52:57.868526 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nk4rr"
Mar 12 16:52:57 crc kubenswrapper[5184]: I0312 16:52:57.870754 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"community-operators-dockercfg-vrd5f\""
Mar 12 16:52:57 crc kubenswrapper[5184]: I0312 16:52:57.873462 5184 ???:1] "http: TLS handshake error from 192.168.126.11:60622: no serving certificate available for the kubelet"
Mar 12 16:52:57 crc kubenswrapper[5184]: I0312 16:52:57.956789 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/revision-pruner-6-crc"]
Mar 12 16:52:57 crc kubenswrapper[5184]: I0312 16:52:57.966267 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 12 16:52:57 crc kubenswrapper[5184]: I0312 16:52:57.966423 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce605906-7727-4682-83a5-e18f9faeb789-catalog-content\") pod \"community-operators-nk4rr\" (UID: \"ce605906-7727-4682-83a5-e18f9faeb789\") " pod="openshift-marketplace/community-operators-nk4rr"
Mar 12 16:52:57 crc kubenswrapper[5184]: E0312 16:52:57.966475 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:58.466446046 +0000 UTC m=+121.007757385 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:57 crc kubenswrapper[5184]: I0312 16:52:57.966550 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce605906-7727-4682-83a5-e18f9faeb789-utilities\") pod \"community-operators-nk4rr\" (UID: \"ce605906-7727-4682-83a5-e18f9faeb789\") " pod="openshift-marketplace/community-operators-nk4rr"
Mar 12 16:52:57 crc kubenswrapper[5184]: I0312 16:52:57.966663 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzv8k\" (UniqueName: \"kubernetes.io/projected/ce605906-7727-4682-83a5-e18f9faeb789-kube-api-access-gzv8k\") pod \"community-operators-nk4rr\" (UID: \"ce605906-7727-4682-83a5-e18f9faeb789\") " pod="openshift-marketplace/community-operators-nk4rr"
Mar 12 16:52:57 crc kubenswrapper[5184]: I0312 16:52:57.966695 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:52:57 crc kubenswrapper[5184]: E0312 16:52:57.967010 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:58.467003733 +0000 UTC m=+121.008315072 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.067478 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.067764 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gzv8k\" (UniqueName: \"kubernetes.io/projected/ce605906-7727-4682-83a5-e18f9faeb789-kube-api-access-gzv8k\") pod \"community-operators-nk4rr\" (UID: \"ce605906-7727-4682-83a5-e18f9faeb789\") " pod="openshift-marketplace/community-operators-nk4rr"
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.067824 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce605906-7727-4682-83a5-e18f9faeb789-catalog-content\") pod \"community-operators-nk4rr\" (UID: \"ce605906-7727-4682-83a5-e18f9faeb789\") " pod="openshift-marketplace/community-operators-nk4rr"
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.067858 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce605906-7727-4682-83a5-e18f9faeb789-utilities\") pod \"community-operators-nk4rr\" (UID: \"ce605906-7727-4682-83a5-e18f9faeb789\") " pod="openshift-marketplace/community-operators-nk4rr"
Mar 12 16:52:58 crc kubenswrapper[5184]: E0312 16:52:58.067913 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:58.567883983 +0000 UTC m=+121.109195322 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.068263 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce605906-7727-4682-83a5-e18f9faeb789-utilities\") pod \"community-operators-nk4rr\" (UID: \"ce605906-7727-4682-83a5-e18f9faeb789\") " pod="openshift-marketplace/community-operators-nk4rr"
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.068464 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce605906-7727-4682-83a5-e18f9faeb789-catalog-content\") pod \"community-operators-nk4rr\" (UID: \"ce605906-7727-4682-83a5-e18f9faeb789\") " pod="openshift-marketplace/community-operators-nk4rr"
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.105316 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzv8k\" (UniqueName: \"kubernetes.io/projected/ce605906-7727-4682-83a5-e18f9faeb789-kube-api-access-gzv8k\") pod \"community-operators-nk4rr\" (UID: \"ce605906-7727-4682-83a5-e18f9faeb789\") " pod="openshift-marketplace/community-operators-nk4rr"
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.169437 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:52:58 crc kubenswrapper[5184]: E0312 16:52:58.169711 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:58.669699402 +0000 UTC m=+121.211010741 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.180507 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/revision-pruner-6-crc"]
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.180556 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hwnbt"]
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.181155 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/revision-pruner-6-crc"
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.188738 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nk4rr"
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.197614 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-scheduler\"/\"installer-sa-dockercfg-qpkss\""
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.197884 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-scheduler\"/\"kube-root-ca.crt\""
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.200537 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5zvch" event={"ID":"af87b4e5-15c0-48dc-9bc3-df39fcc24a53","Type":"ContainerStarted","Data":"3ffc9a8896e625377c3b5ad2ea90dc3918715242b4cf4b1bc82e80520cd7f688"}
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.200591 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hwnbt"]
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.200759 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hwnbt"
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.202526 5184 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-7pgjs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 12 16:52:58 crc kubenswrapper[5184]: [-]has-synced failed: reason withheld
Mar 12 16:52:58 crc kubenswrapper[5184]: [+]process-running ok
Mar 12 16:52:58 crc kubenswrapper[5184]: healthz check failed
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.202576 5184 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-7pgjs" podUID="e1483fd4-8f3f-4326-874c-19e9c796d809" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.209602 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"certified-operators-dockercfg-7cl8d\""
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.273652 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vqqp2"]
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.276959 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 12 16:52:58 crc kubenswrapper[5184]: E0312 16:52:58.277200 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:58.777163465 +0000 UTC m=+121.318474814 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.277489 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27cbd345-0044-49d8-9192-d193df4c579e-utilities\") pod \"certified-operators-hwnbt\" (UID: \"27cbd345-0044-49d8-9192-d193df4c579e\") " pod="openshift-marketplace/certified-operators-hwnbt"
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.277573 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e66e5048-2e16-4496-a386-f7261e4685ea-kubelet-dir\") pod \"revision-pruner-6-crc\" (UID: \"e66e5048-2e16-4496-a386-f7261e4685ea\") " pod="openshift-kube-scheduler/revision-pruner-6-crc"
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.277650 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58dtg\" (UniqueName: \"kubernetes.io/projected/27cbd345-0044-49d8-9192-d193df4c579e-kube-api-access-58dtg\") pod \"certified-operators-hwnbt\" (UID: \"27cbd345-0044-49d8-9192-d193df4c579e\") " pod="openshift-marketplace/certified-operators-hwnbt"
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.277996 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27cbd345-0044-49d8-9192-d193df4c579e-catalog-content\") pod \"certified-operators-hwnbt\" (UID: \"27cbd345-0044-49d8-9192-d193df4c579e\") " pod="openshift-marketplace/certified-operators-hwnbt"
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.278352 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.278398 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e66e5048-2e16-4496-a386-f7261e4685ea-kube-api-access\") pod \"revision-pruner-6-crc\" (UID: \"e66e5048-2e16-4496-a386-f7261e4685ea\") " pod="openshift-kube-scheduler/revision-pruner-6-crc"
Mar 12 16:52:58 crc kubenswrapper[5184]: E0312 16:52:58.278707 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:58.778695163 +0000 UTC m=+121.320006582 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.285560 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vqqp2"
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.296040 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vqqp2"]
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.406543 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 12 16:52:58 crc kubenswrapper[5184]: E0312 16:52:58.415559 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:58.915537748 +0000 UTC m=+121.456849087 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.430597 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc5j6\" (UniqueName: \"kubernetes.io/projected/d91a31bb-cc85-4866-bb81-a3e9b0cc9362-kube-api-access-rc5j6\") pod \"community-operators-vqqp2\" (UID: \"d91a31bb-cc85-4866-bb81-a3e9b0cc9362\") " pod="openshift-marketplace/community-operators-vqqp2"
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.430687 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27cbd345-0044-49d8-9192-d193df4c579e-utilities\") pod \"certified-operators-hwnbt\" (UID: \"27cbd345-0044-49d8-9192-d193df4c579e\") " pod="openshift-marketplace/certified-operators-hwnbt"
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.430724 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e66e5048-2e16-4496-a386-f7261e4685ea-kubelet-dir\") pod \"revision-pruner-6-crc\" (UID: \"e66e5048-2e16-4496-a386-f7261e4685ea\") " pod="openshift-kube-scheduler/revision-pruner-6-crc"
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.430766 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58dtg\" (UniqueName: \"kubernetes.io/projected/27cbd345-0044-49d8-9192-d193df4c579e-kube-api-access-58dtg\") pod \"certified-operators-hwnbt\" (UID: \"27cbd345-0044-49d8-9192-d193df4c579e\") " pod="openshift-marketplace/certified-operators-hwnbt"
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.430875 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27cbd345-0044-49d8-9192-d193df4c579e-catalog-content\") pod \"certified-operators-hwnbt\" (UID: \"27cbd345-0044-49d8-9192-d193df4c579e\") " pod="openshift-marketplace/certified-operators-hwnbt"
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.430906 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d91a31bb-cc85-4866-bb81-a3e9b0cc9362-catalog-content\") pod \"community-operators-vqqp2\" (UID: \"d91a31bb-cc85-4866-bb81-a3e9b0cc9362\") " pod="openshift-marketplace/community-operators-vqqp2"
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.430996 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.431019 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e66e5048-2e16-4496-a386-f7261e4685ea-kube-api-access\") pod \"revision-pruner-6-crc\" (UID: \"e66e5048-2e16-4496-a386-f7261e4685ea\") " pod="openshift-kube-scheduler/revision-pruner-6-crc"
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.431041 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d91a31bb-cc85-4866-bb81-a3e9b0cc9362-utilities\") pod \"community-operators-vqqp2\" (UID: \"d91a31bb-cc85-4866-bb81-a3e9b0cc9362\") " pod="openshift-marketplace/community-operators-vqqp2"
Mar 12 16:52:58 crc kubenswrapper[5184]: E0312 16:52:58.432331 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:58.932311848 +0000 UTC m=+121.473623187 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.432441 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e66e5048-2e16-4496-a386-f7261e4685ea-kubelet-dir\") pod \"revision-pruner-6-crc\" (UID: \"e66e5048-2e16-4496-a386-f7261e4685ea\") " pod="openshift-kube-scheduler/revision-pruner-6-crc"
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.440758 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27cbd345-0044-49d8-9192-d193df4c579e-utilities\") pod \"certified-operators-hwnbt\" (UID: \"27cbd345-0044-49d8-9192-d193df4c579e\") " pod="openshift-marketplace/certified-operators-hwnbt"
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.450353 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27cbd345-0044-49d8-9192-d193df4c579e-catalog-content\") pod \"certified-operators-hwnbt\" (UID: \"27cbd345-0044-49d8-9192-d193df4c579e\") " pod="openshift-marketplace/certified-operators-hwnbt"
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.454682 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-scheduler\"/\"kube-root-ca.crt\""
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.492500 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-phh8l"]
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.493010 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e66e5048-2e16-4496-a386-f7261e4685ea-kube-api-access\") pod \"revision-pruner-6-crc\" (UID: \"e66e5048-2e16-4496-a386-f7261e4685ea\") " pod="openshift-kube-scheduler/revision-pruner-6-crc"
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.494975 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-58dtg\" (UniqueName: \"kubernetes.io/projected/27cbd345-0044-49d8-9192-d193df4c579e-kube-api-access-58dtg\") pod \"certified-operators-hwnbt\" (UID: \"27cbd345-0044-49d8-9192-d193df4c579e\") " pod="openshift-marketplace/certified-operators-hwnbt"
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.499801 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-scheduler\"/\"installer-sa-dockercfg-qpkss\""
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.505469 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/revision-pruner-6-crc"
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.510886 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-phh8l"]
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.511048 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-phh8l"
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.531655 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.531806 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d91a31bb-cc85-4866-bb81-a3e9b0cc9362-catalog-content\") pod \"community-operators-vqqp2\" (UID: \"d91a31bb-cc85-4866-bb81-a3e9b0cc9362\") " pod="openshift-marketplace/community-operators-vqqp2"
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.532112 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d91a31bb-cc85-4866-bb81-a3e9b0cc9362-utilities\") pod \"community-operators-vqqp2\" (UID: \"d91a31bb-cc85-4866-bb81-a3e9b0cc9362\") " pod="openshift-marketplace/community-operators-vqqp2"
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.532154 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rc5j6\" (UniqueName: \"kubernetes.io/projected/d91a31bb-cc85-4866-bb81-a3e9b0cc9362-kube-api-access-rc5j6\") pod \"community-operators-vqqp2\" (UID: \"d91a31bb-cc85-4866-bb81-a3e9b0cc9362\") " pod="openshift-marketplace/community-operators-vqqp2"
Mar 12 16:52:58 crc kubenswrapper[5184]: E0312 16:52:58.532515 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:59.032497376 +0000 UTC m=+121.573808715 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.532841 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d91a31bb-cc85-4866-bb81-a3e9b0cc9362-catalog-content\") pod \"community-operators-vqqp2\" (UID: \"d91a31bb-cc85-4866-bb81-a3e9b0cc9362\") " pod="openshift-marketplace/community-operators-vqqp2"
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.533074 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d91a31bb-cc85-4866-bb81-a3e9b0cc9362-utilities\") pod \"community-operators-vqqp2\" (UID: \"d91a31bb-cc85-4866-bb81-a3e9b0cc9362\") " pod="openshift-marketplace/community-operators-vqqp2"
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.541865 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"certified-operators-dockercfg-7cl8d\""
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.550051 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hwnbt"
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.592147 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc5j6\" (UniqueName: \"kubernetes.io/projected/d91a31bb-cc85-4866-bb81-a3e9b0cc9362-kube-api-access-rc5j6\") pod \"community-operators-vqqp2\" (UID: \"d91a31bb-cc85-4866-bb81-a3e9b0cc9362\") " pod="openshift-marketplace/community-operators-vqqp2"
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.600706 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vqqp2"
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.636527 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57daa144-a296-461d-8a95-4a0266b3a6b8-utilities\") pod \"certified-operators-phh8l\" (UID: \"57daa144-a296-461d-8a95-4a0266b3a6b8\") " pod="openshift-marketplace/certified-operators-phh8l"
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.636603 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57daa144-a296-461d-8a95-4a0266b3a6b8-catalog-content\") pod \"certified-operators-phh8l\" (UID: \"57daa144-a296-461d-8a95-4a0266b3a6b8\") " pod="openshift-marketplace/certified-operators-phh8l"
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.636645 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:52:58 crc kubenswrapper[5184]: I0312
16:52:58.636692 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtz7r\" (UniqueName: \"kubernetes.io/projected/57daa144-a296-461d-8a95-4a0266b3a6b8-kube-api-access-mtz7r\") pod \"certified-operators-phh8l\" (UID: \"57daa144-a296-461d-8a95-4a0266b3a6b8\") " pod="openshift-marketplace/certified-operators-phh8l" Mar 12 16:52:58 crc kubenswrapper[5184]: E0312 16:52:58.637067 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:59.137047409 +0000 UTC m=+121.678358748 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.737322 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:58 crc kubenswrapper[5184]: E0312 16:52:58.737794 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:59.237772114 +0000 UTC m=+121.779083463 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.737852 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57daa144-a296-461d-8a95-4a0266b3a6b8-utilities\") pod \"certified-operators-phh8l\" (UID: \"57daa144-a296-461d-8a95-4a0266b3a6b8\") " pod="openshift-marketplace/certified-operators-phh8l" Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.737908 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57daa144-a296-461d-8a95-4a0266b3a6b8-catalog-content\") pod \"certified-operators-phh8l\" (UID: \"57daa144-a296-461d-8a95-4a0266b3a6b8\") " pod="openshift-marketplace/certified-operators-phh8l" Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.737943 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.737983 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mtz7r\" (UniqueName: \"kubernetes.io/projected/57daa144-a296-461d-8a95-4a0266b3a6b8-kube-api-access-mtz7r\") pod \"certified-operators-phh8l\" (UID: 
\"57daa144-a296-461d-8a95-4a0266b3a6b8\") " pod="openshift-marketplace/certified-operators-phh8l" Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.738557 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57daa144-a296-461d-8a95-4a0266b3a6b8-utilities\") pod \"certified-operators-phh8l\" (UID: \"57daa144-a296-461d-8a95-4a0266b3a6b8\") " pod="openshift-marketplace/certified-operators-phh8l" Mar 12 16:52:58 crc kubenswrapper[5184]: E0312 16:52:58.738583 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:59.238564488 +0000 UTC m=+121.779875817 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.738954 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57daa144-a296-461d-8a95-4a0266b3a6b8-catalog-content\") pod \"certified-operators-phh8l\" (UID: \"57daa144-a296-461d-8a95-4a0266b3a6b8\") " pod="openshift-marketplace/certified-operators-phh8l" Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.763206 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtz7r\" (UniqueName: \"kubernetes.io/projected/57daa144-a296-461d-8a95-4a0266b3a6b8-kube-api-access-mtz7r\") pod \"certified-operators-phh8l\" (UID: 
\"57daa144-a296-461d-8a95-4a0266b3a6b8\") " pod="openshift-marketplace/certified-operators-phh8l" Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.769181 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nk4rr"] Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.819912 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/revision-pruner-6-crc"] Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.842320 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:58 crc kubenswrapper[5184]: E0312 16:52:58.842485 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:59.34245585 +0000 UTC m=+121.883767189 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.842961 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:52:58 crc kubenswrapper[5184]: E0312 16:52:58.843310 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:59.343292237 +0000 UTC m=+121.884603576 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.890816 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-phh8l" Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.894188 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hwnbt"] Mar 12 16:52:58 crc kubenswrapper[5184]: I0312 16:52:58.944493 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:58 crc kubenswrapper[5184]: E0312 16:52:58.945042 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:59.445026643 +0000 UTC m=+121.986337982 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:59 crc kubenswrapper[5184]: I0312 16:52:59.046031 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:52:59 crc kubenswrapper[5184]: E0312 16:52:59.046417 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:59.546397917 +0000 UTC m=+122.087709286 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:59 crc kubenswrapper[5184]: I0312 16:52:59.144983 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-phh8l"] Mar 12 16:52:59 crc kubenswrapper[5184]: I0312 16:52:59.147656 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:59 crc kubenswrapper[5184]: E0312 16:52:59.148093 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:59.648072941 +0000 UTC m=+122.189384280 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:59 crc kubenswrapper[5184]: I0312 16:52:59.151600 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vqqp2"] Mar 12 16:52:59 crc kubenswrapper[5184]: I0312 16:52:59.176351 5184 ???:1] "http: TLS handshake error from 192.168.126.11:60628: no serving certificate available for the kubelet" Mar 12 16:52:59 crc kubenswrapper[5184]: I0312 16:52:59.179624 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/revision-pruner-6-crc" event={"ID":"e66e5048-2e16-4496-a386-f7261e4685ea","Type":"ContainerStarted","Data":"f0f5a2111e75e5b5c277d0f65faaaf9ad4a46468b21f0caef1ac7c723a2259b4"} Mar 12 16:52:59 crc kubenswrapper[5184]: I0312 16:52:59.181491 5184 generic.go:358] "Generic (PLEG): container finished" podID="8ea329fb-a095-4645-b64f-a5769efa6364" containerID="95aff8907da74210f0f5111c0984339b383948ad0229faf5ce3eabf1cafca580" exitCode=0 Mar 12 16:52:59 crc kubenswrapper[5184]: I0312 16:52:59.181574 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555565-ms5vz" event={"ID":"8ea329fb-a095-4645-b64f-a5769efa6364","Type":"ContainerDied","Data":"95aff8907da74210f0f5111c0984339b383948ad0229faf5ce3eabf1cafca580"} Mar 12 16:52:59 crc kubenswrapper[5184]: I0312 16:52:59.183876 5184 generic.go:358] "Generic (PLEG): container finished" podID="ce605906-7727-4682-83a5-e18f9faeb789" containerID="9aaee5914a38ad6225853fbcb0312ab339cd1fa35396e860bd749582ef4494cf" exitCode=0 Mar 12 16:52:59 crc 
kubenswrapper[5184]: I0312 16:52:59.184907 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nk4rr" event={"ID":"ce605906-7727-4682-83a5-e18f9faeb789","Type":"ContainerDied","Data":"9aaee5914a38ad6225853fbcb0312ab339cd1fa35396e860bd749582ef4494cf"} Mar 12 16:52:59 crc kubenswrapper[5184]: I0312 16:52:59.184934 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nk4rr" event={"ID":"ce605906-7727-4682-83a5-e18f9faeb789","Type":"ContainerStarted","Data":"e1058d2442fe7b9790b303f5f5adcf5cbc9e0b1f1c0606ef603ca1c53eb51671"} Mar 12 16:52:59 crc kubenswrapper[5184]: I0312 16:52:59.189643 5184 generic.go:358] "Generic (PLEG): container finished" podID="27cbd345-0044-49d8-9192-d193df4c579e" containerID="168307e5a7e928c9b05485bc9357e2cce758ea5e6db1d08377851c7d3b6fafd5" exitCode=0 Mar 12 16:52:59 crc kubenswrapper[5184]: I0312 16:52:59.189765 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hwnbt" event={"ID":"27cbd345-0044-49d8-9192-d193df4c579e","Type":"ContainerDied","Data":"168307e5a7e928c9b05485bc9357e2cce758ea5e6db1d08377851c7d3b6fafd5"} Mar 12 16:52:59 crc kubenswrapper[5184]: I0312 16:52:59.189788 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hwnbt" event={"ID":"27cbd345-0044-49d8-9192-d193df4c579e","Type":"ContainerStarted","Data":"c372d508f0c1a43116cb86b006c3f96f27219d02cc05f5feca0d4f0deb7326fe"} Mar 12 16:52:59 crc kubenswrapper[5184]: I0312 16:52:59.190096 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-fm2vq" podUID="4840f833-3dce-444b-8cad-3a7374af30e7" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://33f536e52d195ff2fcade83b7dd83eab23beb4f297d841b0ca600526a0ae7614" gracePeriod=30 Mar 12 16:52:59 crc kubenswrapper[5184]: I0312 16:52:59.206123 5184 
patch_prober.go:28] interesting pod/router-default-68cf44c8b8-7pgjs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 16:52:59 crc kubenswrapper[5184]: [-]has-synced failed: reason withheld Mar 12 16:52:59 crc kubenswrapper[5184]: [+]process-running ok Mar 12 16:52:59 crc kubenswrapper[5184]: healthz check failed Mar 12 16:52:59 crc kubenswrapper[5184]: I0312 16:52:59.206187 5184 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-7pgjs" podUID="e1483fd4-8f3f-4326-874c-19e9c796d809" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 16:52:59 crc kubenswrapper[5184]: I0312 16:52:59.249446 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:52:59 crc kubenswrapper[5184]: E0312 16:52:59.249814 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:59.749796547 +0000 UTC m=+122.291107886 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:59 crc kubenswrapper[5184]: I0312 16:52:59.351071 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:59 crc kubenswrapper[5184]: E0312 16:52:59.351243 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:59.851221163 +0000 UTC m=+122.392532502 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:59 crc kubenswrapper[5184]: I0312 16:52:59.351562 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:52:59 crc kubenswrapper[5184]: E0312 16:52:59.351992 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:52:59.851979886 +0000 UTC m=+122.393291225 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:59 crc kubenswrapper[5184]: I0312 16:52:59.382402 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:52:59 crc kubenswrapper[5184]: I0312 16:52:59.452496 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:59 crc kubenswrapper[5184]: E0312 16:52:59.452886 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:52:59.952869456 +0000 UTC m=+122.494180795 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:59 crc kubenswrapper[5184]: I0312 16:52:59.531938 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-8596bd845d-ndv6q" Mar 12 16:52:59 crc kubenswrapper[5184]: I0312 16:52:59.532964 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-oauth-apiserver/apiserver-8596bd845d-ndv6q" Mar 12 16:52:59 crc kubenswrapper[5184]: I0312 16:52:59.535645 5184 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 12 16:52:59 crc kubenswrapper[5184]: I0312 16:52:59.539425 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-8596bd845d-ndv6q" Mar 12 16:52:59 crc kubenswrapper[5184]: I0312 16:52:59.554259 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:52:59 crc kubenswrapper[5184]: E0312 16:52:59.554623 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. 
No retries permitted until 2026-03-12 16:53:00.054606052 +0000 UTC m=+122.595917401 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:59 crc kubenswrapper[5184]: I0312 16:52:59.655846 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:59 crc kubenswrapper[5184]: E0312 16:52:59.656923 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:53:00.156900445 +0000 UTC m=+122.698211814 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:59 crc kubenswrapper[5184]: I0312 16:52:59.728898 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" Mar 12 16:52:59 crc kubenswrapper[5184]: I0312 16:52:59.729009 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" Mar 12 16:52:59 crc kubenswrapper[5184]: I0312 16:52:59.740470 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" Mar 12 16:52:59 crc kubenswrapper[5184]: I0312 16:52:59.757848 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:52:59 crc kubenswrapper[5184]: E0312 16:52:59.760862 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:53:00.260839999 +0000 UTC m=+122.802151348 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:59 crc kubenswrapper[5184]: I0312 16:52:59.856203 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h6mx8"] Mar 12 16:52:59 crc kubenswrapper[5184]: I0312 16:52:59.859250 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:59 crc kubenswrapper[5184]: E0312 16:52:59.859419 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:53:00.359364365 +0000 UTC m=+122.900675704 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:59 crc kubenswrapper[5184]: I0312 16:52:59.859650 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:52:59 crc kubenswrapper[5184]: E0312 16:52:59.860272 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:53:00.360259954 +0000 UTC m=+122.901571293 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:59 crc kubenswrapper[5184]: I0312 16:52:59.892578 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6mx8"] Mar 12 16:52:59 crc kubenswrapper[5184]: I0312 16:52:59.892752 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h6mx8" Mar 12 16:52:59 crc kubenswrapper[5184]: I0312 16:52:59.901837 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"redhat-marketplace-dockercfg-gg4w7\"" Mar 12 16:52:59 crc kubenswrapper[5184]: I0312 16:52:59.961903 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:52:59 crc kubenswrapper[5184]: E0312 16:52:59.962096 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:53:00.462072521 +0000 UTC m=+123.003383860 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:52:59 crc kubenswrapper[5184]: I0312 16:52:59.962261 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:52:59 crc kubenswrapper[5184]: E0312 16:52:59.962696 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:53:00.46268918 +0000 UTC m=+123.004000519 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.063122 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.063252 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sslkr\" (UniqueName: \"kubernetes.io/projected/8df70bc7-e513-41bf-94d8-5f79a9d10b64-kube-api-access-sslkr\") pod \"redhat-marketplace-h6mx8\" (UID: \"8df70bc7-e513-41bf-94d8-5f79a9d10b64\") " pod="openshift-marketplace/redhat-marketplace-h6mx8" Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.063288 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8df70bc7-e513-41bf-94d8-5f79a9d10b64-utilities\") pod \"redhat-marketplace-h6mx8\" (UID: \"8df70bc7-e513-41bf-94d8-5f79a9d10b64\") " pod="openshift-marketplace/redhat-marketplace-h6mx8" Mar 12 16:53:00 crc kubenswrapper[5184]: E0312 16:53:00.063341 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. 
No retries permitted until 2026-03-12 16:53:00.563318792 +0000 UTC m=+123.104630131 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.063604 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.063694 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8df70bc7-e513-41bf-94d8-5f79a9d10b64-catalog-content\") pod \"redhat-marketplace-h6mx8\" (UID: \"8df70bc7-e513-41bf-94d8-5f79a9d10b64\") " pod="openshift-marketplace/redhat-marketplace-h6mx8" Mar 12 16:53:00 crc kubenswrapper[5184]: E0312 16:53:00.063952 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:53:00.563945372 +0000 UTC m=+123.105256711 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.155280 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-5777786469-n9g8v" Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.165198 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.165466 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8df70bc7-e513-41bf-94d8-5f79a9d10b64-catalog-content\") pod \"redhat-marketplace-h6mx8\" (UID: \"8df70bc7-e513-41bf-94d8-5f79a9d10b64\") " pod="openshift-marketplace/redhat-marketplace-h6mx8" Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.165493 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sslkr\" (UniqueName: \"kubernetes.io/projected/8df70bc7-e513-41bf-94d8-5f79a9d10b64-kube-api-access-sslkr\") pod \"redhat-marketplace-h6mx8\" (UID: \"8df70bc7-e513-41bf-94d8-5f79a9d10b64\") " pod="openshift-marketplace/redhat-marketplace-h6mx8" Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.165525 5184 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8df70bc7-e513-41bf-94d8-5f79a9d10b64-utilities\") pod \"redhat-marketplace-h6mx8\" (UID: \"8df70bc7-e513-41bf-94d8-5f79a9d10b64\") " pod="openshift-marketplace/redhat-marketplace-h6mx8" Mar 12 16:53:00 crc kubenswrapper[5184]: E0312 16:53:00.166062 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:53:00.666033608 +0000 UTC m=+123.207344937 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.166199 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8df70bc7-e513-41bf-94d8-5f79a9d10b64-utilities\") pod \"redhat-marketplace-h6mx8\" (UID: \"8df70bc7-e513-41bf-94d8-5f79a9d10b64\") " pod="openshift-marketplace/redhat-marketplace-h6mx8" Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.167126 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8df70bc7-e513-41bf-94d8-5f79a9d10b64-catalog-content\") pod \"redhat-marketplace-h6mx8\" (UID: \"8df70bc7-e513-41bf-94d8-5f79a9d10b64\") " pod="openshift-marketplace/redhat-marketplace-h6mx8" Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.222508 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-sslkr\" (UniqueName: \"kubernetes.io/projected/8df70bc7-e513-41bf-94d8-5f79a9d10b64-kube-api-access-sslkr\") pod \"redhat-marketplace-h6mx8\" (UID: \"8df70bc7-e513-41bf-94d8-5f79a9d10b64\") " pod="openshift-marketplace/redhat-marketplace-h6mx8" Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.223225 5184 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-7pgjs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 16:53:00 crc kubenswrapper[5184]: [-]has-synced failed: reason withheld Mar 12 16:53:00 crc kubenswrapper[5184]: [+]process-running ok Mar 12 16:53:00 crc kubenswrapper[5184]: healthz check failed Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.223277 5184 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-7pgjs" podUID="e1483fd4-8f3f-4326-874c-19e9c796d809" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.234984 5184 generic.go:358] "Generic (PLEG): container finished" podID="e66e5048-2e16-4496-a386-f7261e4685ea" containerID="158191ebe0ae6ec6eb42d348ffc6c90738b09007217267cccfe792a3038c5f32" exitCode=0 Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.235091 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/revision-pruner-6-crc" event={"ID":"e66e5048-2e16-4496-a386-f7261e4685ea","Type":"ContainerDied","Data":"158191ebe0ae6ec6eb42d348ffc6c90738b09007217267cccfe792a3038c5f32"} Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.245406 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5zvch" event={"ID":"af87b4e5-15c0-48dc-9bc3-df39fcc24a53","Type":"ContainerStarted","Data":"64c296c242ec7ed901a81bba4ba0fe49afb7d59af1c159cea697cb60f943942b"} Mar 12 
16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.245459 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5zvch" event={"ID":"af87b4e5-15c0-48dc-9bc3-df39fcc24a53","Type":"ContainerStarted","Data":"93db0539964e9dac32daf0421aad8a7a2ed5b9e0328992cb7edb7da1bbe49c5d"} Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.245469 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5zvch" event={"ID":"af87b4e5-15c0-48dc-9bc3-df39fcc24a53","Type":"ContainerStarted","Data":"1033b0f9ae948b58d2ba68d59b04bc49d54086a7c8b43cb2af6200c318152da8"} Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.247353 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zt24v"] Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.249037 5184 generic.go:358] "Generic (PLEG): container finished" podID="d91a31bb-cc85-4866-bb81-a3e9b0cc9362" containerID="b6dd89060db2c945f1de1a3aa3a675a7ece2cc13ab6bc77479b805d6987c5048" exitCode=0 Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.253041 5184 generic.go:358] "Generic (PLEG): container finished" podID="57daa144-a296-461d-8a95-4a0266b3a6b8" containerID="8071491e4eee92723c79f3374fd36c2457c9a8699e55e82941211c6862d697d4" exitCode=0 Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.260151 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqqp2" event={"ID":"d91a31bb-cc85-4866-bb81-a3e9b0cc9362","Type":"ContainerDied","Data":"b6dd89060db2c945f1de1a3aa3a675a7ece2cc13ab6bc77479b805d6987c5048"} Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.260283 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqqp2" event={"ID":"d91a31bb-cc85-4866-bb81-a3e9b0cc9362","Type":"ContainerStarted","Data":"3027bc4bf9533295fde235c2075bfb962e84b9d71304eb80da844b84c486fab2"} Mar 12 16:53:00 crc 
kubenswrapper[5184]: I0312 16:53:00.260296 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-phh8l" event={"ID":"57daa144-a296-461d-8a95-4a0266b3a6b8","Type":"ContainerDied","Data":"8071491e4eee92723c79f3374fd36c2457c9a8699e55e82941211c6862d697d4"} Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.260307 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-phh8l" event={"ID":"57daa144-a296-461d-8a95-4a0266b3a6b8","Type":"ContainerStarted","Data":"a6c957889346f91943a60fd0888203d11ea4dcf7ca80b7eb4f7d22f3d9881ceb"} Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.260758 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zt24v"] Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.261738 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zt24v" Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.267654 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:53:00 crc kubenswrapper[5184]: E0312 16:53:00.268031 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:53:00.768014742 +0000 UTC m=+123.309326081 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.270612 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-9ddfb9f55-j4gpx" Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.272468 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-8596bd845d-ndv6q" Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.272856 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-5zvch" podStartSLOduration=13.272845121 podStartE2EDuration="13.272845121s" podCreationTimestamp="2026-03-12 16:52:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:53:00.270502759 +0000 UTC m=+122.811814098" watchObservedRunningTime="2026-03-12 16:53:00.272845121 +0000 UTC m=+122.814156460" Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.368729 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.368959 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/4ed8bc0c-5406-4893-9949-1342c8eb210c-utilities\") pod \"redhat-marketplace-zt24v\" (UID: \"4ed8bc0c-5406-4893-9949-1342c8eb210c\") " pod="openshift-marketplace/redhat-marketplace-zt24v" Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.369071 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7dpt\" (UniqueName: \"kubernetes.io/projected/4ed8bc0c-5406-4893-9949-1342c8eb210c-kube-api-access-n7dpt\") pod \"redhat-marketplace-zt24v\" (UID: \"4ed8bc0c-5406-4893-9949-1342c8eb210c\") " pod="openshift-marketplace/redhat-marketplace-zt24v" Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.369189 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ed8bc0c-5406-4893-9949-1342c8eb210c-catalog-content\") pod \"redhat-marketplace-zt24v\" (UID: \"4ed8bc0c-5406-4893-9949-1342c8eb210c\") " pod="openshift-marketplace/redhat-marketplace-zt24v" Mar 12 16:53:00 crc kubenswrapper[5184]: E0312 16:53:00.372367 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-12 16:53:00.872344028 +0000 UTC m=+123.413655367 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.473108 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ed8bc0c-5406-4893-9949-1342c8eb210c-catalog-content\") pod \"redhat-marketplace-zt24v\" (UID: \"4ed8bc0c-5406-4893-9949-1342c8eb210c\") " pod="openshift-marketplace/redhat-marketplace-zt24v" Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.473182 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ed8bc0c-5406-4893-9949-1342c8eb210c-utilities\") pod \"redhat-marketplace-zt24v\" (UID: \"4ed8bc0c-5406-4893-9949-1342c8eb210c\") " pod="openshift-marketplace/redhat-marketplace-zt24v" Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.473225 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.473245 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7dpt\" (UniqueName: \"kubernetes.io/projected/4ed8bc0c-5406-4893-9949-1342c8eb210c-kube-api-access-n7dpt\") pod \"redhat-marketplace-zt24v\" (UID: 
\"4ed8bc0c-5406-4893-9949-1342c8eb210c\") " pod="openshift-marketplace/redhat-marketplace-zt24v" Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.473733 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ed8bc0c-5406-4893-9949-1342c8eb210c-utilities\") pod \"redhat-marketplace-zt24v\" (UID: \"4ed8bc0c-5406-4893-9949-1342c8eb210c\") " pod="openshift-marketplace/redhat-marketplace-zt24v" Mar 12 16:53:00 crc kubenswrapper[5184]: E0312 16:53:00.474274 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-12 16:53:00.974259089 +0000 UTC m=+123.515570428 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-dlsx9" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.474624 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ed8bc0c-5406-4893-9949-1342c8eb210c-catalog-content\") pod \"redhat-marketplace-zt24v\" (UID: \"4ed8bc0c-5406-4893-9949-1342c8eb210c\") " pod="openshift-marketplace/redhat-marketplace-zt24v" Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.505483 5184 reconciler.go:161] "OperationExecutor.RegisterPlugin started" 
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-12T16:52:59.535673565Z","UUID":"46e09bb0-f3f0-4f9f-871d-55cafdc13fb5","Handler":null,"Name":"","Endpoint":""} Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.514236 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7dpt\" (UniqueName: \"kubernetes.io/projected/4ed8bc0c-5406-4893-9949-1342c8eb210c-kube-api-access-n7dpt\") pod \"redhat-marketplace-zt24v\" (UID: \"4ed8bc0c-5406-4893-9949-1342c8eb210c\") " pod="openshift-marketplace/redhat-marketplace-zt24v" Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.518645 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h6mx8" Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.529601 5184 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.529636 5184 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.577077 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.593929 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zt24v" Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.611563 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (OuterVolumeSpecName: "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af"). InnerVolumeSpecName "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2". PluginName "kubernetes.io/csi", VolumeGIDValue "" Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.644495 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555565-ms5vz" Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.678334 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.697570 5184 csi_attacher.go:373] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.697609 5184 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b1264ac67579ad07e7e9003054d44fe40dd55285a4b2f7dc74e48be1aee0868a/globalmount\"" pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.780429 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zglg7\" (UniqueName: \"kubernetes.io/projected/8ea329fb-a095-4645-b64f-a5769efa6364-kube-api-access-zglg7\") pod \"8ea329fb-a095-4645-b64f-a5769efa6364\" (UID: \"8ea329fb-a095-4645-b64f-a5769efa6364\") "
Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.780532 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ea329fb-a095-4645-b64f-a5769efa6364-config-volume\") pod \"8ea329fb-a095-4645-b64f-a5769efa6364\" (UID: \"8ea329fb-a095-4645-b64f-a5769efa6364\") "
Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.780603 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8ea329fb-a095-4645-b64f-a5769efa6364-secret-volume\") pod \"8ea329fb-a095-4645-b64f-a5769efa6364\" (UID: \"8ea329fb-a095-4645-b64f-a5769efa6364\") "
Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.784583 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ea329fb-a095-4645-b64f-a5769efa6364-config-volume" (OuterVolumeSpecName: "config-volume") pod "8ea329fb-a095-4645-b64f-a5769efa6364" (UID: "8ea329fb-a095-4645-b64f-a5769efa6364"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.788500 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ea329fb-a095-4645-b64f-a5769efa6364-kube-api-access-zglg7" (OuterVolumeSpecName: "kube-api-access-zglg7") pod "8ea329fb-a095-4645-b64f-a5769efa6364" (UID: "8ea329fb-a095-4645-b64f-a5769efa6364"). InnerVolumeSpecName "kube-api-access-zglg7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.790897 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ea329fb-a095-4645-b64f-a5769efa6364-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8ea329fb-a095-4645-b64f-a5769efa6364" (UID: "8ea329fb-a095-4645-b64f-a5769efa6364"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.795609 5184 patch_prober.go:28] interesting pod/downloads-747b44746d-t6987 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body=
Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.795665 5184 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-747b44746d-t6987" podUID="6f45ff33-e60b-4885-ac63-5ab182bf6320" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused"
Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.846128 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-dlsx9\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.882536 5184 reconciler_common.go:299] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8ea329fb-a095-4645-b64f-a5769efa6364-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.882576 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zglg7\" (UniqueName: \"kubernetes.io/projected/8ea329fb-a095-4645-b64f-a5769efa6364-kube-api-access-zglg7\") on node \"crc\" DevicePath \"\""
Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.882587 5184 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ea329fb-a095-4645-b64f-a5769efa6364-config-volume\") on node \"crc\" DevicePath \"\""
Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.919534 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-11-crc"]
Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.920294 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8ea329fb-a095-4645-b64f-a5769efa6364" containerName="collect-profiles"
Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.920316 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ea329fb-a095-4645-b64f-a5769efa6364" containerName="collect-profiles"
Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.920446 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="8ea329fb-a095-4645-b64f-a5769efa6364" containerName="collect-profiles"
Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.931738 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6mx8"]
Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.931901 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-11-crc"
Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.937058 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-11-crc"]
Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.937601 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-apiserver\"/\"installer-sa-dockercfg-bqqnb\""
Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.937881 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-apiserver\"/\"kube-root-ca.crt\""
Mar 12 16:53:00 crc kubenswrapper[5184]: I0312 16:53:00.988066 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zt24v"]
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.003305 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-6w67b\""
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.009444 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.056389 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cx6mk"]
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.071740 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cx6mk"]
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.071891 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cx6mk"
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.073987 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"redhat-operators-dockercfg-9gxlh\""
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.087976 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1828fda1-8f85-4c8f-af39-910678dfe27f-kubelet-dir\") pod \"revision-pruner-11-crc\" (UID: \"1828fda1-8f85-4c8f-af39-910678dfe27f\") " pod="openshift-kube-apiserver/revision-pruner-11-crc"
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.088036 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1828fda1-8f85-4c8f-af39-910678dfe27f-kube-api-access\") pod \"revision-pruner-11-crc\" (UID: \"1828fda1-8f85-4c8f-af39-910678dfe27f\") " pod="openshift-kube-apiserver/revision-pruner-11-crc"
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.189062 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1828fda1-8f85-4c8f-af39-910678dfe27f-kube-api-access\") pod \"revision-pruner-11-crc\" (UID: \"1828fda1-8f85-4c8f-af39-910678dfe27f\") " pod="openshift-kube-apiserver/revision-pruner-11-crc"
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.189123 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv8bf\" (UniqueName: \"kubernetes.io/projected/45fdd519-1630-4afa-9780-7325691d8206-kube-api-access-qv8bf\") pod \"redhat-operators-cx6mk\" (UID: \"45fdd519-1630-4afa-9780-7325691d8206\") " pod="openshift-marketplace/redhat-operators-cx6mk"
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.189156 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45fdd519-1630-4afa-9780-7325691d8206-catalog-content\") pod \"redhat-operators-cx6mk\" (UID: \"45fdd519-1630-4afa-9780-7325691d8206\") " pod="openshift-marketplace/redhat-operators-cx6mk"
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.189188 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45fdd519-1630-4afa-9780-7325691d8206-utilities\") pod \"redhat-operators-cx6mk\" (UID: \"45fdd519-1630-4afa-9780-7325691d8206\") " pod="openshift-marketplace/redhat-operators-cx6mk"
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.189206 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1828fda1-8f85-4c8f-af39-910678dfe27f-kubelet-dir\") pod \"revision-pruner-11-crc\" (UID: \"1828fda1-8f85-4c8f-af39-910678dfe27f\") " pod="openshift-kube-apiserver/revision-pruner-11-crc"
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.189301 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1828fda1-8f85-4c8f-af39-910678dfe27f-kubelet-dir\") pod \"revision-pruner-11-crc\" (UID: \"1828fda1-8f85-4c8f-af39-910678dfe27f\") " pod="openshift-kube-apiserver/revision-pruner-11-crc"
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.200677 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-68cf44c8b8-7pgjs"
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.206080 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1828fda1-8f85-4c8f-af39-910678dfe27f-kube-api-access\") pod \"revision-pruner-11-crc\" (UID: \"1828fda1-8f85-4c8f-af39-910678dfe27f\") " pod="openshift-kube-apiserver/revision-pruner-11-crc"
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.206258 5184 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-7pgjs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 12 16:53:01 crc kubenswrapper[5184]: [-]has-synced failed: reason withheld
Mar 12 16:53:01 crc kubenswrapper[5184]: [+]process-running ok
Mar 12 16:53:01 crc kubenswrapper[5184]: healthz check failed
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.206282 5184 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-7pgjs" podUID="e1483fd4-8f3f-4326-874c-19e9c796d809" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.265247 5184 generic.go:358] "Generic (PLEG): container finished" podID="4ed8bc0c-5406-4893-9949-1342c8eb210c" containerID="9aa194baa8ab62dfc9c80df2abf156c1ba66b50d9469d7736c07bb95e454c737" exitCode=0
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.265598 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zt24v" event={"ID":"4ed8bc0c-5406-4893-9949-1342c8eb210c","Type":"ContainerDied","Data":"9aa194baa8ab62dfc9c80df2abf156c1ba66b50d9469d7736c07bb95e454c737"}
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.265625 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zt24v" event={"ID":"4ed8bc0c-5406-4893-9949-1342c8eb210c","Type":"ContainerStarted","Data":"1ba617cd16c7237854758e169428ac6bf1949387a43cc08a6b9f2694ce8716f7"}
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.271366 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555565-ms5vz"
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.279526 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-11-crc"
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.280541 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555565-ms5vz" event={"ID":"8ea329fb-a095-4645-b64f-a5769efa6364","Type":"ContainerDied","Data":"391b81a450cd67b5be9aa3f9a77494b116252263267e2b626b6d0ae0e8dd9045"}
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.280607 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="391b81a450cd67b5be9aa3f9a77494b116252263267e2b626b6d0ae0e8dd9045"
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.290065 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qv8bf\" (UniqueName: \"kubernetes.io/projected/45fdd519-1630-4afa-9780-7325691d8206-kube-api-access-qv8bf\") pod \"redhat-operators-cx6mk\" (UID: \"45fdd519-1630-4afa-9780-7325691d8206\") " pod="openshift-marketplace/redhat-operators-cx6mk"
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.290120 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45fdd519-1630-4afa-9780-7325691d8206-catalog-content\") pod \"redhat-operators-cx6mk\" (UID: \"45fdd519-1630-4afa-9780-7325691d8206\") " pod="openshift-marketplace/redhat-operators-cx6mk"
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.290162 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45fdd519-1630-4afa-9780-7325691d8206-utilities\") pod \"redhat-operators-cx6mk\" (UID: \"45fdd519-1630-4afa-9780-7325691d8206\") " pod="openshift-marketplace/redhat-operators-cx6mk"
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.290586 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45fdd519-1630-4afa-9780-7325691d8206-utilities\") pod \"redhat-operators-cx6mk\" (UID: \"45fdd519-1630-4afa-9780-7325691d8206\") " pod="openshift-marketplace/redhat-operators-cx6mk"
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.292059 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6mx8" event={"ID":"8df70bc7-e513-41bf-94d8-5f79a9d10b64","Type":"ContainerStarted","Data":"a73baf1fd6a73cb75ad60effffe140705cce6a9aa946be52962b53484bef9403"}
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.292079 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6mx8" event={"ID":"8df70bc7-e513-41bf-94d8-5f79a9d10b64","Type":"ContainerStarted","Data":"05937773a3d9af59d71300ee09b09ae8c3378a952416db356f2fd084a0384e70"}
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.336262 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45fdd519-1630-4afa-9780-7325691d8206-catalog-content\") pod \"redhat-operators-cx6mk\" (UID: \"45fdd519-1630-4afa-9780-7325691d8206\") " pod="openshift-marketplace/redhat-operators-cx6mk"
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.376543 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv8bf\" (UniqueName: \"kubernetes.io/projected/45fdd519-1630-4afa-9780-7325691d8206-kube-api-access-qv8bf\") pod \"redhat-operators-cx6mk\" (UID: \"45fdd519-1630-4afa-9780-7325691d8206\") " pod="openshift-marketplace/redhat-operators-cx6mk"
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.380417 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66587d64c8-dlsx9"]
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.398776 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cx6mk"
Mar 12 16:53:01 crc kubenswrapper[5184]: W0312 16:53:01.422181 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82e2099d_a6d8_488e_8144_b2ed728725e2.slice/crio-7da9fa4af14f36d725116a220038f3598cc50c5c0345f1430e6a864cfecd920e WatchSource:0}: Error finding container 7da9fa4af14f36d725116a220038f3598cc50c5c0345f1430e6a864cfecd920e: Status 404 returned error can't find the container with id 7da9fa4af14f36d725116a220038f3598cc50c5c0345f1430e6a864cfecd920e
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.459022 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zhkzs"]
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.469090 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zhkzs"
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.475583 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zhkzs"]
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.598958 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nmkf\" (UniqueName: \"kubernetes.io/projected/f2536f10-73e6-480a-9abb-2fd7a7e1a235-kube-api-access-5nmkf\") pod \"redhat-operators-zhkzs\" (UID: \"f2536f10-73e6-480a-9abb-2fd7a7e1a235\") " pod="openshift-marketplace/redhat-operators-zhkzs"
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.599030 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2536f10-73e6-480a-9abb-2fd7a7e1a235-utilities\") pod \"redhat-operators-zhkzs\" (UID: \"f2536f10-73e6-480a-9abb-2fd7a7e1a235\") " pod="openshift-marketplace/redhat-operators-zhkzs"
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.599125 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2536f10-73e6-480a-9abb-2fd7a7e1a235-catalog-content\") pod \"redhat-operators-zhkzs\" (UID: \"f2536f10-73e6-480a-9abb-2fd7a7e1a235\") " pod="openshift-marketplace/redhat-operators-zhkzs"
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.661259 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-11-crc"]
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.677020 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/revision-pruner-6-crc"
Mar 12 16:53:01 crc kubenswrapper[5184]: W0312 16:53:01.679054 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1828fda1_8f85_4c8f_af39_910678dfe27f.slice/crio-c7d74604b29388641366cb7c7766fc1b598894c8e4843d9a6568faa0d5c1cbd1 WatchSource:0}: Error finding container c7d74604b29388641366cb7c7766fc1b598894c8e4843d9a6568faa0d5c1cbd1: Status 404 returned error can't find the container with id c7d74604b29388641366cb7c7766fc1b598894c8e4843d9a6568faa0d5c1cbd1
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.703064 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2536f10-73e6-480a-9abb-2fd7a7e1a235-utilities\") pod \"redhat-operators-zhkzs\" (UID: \"f2536f10-73e6-480a-9abb-2fd7a7e1a235\") " pod="openshift-marketplace/redhat-operators-zhkzs"
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.703153 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2536f10-73e6-480a-9abb-2fd7a7e1a235-catalog-content\") pod \"redhat-operators-zhkzs\" (UID: \"f2536f10-73e6-480a-9abb-2fd7a7e1a235\") " pod="openshift-marketplace/redhat-operators-zhkzs"
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.703183 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5nmkf\" (UniqueName: \"kubernetes.io/projected/f2536f10-73e6-480a-9abb-2fd7a7e1a235-kube-api-access-5nmkf\") pod \"redhat-operators-zhkzs\" (UID: \"f2536f10-73e6-480a-9abb-2fd7a7e1a235\") " pod="openshift-marketplace/redhat-operators-zhkzs"
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.703935 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2536f10-73e6-480a-9abb-2fd7a7e1a235-utilities\") pod \"redhat-operators-zhkzs\" (UID: \"f2536f10-73e6-480a-9abb-2fd7a7e1a235\") " pod="openshift-marketplace/redhat-operators-zhkzs"
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.704168 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2536f10-73e6-480a-9abb-2fd7a7e1a235-catalog-content\") pod \"redhat-operators-zhkzs\" (UID: \"f2536f10-73e6-480a-9abb-2fd7a7e1a235\") " pod="openshift-marketplace/redhat-operators-zhkzs"
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.717733 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-64d44f6ddf-qxthf"
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.723451 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-64d44f6ddf-qxthf"
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.721456 5184 patch_prober.go:28] interesting pod/console-64d44f6ddf-qxthf container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.41:8443/health\": dial tcp 10.217.0.41:8443: connect: connection refused" start-of-body=
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.723729 5184 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-64d44f6ddf-qxthf" podUID="c42a2703-d32e-41a7-accf-68b6e5d8c000" containerName="console" probeResult="failure" output="Get \"https://10.217.0.41:8443/health\": dial tcp 10.217.0.41:8443: connect: connection refused"
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.730075 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nmkf\" (UniqueName: \"kubernetes.io/projected/f2536f10-73e6-480a-9abb-2fd7a7e1a235-kube-api-access-5nmkf\") pod \"redhat-operators-zhkzs\" (UID: \"f2536f10-73e6-480a-9abb-2fd7a7e1a235\") " pod="openshift-marketplace/redhat-operators-zhkzs"
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.770634 5184 ???:1] "http: TLS handshake error from 192.168.126.11:60630: no serving certificate available for the kubelet"
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.806906 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e66e5048-2e16-4496-a386-f7261e4685ea-kubelet-dir\") pod \"e66e5048-2e16-4496-a386-f7261e4685ea\" (UID: \"e66e5048-2e16-4496-a386-f7261e4685ea\") "
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.807020 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e66e5048-2e16-4496-a386-f7261e4685ea-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e66e5048-2e16-4496-a386-f7261e4685ea" (UID: "e66e5048-2e16-4496-a386-f7261e4685ea"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.807192 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e66e5048-2e16-4496-a386-f7261e4685ea-kube-api-access\") pod \"e66e5048-2e16-4496-a386-f7261e4685ea\" (UID: \"e66e5048-2e16-4496-a386-f7261e4685ea\") "
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.807880 5184 reconciler_common.go:299] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e66e5048-2e16-4496-a386-f7261e4685ea-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.821557 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e66e5048-2e16-4496-a386-f7261e4685ea-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e66e5048-2e16-4496-a386-f7261e4685ea" (UID: "e66e5048-2e16-4496-a386-f7261e4685ea"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.832786 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zhkzs"
Mar 12 16:53:01 crc kubenswrapper[5184]: I0312 16:53:01.908916 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e66e5048-2e16-4496-a386-f7261e4685ea-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 12 16:53:02 crc kubenswrapper[5184]: I0312 16:53:02.010942 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cx6mk"]
Mar 12 16:53:02 crc kubenswrapper[5184]: I0312 16:53:02.081868 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zhkzs"]
Mar 12 16:53:02 crc kubenswrapper[5184]: W0312 16:53:02.114484 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2536f10_73e6_480a_9abb_2fd7a7e1a235.slice/crio-caeb2e5fe1e893a88da6f83e36df6a85da9fa914f65edaf21a6de81d965e659f WatchSource:0}: Error finding container caeb2e5fe1e893a88da6f83e36df6a85da9fa914f65edaf21a6de81d965e659f: Status 404 returned error can't find the container with id caeb2e5fe1e893a88da6f83e36df6a85da9fa914f65edaf21a6de81d965e659f
Mar 12 16:53:02 crc kubenswrapper[5184]: I0312 16:53:02.202548 5184 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-7pgjs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 12 16:53:02 crc kubenswrapper[5184]: [-]has-synced failed: reason withheld
Mar 12 16:53:02 crc kubenswrapper[5184]: [+]process-running ok
Mar 12 16:53:02 crc kubenswrapper[5184]: healthz check failed
Mar 12 16:53:02 crc kubenswrapper[5184]: I0312 16:53:02.202630 5184 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-7pgjs" podUID="e1483fd4-8f3f-4326-874c-19e9c796d809" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 12 16:53:02 crc kubenswrapper[5184]: I0312 16:53:02.301023 5184 generic.go:358] "Generic (PLEG): container finished" podID="45fdd519-1630-4afa-9780-7325691d8206" containerID="4e652cc7bd8111fa6ae207645ca7e190e077446464c7c723c08598922e17027d" exitCode=0
Mar 12 16:53:02 crc kubenswrapper[5184]: I0312 16:53:02.301161 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cx6mk" event={"ID":"45fdd519-1630-4afa-9780-7325691d8206","Type":"ContainerDied","Data":"4e652cc7bd8111fa6ae207645ca7e190e077446464c7c723c08598922e17027d"}
Mar 12 16:53:02 crc kubenswrapper[5184]: I0312 16:53:02.301187 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cx6mk" event={"ID":"45fdd519-1630-4afa-9780-7325691d8206","Type":"ContainerStarted","Data":"4ea01e3d73dec474a2b62419f36d08c7bdd9b848d9d785f3349dbe9490ad3a91"}
Mar 12 16:53:02 crc kubenswrapper[5184]: I0312 16:53:02.306424 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6mx8" event={"ID":"8df70bc7-e513-41bf-94d8-5f79a9d10b64","Type":"ContainerDied","Data":"a73baf1fd6a73cb75ad60effffe140705cce6a9aa946be52962b53484bef9403"}
Mar 12 16:53:02 crc kubenswrapper[5184]: I0312 16:53:02.306254 5184 generic.go:358] "Generic (PLEG): container finished" podID="8df70bc7-e513-41bf-94d8-5f79a9d10b64" containerID="a73baf1fd6a73cb75ad60effffe140705cce6a9aa946be52962b53484bef9403" exitCode=0
Mar 12 16:53:02 crc kubenswrapper[5184]: I0312 16:53:02.315350 5184 generic.go:358] "Generic (PLEG): container finished" podID="f2536f10-73e6-480a-9abb-2fd7a7e1a235" containerID="bbd73e6b8e94baf2d0c945fb35b4bd2fd4f2e636ad902ebc523e1d05cfb10d5f" exitCode=0
Mar 12 16:53:02 crc kubenswrapper[5184]: I0312 16:53:02.315484 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zhkzs" event={"ID":"f2536f10-73e6-480a-9abb-2fd7a7e1a235","Type":"ContainerDied","Data":"bbd73e6b8e94baf2d0c945fb35b4bd2fd4f2e636ad902ebc523e1d05cfb10d5f"}
Mar 12 16:53:02 crc kubenswrapper[5184]: I0312 16:53:02.315535 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zhkzs" event={"ID":"f2536f10-73e6-480a-9abb-2fd7a7e1a235","Type":"ContainerStarted","Data":"caeb2e5fe1e893a88da6f83e36df6a85da9fa914f65edaf21a6de81d965e659f"}
Mar 12 16:53:02 crc kubenswrapper[5184]: I0312 16:53:02.317558 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" event={"ID":"82e2099d-a6d8-488e-8144-b2ed728725e2","Type":"ContainerStarted","Data":"e32c6befe877e79950bc18dd6b9d090f73d6591d5932b62afd6e308cc060228f"}
Mar 12 16:53:02 crc kubenswrapper[5184]: I0312 16:53:02.317594 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" event={"ID":"82e2099d-a6d8-488e-8144-b2ed728725e2","Type":"ContainerStarted","Data":"7da9fa4af14f36d725116a220038f3598cc50c5c0345f1430e6a864cfecd920e"}
Mar 12 16:53:02 crc kubenswrapper[5184]: I0312 16:53:02.318049 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-66587d64c8-dlsx9"
Mar 12 16:53:02 crc kubenswrapper[5184]: I0312 16:53:02.319622 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-11-crc" event={"ID":"1828fda1-8f85-4c8f-af39-910678dfe27f","Type":"ContainerStarted","Data":"4f51492face22e3a0f25fd17e648a0b417fd7e8a4f8213482103bfedb6d97f36"}
Mar 12 16:53:02 crc kubenswrapper[5184]: I0312 16:53:02.319652 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-11-crc" event={"ID":"1828fda1-8f85-4c8f-af39-910678dfe27f","Type":"ContainerStarted","Data":"c7d74604b29388641366cb7c7766fc1b598894c8e4843d9a6568faa0d5c1cbd1"}
Mar 12 16:53:02 crc kubenswrapper[5184]: I0312 16:53:02.327251 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/revision-pruner-6-crc"
Mar 12 16:53:02 crc kubenswrapper[5184]: I0312 16:53:02.327340 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/revision-pruner-6-crc" event={"ID":"e66e5048-2e16-4496-a386-f7261e4685ea","Type":"ContainerDied","Data":"f0f5a2111e75e5b5c277d0f65faaaf9ad4a46468b21f0caef1ac7c723a2259b4"}
Mar 12 16:53:02 crc kubenswrapper[5184]: I0312 16:53:02.327459 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0f5a2111e75e5b5c277d0f65faaaf9ad4a46468b21f0caef1ac7c723a2259b4"
Mar 12 16:53:02 crc kubenswrapper[5184]: I0312 16:53:02.363622 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-11-crc" podStartSLOduration=2.363603638 podStartE2EDuration="2.363603638s" podCreationTimestamp="2026-03-12 16:53:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:53:02.35111227 +0000 UTC m=+124.892423619" watchObservedRunningTime="2026-03-12 16:53:02.363603638 +0000 UTC m=+124.904914977"
Mar 12 16:53:02 crc kubenswrapper[5184]: I0312 16:53:02.390857 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" podStartSLOduration=103.390837623 podStartE2EDuration="1m43.390837623s" podCreationTimestamp="2026-03-12 16:51:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:53:02.389331185 +0000 UTC m=+124.930642554" watchObservedRunningTime="2026-03-12 16:53:02.390837623 +0000 UTC m=+124.932148962"
Mar 12 16:53:02 crc kubenswrapper[5184]: I0312 16:53:02.413186 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e9b5059-1b3e-4067-a63d-2952cbe863af" path="/var/lib/kubelet/pods/9e9b5059-1b3e-4067-a63d-2952cbe863af/volumes"
Mar 12 16:53:02 crc kubenswrapper[5184]: I0312 16:53:02.782486 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-x7z5d"
Mar 12 16:53:02 crc kubenswrapper[5184]: I0312 16:53:02.782526 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv"
Mar 12 16:53:02 crc kubenswrapper[5184]: I0312 16:53:02.794984 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv"
Mar 12 16:53:02 crc kubenswrapper[5184]: I0312 16:53:02.796030 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-x7z5d"
Mar 12 16:53:03 crc kubenswrapper[5184]: I0312 16:53:03.202287 5184 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-7pgjs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 12 16:53:03 crc kubenswrapper[5184]: [-]has-synced failed: reason withheld
Mar 12 16:53:03 crc kubenswrapper[5184]: [+]process-running ok
Mar 12 16:53:03 crc kubenswrapper[5184]: healthz check failed
Mar 12 16:53:03 crc kubenswrapper[5184]: I0312 16:53:03.202345 5184 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-7pgjs" podUID="e1483fd4-8f3f-4326-874c-19e9c796d809" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 12 16:53:03 crc kubenswrapper[5184]: I0312 16:53:03.335409 5184 generic.go:358] "Generic (PLEG): container finished" podID="1828fda1-8f85-4c8f-af39-910678dfe27f" containerID="4f51492face22e3a0f25fd17e648a0b417fd7e8a4f8213482103bfedb6d97f36" exitCode=0
Mar 12 16:53:03 crc kubenswrapper[5184]: I0312 16:53:03.336514 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-11-crc" event={"ID":"1828fda1-8f85-4c8f-af39-910678dfe27f","Type":"ContainerDied","Data":"4f51492face22e3a0f25fd17e648a0b417fd7e8a4f8213482103bfedb6d97f36"}
Mar 12 16:53:04 crc kubenswrapper[5184]: I0312 16:53:04.187476 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-f2fdq"
Mar 12 16:53:04 crc kubenswrapper[5184]: I0312 16:53:04.203202 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-68cf44c8b8-7pgjs"
Mar 12 16:53:04 crc kubenswrapper[5184]: I0312 16:53:04.207110 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-68cf44c8b8-7pgjs"
Mar 12 16:53:04 crc kubenswrapper[5184]: I0312 16:53:04.710544 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-11-crc"
Mar 12 16:53:04 crc kubenswrapper[5184]: I0312 16:53:04.854357 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1828fda1-8f85-4c8f-af39-910678dfe27f-kubelet-dir\") pod \"1828fda1-8f85-4c8f-af39-910678dfe27f\" (UID: \"1828fda1-8f85-4c8f-af39-910678dfe27f\") "
Mar 12 16:53:04 crc kubenswrapper[5184]: I0312 16:53:04.855027 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1828fda1-8f85-4c8f-af39-910678dfe27f-kube-api-access\") pod \"1828fda1-8f85-4c8f-af39-910678dfe27f\" (UID: \"1828fda1-8f85-4c8f-af39-910678dfe27f\") "
Mar 12 16:53:04 crc kubenswrapper[5184]: I0312 16:53:04.854487 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1828fda1-8f85-4c8f-af39-910678dfe27f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1828fda1-8f85-4c8f-af39-910678dfe27f" (UID: "1828fda1-8f85-4c8f-af39-910678dfe27f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 12 16:53:04 crc kubenswrapper[5184]: I0312 16:53:04.855278 5184 reconciler_common.go:299] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1828fda1-8f85-4c8f-af39-910678dfe27f-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 12 16:53:04 crc kubenswrapper[5184]: I0312 16:53:04.874329 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1828fda1-8f85-4c8f-af39-910678dfe27f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1828fda1-8f85-4c8f-af39-910678dfe27f" (UID: "1828fda1-8f85-4c8f-af39-910678dfe27f"). InnerVolumeSpecName "kube-api-access".
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:53:04 crc kubenswrapper[5184]: I0312 16:53:04.956728 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1828fda1-8f85-4c8f-af39-910678dfe27f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 12 16:53:05 crc kubenswrapper[5184]: I0312 16:53:05.363951 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-11-crc" event={"ID":"1828fda1-8f85-4c8f-af39-910678dfe27f","Type":"ContainerDied","Data":"c7d74604b29388641366cb7c7766fc1b598894c8e4843d9a6568faa0d5c1cbd1"} Mar 12 16:53:05 crc kubenswrapper[5184]: I0312 16:53:05.363985 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-11-crc" Mar 12 16:53:05 crc kubenswrapper[5184]: I0312 16:53:05.363994 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7d74604b29388641366cb7c7766fc1b598894c8e4843d9a6568faa0d5c1cbd1" Mar 12 16:53:06 crc kubenswrapper[5184]: E0312 16:53:06.037066 5184 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="33f536e52d195ff2fcade83b7dd83eab23beb4f297d841b0ca600526a0ae7614" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 12 16:53:06 crc kubenswrapper[5184]: E0312 16:53:06.038211 5184 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="33f536e52d195ff2fcade83b7dd83eab23beb4f297d841b0ca600526a0ae7614" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 12 16:53:06 crc kubenswrapper[5184]: E0312 16:53:06.039878 5184 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = 
command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="33f536e52d195ff2fcade83b7dd83eab23beb4f297d841b0ca600526a0ae7614" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 12 16:53:06 crc kubenswrapper[5184]: E0312 16:53:06.039908 5184 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-fm2vq" podUID="4840f833-3dce-444b-8cad-3a7374af30e7" containerName="kube-multus-additional-cni-plugins" probeResult="unknown" Mar 12 16:53:06 crc kubenswrapper[5184]: I0312 16:53:06.149529 5184 patch_prober.go:28] interesting pod/downloads-747b44746d-t6987 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Mar 12 16:53:06 crc kubenswrapper[5184]: I0312 16:53:06.149651 5184 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-747b44746d-t6987" podUID="6f45ff33-e60b-4885-ac63-5ab182bf6320" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" Mar 12 16:53:06 crc kubenswrapper[5184]: I0312 16:53:06.916278 5184 ???:1] "http: TLS handshake error from 192.168.126.11:60634: no serving certificate available for the kubelet" Mar 12 16:53:07 crc kubenswrapper[5184]: I0312 16:53:07.156399 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-547dbd544d-dpld6" Mar 12 16:53:11 crc kubenswrapper[5184]: I0312 16:53:11.725444 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-64d44f6ddf-qxthf" Mar 12 16:53:11 crc kubenswrapper[5184]: I0312 16:53:11.737154 5184 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-64d44f6ddf-qxthf" Mar 12 16:53:14 crc kubenswrapper[5184]: I0312 16:53:14.371931 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 16:53:16 crc kubenswrapper[5184]: E0312 16:53:16.037025 5184 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="33f536e52d195ff2fcade83b7dd83eab23beb4f297d841b0ca600526a0ae7614" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 12 16:53:16 crc kubenswrapper[5184]: E0312 16:53:16.039390 5184 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="33f536e52d195ff2fcade83b7dd83eab23beb4f297d841b0ca600526a0ae7614" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 12 16:53:16 crc kubenswrapper[5184]: E0312 16:53:16.041292 5184 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="33f536e52d195ff2fcade83b7dd83eab23beb4f297d841b0ca600526a0ae7614" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 12 16:53:16 crc kubenswrapper[5184]: E0312 16:53:16.041407 5184 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-fm2vq" podUID="4840f833-3dce-444b-8cad-3a7374af30e7" containerName="kube-multus-additional-cni-plugins" probeResult="unknown" Mar 12 16:53:16 crc kubenswrapper[5184]: I0312 16:53:16.159305 5184 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-console/downloads-747b44746d-t6987" Mar 12 16:53:16 crc kubenswrapper[5184]: I0312 16:53:16.771906 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-65b6cccf98-bbnrv"] Mar 12 16:53:16 crc kubenswrapper[5184]: I0312 16:53:16.772655 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-65b6cccf98-bbnrv" podUID="17eed63d-a9fc-414e-9c70-347b51893cfa" containerName="controller-manager" containerID="cri-o://e24cc4e11b3d0234458860254c11bc775f91db3fdc483a595a9056e5efcf156e" gracePeriod=30 Mar 12 16:53:16 crc kubenswrapper[5184]: I0312 16:53:16.809477 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-776cdc94d6-x7z5d"] Mar 12 16:53:16 crc kubenswrapper[5184]: I0312 16:53:16.809938 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-x7z5d" podUID="a1d9df18-d5a1-447d-ad5a-fdef055a830a" containerName="route-controller-manager" containerID="cri-o://818f2fbc5e6792c0f16efebf57cf83caa76a17d46ab3de37a9b0a5f5ff767e83" gracePeriod=30 Mar 12 16:53:17 crc kubenswrapper[5184]: I0312 16:53:17.184289 5184 ???:1] "http: TLS handshake error from 192.168.126.11:55904: no serving certificate available for the kubelet" Mar 12 16:53:17 crc kubenswrapper[5184]: I0312 16:53:17.431164 5184 generic.go:358] "Generic (PLEG): container finished" podID="a1d9df18-d5a1-447d-ad5a-fdef055a830a" containerID="818f2fbc5e6792c0f16efebf57cf83caa76a17d46ab3de37a9b0a5f5ff767e83" exitCode=0 Mar 12 16:53:17 crc kubenswrapper[5184]: I0312 16:53:17.431233 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-x7z5d" 
event={"ID":"a1d9df18-d5a1-447d-ad5a-fdef055a830a","Type":"ContainerDied","Data":"818f2fbc5e6792c0f16efebf57cf83caa76a17d46ab3de37a9b0a5f5ff767e83"} Mar 12 16:53:17 crc kubenswrapper[5184]: I0312 16:53:17.434052 5184 generic.go:358] "Generic (PLEG): container finished" podID="17eed63d-a9fc-414e-9c70-347b51893cfa" containerID="e24cc4e11b3d0234458860254c11bc775f91db3fdc483a595a9056e5efcf156e" exitCode=0 Mar 12 16:53:17 crc kubenswrapper[5184]: I0312 16:53:17.434110 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-65b6cccf98-bbnrv" event={"ID":"17eed63d-a9fc-414e-9c70-347b51893cfa","Type":"ContainerDied","Data":"e24cc4e11b3d0234458860254c11bc775f91db3fdc483a595a9056e5efcf156e"} Mar 12 16:53:20 crc kubenswrapper[5184]: I0312 16:53:20.905204 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-65b6cccf98-bbnrv" Mar 12 16:53:20 crc kubenswrapper[5184]: I0312 16:53:20.947071 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-x7z5d" Mar 12 16:53:20 crc kubenswrapper[5184]: I0312 16:53:20.960173 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-565b84484f-wf8fr"] Mar 12 16:53:20 crc kubenswrapper[5184]: I0312 16:53:20.960742 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a1d9df18-d5a1-447d-ad5a-fdef055a830a" containerName="route-controller-manager" Mar 12 16:53:20 crc kubenswrapper[5184]: I0312 16:53:20.960759 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1d9df18-d5a1-447d-ad5a-fdef055a830a" containerName="route-controller-manager" Mar 12 16:53:20 crc kubenswrapper[5184]: I0312 16:53:20.960773 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="17eed63d-a9fc-414e-9c70-347b51893cfa" containerName="controller-manager" Mar 12 16:53:20 crc kubenswrapper[5184]: I0312 16:53:20.960778 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="17eed63d-a9fc-414e-9c70-347b51893cfa" containerName="controller-manager" Mar 12 16:53:20 crc kubenswrapper[5184]: I0312 16:53:20.960795 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1828fda1-8f85-4c8f-af39-910678dfe27f" containerName="pruner" Mar 12 16:53:20 crc kubenswrapper[5184]: I0312 16:53:20.960801 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="1828fda1-8f85-4c8f-af39-910678dfe27f" containerName="pruner" Mar 12 16:53:20 crc kubenswrapper[5184]: I0312 16:53:20.960813 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e66e5048-2e16-4496-a386-f7261e4685ea" containerName="pruner" Mar 12 16:53:20 crc kubenswrapper[5184]: I0312 16:53:20.960817 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="e66e5048-2e16-4496-a386-f7261e4685ea" containerName="pruner" Mar 12 16:53:20 crc kubenswrapper[5184]: I0312 16:53:20.960904 5184 
memory_manager.go:356] "RemoveStaleState removing state" podUID="17eed63d-a9fc-414e-9c70-347b51893cfa" containerName="controller-manager" Mar 12 16:53:20 crc kubenswrapper[5184]: I0312 16:53:20.960915 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="1828fda1-8f85-4c8f-af39-910678dfe27f" containerName="pruner" Mar 12 16:53:20 crc kubenswrapper[5184]: I0312 16:53:20.960925 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="e66e5048-2e16-4496-a386-f7261e4685ea" containerName="pruner" Mar 12 16:53:20 crc kubenswrapper[5184]: I0312 16:53:20.960936 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="a1d9df18-d5a1-447d-ad5a-fdef055a830a" containerName="route-controller-manager" Mar 12 16:53:20 crc kubenswrapper[5184]: I0312 16:53:20.980651 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-565b84484f-wf8fr"] Mar 12 16:53:20 crc kubenswrapper[5184]: I0312 16:53:20.980813 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-565b84484f-wf8fr" Mar 12 16:53:20 crc kubenswrapper[5184]: I0312 16:53:20.990828 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64dd76cf5b-twbhz"] Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.037750 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17eed63d-a9fc-414e-9c70-347b51893cfa-client-ca\") pod \"17eed63d-a9fc-414e-9c70-347b51893cfa\" (UID: \"17eed63d-a9fc-414e-9c70-347b51893cfa\") " Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.038694 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a1d9df18-d5a1-447d-ad5a-fdef055a830a-client-ca\") pod \"a1d9df18-d5a1-447d-ad5a-fdef055a830a\" (UID: \"a1d9df18-d5a1-447d-ad5a-fdef055a830a\") " Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.038755 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17eed63d-a9fc-414e-9c70-347b51893cfa-config\") pod \"17eed63d-a9fc-414e-9c70-347b51893cfa\" (UID: \"17eed63d-a9fc-414e-9c70-347b51893cfa\") " Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.038804 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/17eed63d-a9fc-414e-9c70-347b51893cfa-proxy-ca-bundles\") pod \"17eed63d-a9fc-414e-9c70-347b51893cfa\" (UID: \"17eed63d-a9fc-414e-9c70-347b51893cfa\") " Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.038921 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1d9df18-d5a1-447d-ad5a-fdef055a830a-serving-cert\") pod \"a1d9df18-d5a1-447d-ad5a-fdef055a830a\" (UID: 
\"a1d9df18-d5a1-447d-ad5a-fdef055a830a\") " Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.039195 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/17eed63d-a9fc-414e-9c70-347b51893cfa-tmp\") pod \"17eed63d-a9fc-414e-9c70-347b51893cfa\" (UID: \"17eed63d-a9fc-414e-9c70-347b51893cfa\") " Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.039258 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrbv2\" (UniqueName: \"kubernetes.io/projected/a1d9df18-d5a1-447d-ad5a-fdef055a830a-kube-api-access-zrbv2\") pod \"a1d9df18-d5a1-447d-ad5a-fdef055a830a\" (UID: \"a1d9df18-d5a1-447d-ad5a-fdef055a830a\") " Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.039287 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17eed63d-a9fc-414e-9c70-347b51893cfa-serving-cert\") pod \"17eed63d-a9fc-414e-9c70-347b51893cfa\" (UID: \"17eed63d-a9fc-414e-9c70-347b51893cfa\") " Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.039351 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a1d9df18-d5a1-447d-ad5a-fdef055a830a-tmp\") pod \"a1d9df18-d5a1-447d-ad5a-fdef055a830a\" (UID: \"a1d9df18-d5a1-447d-ad5a-fdef055a830a\") " Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.039436 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1d9df18-d5a1-447d-ad5a-fdef055a830a-config\") pod \"a1d9df18-d5a1-447d-ad5a-fdef055a830a\" (UID: \"a1d9df18-d5a1-447d-ad5a-fdef055a830a\") " Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.039491 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mfvq\" (UniqueName: 
\"kubernetes.io/projected/17eed63d-a9fc-414e-9c70-347b51893cfa-kube-api-access-7mfvq\") pod \"17eed63d-a9fc-414e-9c70-347b51893cfa\" (UID: \"17eed63d-a9fc-414e-9c70-347b51893cfa\") " Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.039647 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3e775526-099c-4134-b128-2af393f0b3e9-proxy-ca-bundles\") pod \"controller-manager-565b84484f-wf8fr\" (UID: \"3e775526-099c-4134-b128-2af393f0b3e9\") " pod="openshift-controller-manager/controller-manager-565b84484f-wf8fr" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.039714 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fbw7\" (UniqueName: \"kubernetes.io/projected/3e775526-099c-4134-b128-2af393f0b3e9-kube-api-access-9fbw7\") pod \"controller-manager-565b84484f-wf8fr\" (UID: \"3e775526-099c-4134-b128-2af393f0b3e9\") " pod="openshift-controller-manager/controller-manager-565b84484f-wf8fr" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.039782 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3e775526-099c-4134-b128-2af393f0b3e9-client-ca\") pod \"controller-manager-565b84484f-wf8fr\" (UID: \"3e775526-099c-4134-b128-2af393f0b3e9\") " pod="openshift-controller-manager/controller-manager-565b84484f-wf8fr" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.039802 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e775526-099c-4134-b128-2af393f0b3e9-serving-cert\") pod \"controller-manager-565b84484f-wf8fr\" (UID: \"3e775526-099c-4134-b128-2af393f0b3e9\") " pod="openshift-controller-manager/controller-manager-565b84484f-wf8fr" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 
16:53:21.039845 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3e775526-099c-4134-b128-2af393f0b3e9-tmp\") pod \"controller-manager-565b84484f-wf8fr\" (UID: \"3e775526-099c-4134-b128-2af393f0b3e9\") " pod="openshift-controller-manager/controller-manager-565b84484f-wf8fr" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.039886 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e775526-099c-4134-b128-2af393f0b3e9-config\") pod \"controller-manager-565b84484f-wf8fr\" (UID: \"3e775526-099c-4134-b128-2af393f0b3e9\") " pod="openshift-controller-manager/controller-manager-565b84484f-wf8fr" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.041815 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1d9df18-d5a1-447d-ad5a-fdef055a830a-tmp" (OuterVolumeSpecName: "tmp") pod "a1d9df18-d5a1-447d-ad5a-fdef055a830a" (UID: "a1d9df18-d5a1-447d-ad5a-fdef055a830a"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.049323 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17eed63d-a9fc-414e-9c70-347b51893cfa-tmp" (OuterVolumeSpecName: "tmp") pod "17eed63d-a9fc-414e-9c70-347b51893cfa" (UID: "17eed63d-a9fc-414e-9c70-347b51893cfa"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.050110 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17eed63d-a9fc-414e-9c70-347b51893cfa-config" (OuterVolumeSpecName: "config") pod "17eed63d-a9fc-414e-9c70-347b51893cfa" (UID: "17eed63d-a9fc-414e-9c70-347b51893cfa"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.050213 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1d9df18-d5a1-447d-ad5a-fdef055a830a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a1d9df18-d5a1-447d-ad5a-fdef055a830a" (UID: "a1d9df18-d5a1-447d-ad5a-fdef055a830a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.051259 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17eed63d-a9fc-414e-9c70-347b51893cfa-client-ca" (OuterVolumeSpecName: "client-ca") pod "17eed63d-a9fc-414e-9c70-347b51893cfa" (UID: "17eed63d-a9fc-414e-9c70-347b51893cfa"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.051440 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17eed63d-a9fc-414e-9c70-347b51893cfa-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "17eed63d-a9fc-414e-9c70-347b51893cfa" (UID: "17eed63d-a9fc-414e-9c70-347b51893cfa"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.051581 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1d9df18-d5a1-447d-ad5a-fdef055a830a-client-ca" (OuterVolumeSpecName: "client-ca") pod "a1d9df18-d5a1-447d-ad5a-fdef055a830a" (UID: "a1d9df18-d5a1-447d-ad5a-fdef055a830a"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.051898 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1d9df18-d5a1-447d-ad5a-fdef055a830a-config" (OuterVolumeSpecName: "config") pod "a1d9df18-d5a1-447d-ad5a-fdef055a830a" (UID: "a1d9df18-d5a1-447d-ad5a-fdef055a830a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.058790 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17eed63d-a9fc-414e-9c70-347b51893cfa-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "17eed63d-a9fc-414e-9c70-347b51893cfa" (UID: "17eed63d-a9fc-414e-9c70-347b51893cfa"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.059126 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1d9df18-d5a1-447d-ad5a-fdef055a830a-kube-api-access-zrbv2" (OuterVolumeSpecName: "kube-api-access-zrbv2") pod "a1d9df18-d5a1-447d-ad5a-fdef055a830a" (UID: "a1d9df18-d5a1-447d-ad5a-fdef055a830a"). InnerVolumeSpecName "kube-api-access-zrbv2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.059497 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17eed63d-a9fc-414e-9c70-347b51893cfa-kube-api-access-7mfvq" (OuterVolumeSpecName: "kube-api-access-7mfvq") pod "17eed63d-a9fc-414e-9c70-347b51893cfa" (UID: "17eed63d-a9fc-414e-9c70-347b51893cfa"). InnerVolumeSpecName "kube-api-access-7mfvq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.116130 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64dd76cf5b-twbhz"] Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.116785 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-64dd76cf5b-twbhz" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.141785 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cb07ce8-1cf0-4b3f-a935-64190b256410-serving-cert\") pod \"route-controller-manager-64dd76cf5b-twbhz\" (UID: \"2cb07ce8-1cf0-4b3f-a935-64190b256410\") " pod="openshift-route-controller-manager/route-controller-manager-64dd76cf5b-twbhz" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.143097 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3e775526-099c-4134-b128-2af393f0b3e9-proxy-ca-bundles\") pod \"controller-manager-565b84484f-wf8fr\" (UID: \"3e775526-099c-4134-b128-2af393f0b3e9\") " pod="openshift-controller-manager/controller-manager-565b84484f-wf8fr" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.143320 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2cb07ce8-1cf0-4b3f-a935-64190b256410-tmp\") pod \"route-controller-manager-64dd76cf5b-twbhz\" (UID: \"2cb07ce8-1cf0-4b3f-a935-64190b256410\") " pod="openshift-route-controller-manager/route-controller-manager-64dd76cf5b-twbhz" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.143497 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2cb07ce8-1cf0-4b3f-a935-64190b256410-config\") pod \"route-controller-manager-64dd76cf5b-twbhz\" (UID: \"2cb07ce8-1cf0-4b3f-a935-64190b256410\") " pod="openshift-route-controller-manager/route-controller-manager-64dd76cf5b-twbhz" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.144459 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2cb07ce8-1cf0-4b3f-a935-64190b256410-client-ca\") pod \"route-controller-manager-64dd76cf5b-twbhz\" (UID: \"2cb07ce8-1cf0-4b3f-a935-64190b256410\") " pod="openshift-route-controller-manager/route-controller-manager-64dd76cf5b-twbhz" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.144597 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9fbw7\" (UniqueName: \"kubernetes.io/projected/3e775526-099c-4134-b128-2af393f0b3e9-kube-api-access-9fbw7\") pod \"controller-manager-565b84484f-wf8fr\" (UID: \"3e775526-099c-4134-b128-2af393f0b3e9\") " pod="openshift-controller-manager/controller-manager-565b84484f-wf8fr" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.144760 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7fj8\" (UniqueName: \"kubernetes.io/projected/2cb07ce8-1cf0-4b3f-a935-64190b256410-kube-api-access-t7fj8\") pod \"route-controller-manager-64dd76cf5b-twbhz\" (UID: \"2cb07ce8-1cf0-4b3f-a935-64190b256410\") " pod="openshift-route-controller-manager/route-controller-manager-64dd76cf5b-twbhz" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.144975 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3e775526-099c-4134-b128-2af393f0b3e9-client-ca\") pod \"controller-manager-565b84484f-wf8fr\" (UID: \"3e775526-099c-4134-b128-2af393f0b3e9\") " 
pod="openshift-controller-manager/controller-manager-565b84484f-wf8fr" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.145106 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e775526-099c-4134-b128-2af393f0b3e9-serving-cert\") pod \"controller-manager-565b84484f-wf8fr\" (UID: \"3e775526-099c-4134-b128-2af393f0b3e9\") " pod="openshift-controller-manager/controller-manager-565b84484f-wf8fr" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.145277 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3e775526-099c-4134-b128-2af393f0b3e9-tmp\") pod \"controller-manager-565b84484f-wf8fr\" (UID: \"3e775526-099c-4134-b128-2af393f0b3e9\") " pod="openshift-controller-manager/controller-manager-565b84484f-wf8fr" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.145434 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e775526-099c-4134-b128-2af393f0b3e9-config\") pod \"controller-manager-565b84484f-wf8fr\" (UID: \"3e775526-099c-4134-b128-2af393f0b3e9\") " pod="openshift-controller-manager/controller-manager-565b84484f-wf8fr" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.145702 5184 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a1d9df18-d5a1-447d-ad5a-fdef055a830a-tmp\") on node \"crc\" DevicePath \"\"" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.145807 5184 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1d9df18-d5a1-447d-ad5a-fdef055a830a-config\") on node \"crc\" DevicePath \"\"" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.145907 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7mfvq\" (UniqueName: 
\"kubernetes.io/projected/17eed63d-a9fc-414e-9c70-347b51893cfa-kube-api-access-7mfvq\") on node \"crc\" DevicePath \"\"" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.146085 5184 reconciler_common.go:299] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17eed63d-a9fc-414e-9c70-347b51893cfa-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.146196 5184 reconciler_common.go:299] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a1d9df18-d5a1-447d-ad5a-fdef055a830a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.146543 5184 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17eed63d-a9fc-414e-9c70-347b51893cfa-config\") on node \"crc\" DevicePath \"\"" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.146665 5184 reconciler_common.go:299] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/17eed63d-a9fc-414e-9c70-347b51893cfa-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.146794 5184 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1d9df18-d5a1-447d-ad5a-fdef055a830a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.146904 5184 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/17eed63d-a9fc-414e-9c70-347b51893cfa-tmp\") on node \"crc\" DevicePath \"\"" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.147010 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zrbv2\" (UniqueName: \"kubernetes.io/projected/a1d9df18-d5a1-447d-ad5a-fdef055a830a-kube-api-access-zrbv2\") on node \"crc\" DevicePath \"\"" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 
16:53:21.147152 5184 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17eed63d-a9fc-414e-9c70-347b51893cfa-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.148527 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3e775526-099c-4134-b128-2af393f0b3e9-proxy-ca-bundles\") pod \"controller-manager-565b84484f-wf8fr\" (UID: \"3e775526-099c-4134-b128-2af393f0b3e9\") " pod="openshift-controller-manager/controller-manager-565b84484f-wf8fr" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.148607 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3e775526-099c-4134-b128-2af393f0b3e9-client-ca\") pod \"controller-manager-565b84484f-wf8fr\" (UID: \"3e775526-099c-4134-b128-2af393f0b3e9\") " pod="openshift-controller-manager/controller-manager-565b84484f-wf8fr" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.153573 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e775526-099c-4134-b128-2af393f0b3e9-config\") pod \"controller-manager-565b84484f-wf8fr\" (UID: \"3e775526-099c-4134-b128-2af393f0b3e9\") " pod="openshift-controller-manager/controller-manager-565b84484f-wf8fr" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.160813 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3e775526-099c-4134-b128-2af393f0b3e9-tmp\") pod \"controller-manager-565b84484f-wf8fr\" (UID: \"3e775526-099c-4134-b128-2af393f0b3e9\") " pod="openshift-controller-manager/controller-manager-565b84484f-wf8fr" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.161685 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3e775526-099c-4134-b128-2af393f0b3e9-serving-cert\") pod \"controller-manager-565b84484f-wf8fr\" (UID: \"3e775526-099c-4134-b128-2af393f0b3e9\") " pod="openshift-controller-manager/controller-manager-565b84484f-wf8fr" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.174192 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fbw7\" (UniqueName: \"kubernetes.io/projected/3e775526-099c-4134-b128-2af393f0b3e9-kube-api-access-9fbw7\") pod \"controller-manager-565b84484f-wf8fr\" (UID: \"3e775526-099c-4134-b128-2af393f0b3e9\") " pod="openshift-controller-manager/controller-manager-565b84484f-wf8fr" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.248279 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cb07ce8-1cf0-4b3f-a935-64190b256410-serving-cert\") pod \"route-controller-manager-64dd76cf5b-twbhz\" (UID: \"2cb07ce8-1cf0-4b3f-a935-64190b256410\") " pod="openshift-route-controller-manager/route-controller-manager-64dd76cf5b-twbhz" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.248333 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2cb07ce8-1cf0-4b3f-a935-64190b256410-tmp\") pod \"route-controller-manager-64dd76cf5b-twbhz\" (UID: \"2cb07ce8-1cf0-4b3f-a935-64190b256410\") " pod="openshift-route-controller-manager/route-controller-manager-64dd76cf5b-twbhz" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.248362 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cb07ce8-1cf0-4b3f-a935-64190b256410-config\") pod \"route-controller-manager-64dd76cf5b-twbhz\" (UID: \"2cb07ce8-1cf0-4b3f-a935-64190b256410\") " pod="openshift-route-controller-manager/route-controller-manager-64dd76cf5b-twbhz" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 
16:53:21.248401 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2cb07ce8-1cf0-4b3f-a935-64190b256410-client-ca\") pod \"route-controller-manager-64dd76cf5b-twbhz\" (UID: \"2cb07ce8-1cf0-4b3f-a935-64190b256410\") " pod="openshift-route-controller-manager/route-controller-manager-64dd76cf5b-twbhz" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.248438 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t7fj8\" (UniqueName: \"kubernetes.io/projected/2cb07ce8-1cf0-4b3f-a935-64190b256410-kube-api-access-t7fj8\") pod \"route-controller-manager-64dd76cf5b-twbhz\" (UID: \"2cb07ce8-1cf0-4b3f-a935-64190b256410\") " pod="openshift-route-controller-manager/route-controller-manager-64dd76cf5b-twbhz" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.249175 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2cb07ce8-1cf0-4b3f-a935-64190b256410-tmp\") pod \"route-controller-manager-64dd76cf5b-twbhz\" (UID: \"2cb07ce8-1cf0-4b3f-a935-64190b256410\") " pod="openshift-route-controller-manager/route-controller-manager-64dd76cf5b-twbhz" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.249895 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2cb07ce8-1cf0-4b3f-a935-64190b256410-client-ca\") pod \"route-controller-manager-64dd76cf5b-twbhz\" (UID: \"2cb07ce8-1cf0-4b3f-a935-64190b256410\") " pod="openshift-route-controller-manager/route-controller-manager-64dd76cf5b-twbhz" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.250569 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cb07ce8-1cf0-4b3f-a935-64190b256410-config\") pod \"route-controller-manager-64dd76cf5b-twbhz\" (UID: \"2cb07ce8-1cf0-4b3f-a935-64190b256410\") 
" pod="openshift-route-controller-manager/route-controller-manager-64dd76cf5b-twbhz" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.252901 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cb07ce8-1cf0-4b3f-a935-64190b256410-serving-cert\") pod \"route-controller-manager-64dd76cf5b-twbhz\" (UID: \"2cb07ce8-1cf0-4b3f-a935-64190b256410\") " pod="openshift-route-controller-manager/route-controller-manager-64dd76cf5b-twbhz" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.276120 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7fj8\" (UniqueName: \"kubernetes.io/projected/2cb07ce8-1cf0-4b3f-a935-64190b256410-kube-api-access-t7fj8\") pod \"route-controller-manager-64dd76cf5b-twbhz\" (UID: \"2cb07ce8-1cf0-4b3f-a935-64190b256410\") " pod="openshift-route-controller-manager/route-controller-manager-64dd76cf5b-twbhz" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.296041 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-565b84484f-wf8fr" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.306076 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-64dd76cf5b-twbhz" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.476131 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cx6mk" event={"ID":"45fdd519-1630-4afa-9780-7325691d8206","Type":"ContainerStarted","Data":"71f593847d718ebf0236efe4e0c4daa8926b524ae4b680eb6a6c7f1f90f2dba5"} Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.492780 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-65b6cccf98-bbnrv" event={"ID":"17eed63d-a9fc-414e-9c70-347b51893cfa","Type":"ContainerDied","Data":"d987a68c425d83fc9e616e2b1a4161702b0b724114eb82206c902e349af33d3c"} Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.492800 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-65b6cccf98-bbnrv" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.492836 5184 scope.go:117] "RemoveContainer" containerID="e24cc4e11b3d0234458860254c11bc775f91db3fdc483a595a9056e5efcf156e" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.508095 5184 generic.go:358] "Generic (PLEG): container finished" podID="ce605906-7727-4682-83a5-e18f9faeb789" containerID="dd1913dbe819be5f4c8b9c603a67d678f8b7d37f07e9aea2a7fd4a66ad2b5b1b" exitCode=0 Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.508227 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nk4rr" event={"ID":"ce605906-7727-4682-83a5-e18f9faeb789","Type":"ContainerDied","Data":"dd1913dbe819be5f4c8b9c603a67d678f8b7d37f07e9aea2a7fd4a66ad2b5b1b"} Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.520233 5184 generic.go:358] "Generic (PLEG): container finished" podID="d91a31bb-cc85-4866-bb81-a3e9b0cc9362" containerID="f6cbf978e8fae74ddc55b5f87b919218867e1763a970c9e745969a7bc7cb1dc7" exitCode=0 Mar 
12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.521082 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqqp2" event={"ID":"d91a31bb-cc85-4866-bb81-a3e9b0cc9362","Type":"ContainerDied","Data":"f6cbf978e8fae74ddc55b5f87b919218867e1763a970c9e745969a7bc7cb1dc7"} Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.549189 5184 generic.go:358] "Generic (PLEG): container finished" podID="57daa144-a296-461d-8a95-4a0266b3a6b8" containerID="5c44083e9d2a4325811209495247ede4eefeb55de43964bd2bdabd759d650df9" exitCode=0 Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.549327 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-phh8l" event={"ID":"57daa144-a296-461d-8a95-4a0266b3a6b8","Type":"ContainerDied","Data":"5c44083e9d2a4325811209495247ede4eefeb55de43964bd2bdabd759d650df9"} Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.554212 5184 generic.go:358] "Generic (PLEG): container finished" podID="8df70bc7-e513-41bf-94d8-5f79a9d10b64" containerID="50bd2b86b160ace1601a202bec83216e38668fe85a7257a3db3b369e3d812cff" exitCode=0 Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.554304 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6mx8" event={"ID":"8df70bc7-e513-41bf-94d8-5f79a9d10b64","Type":"ContainerDied","Data":"50bd2b86b160ace1601a202bec83216e38668fe85a7257a3db3b369e3d812cff"} Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.565348 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zhkzs" event={"ID":"f2536f10-73e6-480a-9abb-2fd7a7e1a235","Type":"ContainerStarted","Data":"53d1a5bf920f82602c6621dc71551ebb20dd97b43060a9d0e884935bc9f86b83"} Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.568382 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-x7z5d" 
event={"ID":"a1d9df18-d5a1-447d-ad5a-fdef055a830a","Type":"ContainerDied","Data":"9a2061b94300167946e6762a8f804b3b2116ab934a8f7d396ffc2831b382917a"} Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.568533 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-x7z5d" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.575855 5184 scope.go:117] "RemoveContainer" containerID="818f2fbc5e6792c0f16efebf57cf83caa76a17d46ab3de37a9b0a5f5ff767e83" Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.580682 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64dd76cf5b-twbhz"] Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.581109 5184 generic.go:358] "Generic (PLEG): container finished" podID="27cbd345-0044-49d8-9192-d193df4c579e" containerID="3e373ec6b46dbb4cb1b00313b2f0b81f362d6a1c1233f90bd26bf1afa48bfe62" exitCode=0 Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.581263 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hwnbt" event={"ID":"27cbd345-0044-49d8-9192-d193df4c579e","Type":"ContainerDied","Data":"3e373ec6b46dbb4cb1b00313b2f0b81f362d6a1c1233f90bd26bf1afa48bfe62"} Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.586438 5184 generic.go:358] "Generic (PLEG): container finished" podID="4ed8bc0c-5406-4893-9949-1342c8eb210c" containerID="2697e0eb39fd6ca4e03ba55839aff521169c36c9d6bfbaf1a51667c594003760" exitCode=0 Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.586482 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zt24v" event={"ID":"4ed8bc0c-5406-4893-9949-1342c8eb210c","Type":"ContainerDied","Data":"2697e0eb39fd6ca4e03ba55839aff521169c36c9d6bfbaf1a51667c594003760"} Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.618430 5184 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-65b6cccf98-bbnrv"] Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.620201 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-65b6cccf98-bbnrv"] Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.680742 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-776cdc94d6-x7z5d"] Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.692048 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-776cdc94d6-x7z5d"] Mar 12 16:53:21 crc kubenswrapper[5184]: I0312 16:53:21.829713 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-565b84484f-wf8fr"] Mar 12 16:53:21 crc kubenswrapper[5184]: W0312 16:53:21.926633 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e775526_099c_4134_b128_2af393f0b3e9.slice/crio-f68b99a438ca61a2e233a3843d91581cafbcd85c3ba38b2bcf7e3fa4655076ff WatchSource:0}: Error finding container f68b99a438ca61a2e233a3843d91581cafbcd85c3ba38b2bcf7e3fa4655076ff: Status 404 returned error can't find the container with id f68b99a438ca61a2e233a3843d91581cafbcd85c3ba38b2bcf7e3fa4655076ff Mar 12 16:53:22 crc kubenswrapper[5184]: I0312 16:53:22.406217 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17eed63d-a9fc-414e-9c70-347b51893cfa" path="/var/lib/kubelet/pods/17eed63d-a9fc-414e-9c70-347b51893cfa/volumes" Mar 12 16:53:22 crc kubenswrapper[5184]: I0312 16:53:22.407057 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1d9df18-d5a1-447d-ad5a-fdef055a830a" path="/var/lib/kubelet/pods/a1d9df18-d5a1-447d-ad5a-fdef055a830a/volumes" Mar 12 16:53:22 crc kubenswrapper[5184]: I0312 16:53:22.593176 5184 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hwnbt" event={"ID":"27cbd345-0044-49d8-9192-d193df4c579e","Type":"ContainerStarted","Data":"ddfe97c00cb5848f902cf4e7d2fb994e57751b216cd123458d1422462332ba5f"} Mar 12 16:53:22 crc kubenswrapper[5184]: I0312 16:53:22.595308 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zt24v" event={"ID":"4ed8bc0c-5406-4893-9949-1342c8eb210c","Type":"ContainerStarted","Data":"ef1c22c19f8e78edd161093e1ea6b4c0a067ca0e7a22e3e00b845f632911bfbc"} Mar 12 16:53:22 crc kubenswrapper[5184]: I0312 16:53:22.597092 5184 generic.go:358] "Generic (PLEG): container finished" podID="45fdd519-1630-4afa-9780-7325691d8206" containerID="71f593847d718ebf0236efe4e0c4daa8926b524ae4b680eb6a6c7f1f90f2dba5" exitCode=0 Mar 12 16:53:22 crc kubenswrapper[5184]: I0312 16:53:22.597151 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cx6mk" event={"ID":"45fdd519-1630-4afa-9780-7325691d8206","Type":"ContainerDied","Data":"71f593847d718ebf0236efe4e0c4daa8926b524ae4b680eb6a6c7f1f90f2dba5"} Mar 12 16:53:22 crc kubenswrapper[5184]: I0312 16:53:22.600139 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-64dd76cf5b-twbhz" event={"ID":"2cb07ce8-1cf0-4b3f-a935-64190b256410","Type":"ContainerStarted","Data":"bf61f08ec042f9f4129fd00e4e72b547e103414a518efac4ee14962a8f47aad3"} Mar 12 16:53:22 crc kubenswrapper[5184]: I0312 16:53:22.600182 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-64dd76cf5b-twbhz" event={"ID":"2cb07ce8-1cf0-4b3f-a935-64190b256410","Type":"ContainerStarted","Data":"7e3dbc4923d8dd6b2de4c84444e7f28f80f688f20decb004ae434c4eecb70017"} Mar 12 16:53:22 crc kubenswrapper[5184]: I0312 16:53:22.600660 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-route-controller-manager/route-controller-manager-64dd76cf5b-twbhz" Mar 12 16:53:22 crc kubenswrapper[5184]: I0312 16:53:22.601703 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-565b84484f-wf8fr" event={"ID":"3e775526-099c-4134-b128-2af393f0b3e9","Type":"ContainerStarted","Data":"4ff698d56096c9c9fa5afc5ae18ccab705d0883dfb57f9151077948b331f1f0b"} Mar 12 16:53:22 crc kubenswrapper[5184]: I0312 16:53:22.601760 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-565b84484f-wf8fr" event={"ID":"3e775526-099c-4134-b128-2af393f0b3e9","Type":"ContainerStarted","Data":"f68b99a438ca61a2e233a3843d91581cafbcd85c3ba38b2bcf7e3fa4655076ff"} Mar 12 16:53:22 crc kubenswrapper[5184]: I0312 16:53:22.619147 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hwnbt" podStartSLOduration=3.131977939 podStartE2EDuration="24.619129706s" podCreationTimestamp="2026-03-12 16:52:58 +0000 UTC" firstStartedPulling="2026-03-12 16:52:59.19154793 +0000 UTC m=+121.732859269" lastFinishedPulling="2026-03-12 16:53:20.678699687 +0000 UTC m=+143.220011036" observedRunningTime="2026-03-12 16:53:22.618998322 +0000 UTC m=+145.160309661" watchObservedRunningTime="2026-03-12 16:53:22.619129706 +0000 UTC m=+145.160441045" Mar 12 16:53:22 crc kubenswrapper[5184]: I0312 16:53:22.621101 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nk4rr" event={"ID":"ce605906-7727-4682-83a5-e18f9faeb789","Type":"ContainerStarted","Data":"38e56a6b1c7063f2e3c6bb54d938ed57f9434b17f77a082c02bef910d5358da1"} Mar 12 16:53:22 crc kubenswrapper[5184]: I0312 16:53:22.623721 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqqp2" 
event={"ID":"d91a31bb-cc85-4866-bb81-a3e9b0cc9362","Type":"ContainerStarted","Data":"dd843d432e4b2ba3b720b6af22bf377a9531fd17026ea1970fa29aba50e8d535"} Mar 12 16:53:22 crc kubenswrapper[5184]: I0312 16:53:22.624738 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-controller-manager/controller-manager-565b84484f-wf8fr" Mar 12 16:53:22 crc kubenswrapper[5184]: I0312 16:53:22.628077 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-phh8l" event={"ID":"57daa144-a296-461d-8a95-4a0266b3a6b8","Type":"ContainerStarted","Data":"3c7401f47ea634a2cbc65b1ac6e32f619437d929de3b42d2ad2273d7f229dc7c"} Mar 12 16:53:22 crc kubenswrapper[5184]: I0312 16:53:22.630349 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6mx8" event={"ID":"8df70bc7-e513-41bf-94d8-5f79a9d10b64","Type":"ContainerStarted","Data":"7337229473e6aab1dfd2030dcf3b2053be4c0546e8ed408adadf15c56f5ed260"} Mar 12 16:53:22 crc kubenswrapper[5184]: I0312 16:53:22.632243 5184 generic.go:358] "Generic (PLEG): container finished" podID="f2536f10-73e6-480a-9abb-2fd7a7e1a235" containerID="53d1a5bf920f82602c6621dc71551ebb20dd97b43060a9d0e884935bc9f86b83" exitCode=0 Mar 12 16:53:22 crc kubenswrapper[5184]: I0312 16:53:22.632305 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zhkzs" event={"ID":"f2536f10-73e6-480a-9abb-2fd7a7e1a235","Type":"ContainerDied","Data":"53d1a5bf920f82602c6621dc71551ebb20dd97b43060a9d0e884935bc9f86b83"} Mar 12 16:53:22 crc kubenswrapper[5184]: I0312 16:53:22.647085 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-64dd76cf5b-twbhz" podStartSLOduration=6.647063494 podStartE2EDuration="6.647063494s" podCreationTimestamp="2026-03-12 16:53:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:53:22.646687502 +0000 UTC m=+145.187998861" watchObservedRunningTime="2026-03-12 16:53:22.647063494 +0000 UTC m=+145.188374843" Mar 12 16:53:22 crc kubenswrapper[5184]: I0312 16:53:22.656747 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-64dd76cf5b-twbhz" Mar 12 16:53:22 crc kubenswrapper[5184]: I0312 16:53:22.772191 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zt24v" podStartSLOduration=3.363946735 podStartE2EDuration="22.772169612s" podCreationTimestamp="2026-03-12 16:53:00 +0000 UTC" firstStartedPulling="2026-03-12 16:53:01.26636775 +0000 UTC m=+123.807679089" lastFinishedPulling="2026-03-12 16:53:20.674590617 +0000 UTC m=+143.215901966" observedRunningTime="2026-03-12 16:53:22.700667368 +0000 UTC m=+145.241978717" watchObservedRunningTime="2026-03-12 16:53:22.772169612 +0000 UTC m=+145.313480951" Mar 12 16:53:22 crc kubenswrapper[5184]: I0312 16:53:22.774212 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-phh8l" podStartSLOduration=4.351886104 podStartE2EDuration="24.774202196s" podCreationTimestamp="2026-03-12 16:52:58 +0000 UTC" firstStartedPulling="2026-03-12 16:53:00.263529893 +0000 UTC m=+122.804841232" lastFinishedPulling="2026-03-12 16:53:20.685845985 +0000 UTC m=+143.227157324" observedRunningTime="2026-03-12 16:53:22.771665836 +0000 UTC m=+145.312977175" watchObservedRunningTime="2026-03-12 16:53:22.774202196 +0000 UTC m=+145.315513535" Mar 12 16:53:22 crc kubenswrapper[5184]: I0312 16:53:22.817537 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h6mx8" podStartSLOduration=5.463824789 podStartE2EDuration="23.817519493s" podCreationTimestamp="2026-03-12 16:52:59 
+0000 UTC" firstStartedPulling="2026-03-12 16:53:02.307115685 +0000 UTC m=+124.848427014" lastFinishedPulling="2026-03-12 16:53:20.660810369 +0000 UTC m=+143.202121718" observedRunningTime="2026-03-12 16:53:22.81332489 +0000 UTC m=+145.354636229" watchObservedRunningTime="2026-03-12 16:53:22.817519493 +0000 UTC m=+145.358830832" Mar 12 16:53:22 crc kubenswrapper[5184]: I0312 16:53:22.919362 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nk4rr" podStartSLOduration=4.4281562 podStartE2EDuration="25.91934221s" podCreationTimestamp="2026-03-12 16:52:57 +0000 UTC" firstStartedPulling="2026-03-12 16:52:59.184752919 +0000 UTC m=+121.726064258" lastFinishedPulling="2026-03-12 16:53:20.675938909 +0000 UTC m=+143.217250268" observedRunningTime="2026-03-12 16:53:22.915796587 +0000 UTC m=+145.457107946" watchObservedRunningTime="2026-03-12 16:53:22.91934221 +0000 UTC m=+145.460653559" Mar 12 16:53:22 crc kubenswrapper[5184]: I0312 16:53:22.919503 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vqqp2" podStartSLOduration=4.507023355 podStartE2EDuration="24.919498385s" podCreationTimestamp="2026-03-12 16:52:58 +0000 UTC" firstStartedPulling="2026-03-12 16:53:00.250002952 +0000 UTC m=+122.791314291" lastFinishedPulling="2026-03-12 16:53:20.662477972 +0000 UTC m=+143.203789321" observedRunningTime="2026-03-12 16:53:22.898666903 +0000 UTC m=+145.439978242" watchObservedRunningTime="2026-03-12 16:53:22.919498385 +0000 UTC m=+145.460809734" Mar 12 16:53:22 crc kubenswrapper[5184]: I0312 16:53:22.953109 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-565b84484f-wf8fr" podStartSLOduration=6.9530865330000005 podStartE2EDuration="6.953086533s" podCreationTimestamp="2026-03-12 16:53:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:53:22.950548112 +0000 UTC m=+145.491859451" watchObservedRunningTime="2026-03-12 16:53:22.953086533 +0000 UTC m=+145.494397872" Mar 12 16:53:23 crc kubenswrapper[5184]: I0312 16:53:23.205581 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-565b84484f-wf8fr" Mar 12 16:53:23 crc kubenswrapper[5184]: I0312 16:53:23.342429 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 16:53:23 crc kubenswrapper[5184]: I0312 16:53:23.642414 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cx6mk" event={"ID":"45fdd519-1630-4afa-9780-7325691d8206","Type":"ContainerStarted","Data":"45b01fa1687f48aec19b380db4e47ef576619e14bfd9357453759de15ad3e3ba"} Mar 12 16:53:23 crc kubenswrapper[5184]: I0312 16:53:23.645125 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zhkzs" event={"ID":"f2536f10-73e6-480a-9abb-2fd7a7e1a235","Type":"ContainerStarted","Data":"c023a41054bca562b40b232e212b3e2f72c9d1715cddb0ff41dd961f845adabe"} Mar 12 16:53:23 crc kubenswrapper[5184]: I0312 16:53:23.674589 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cx6mk" podStartSLOduration=4.15420413 podStartE2EDuration="22.674560809s" podCreationTimestamp="2026-03-12 16:53:01 +0000 UTC" firstStartedPulling="2026-03-12 16:53:02.302526622 +0000 UTC m=+124.843837971" lastFinishedPulling="2026-03-12 16:53:20.822883311 +0000 UTC m=+143.364194650" observedRunningTime="2026-03-12 16:53:23.670651806 +0000 UTC m=+146.211963155" watchObservedRunningTime="2026-03-12 16:53:23.674560809 +0000 UTC m=+146.215872148" Mar 12 16:53:23 crc kubenswrapper[5184]: I0312 16:53:23.703066 5184 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zhkzs" podStartSLOduration=4.339975476 podStartE2EDuration="22.703040245s" podCreationTimestamp="2026-03-12 16:53:01 +0000 UTC" firstStartedPulling="2026-03-12 16:53:02.316089343 +0000 UTC m=+124.857400682" lastFinishedPulling="2026-03-12 16:53:20.679154102 +0000 UTC m=+143.220465451" observedRunningTime="2026-03-12 16:53:23.699783191 +0000 UTC m=+146.241094530" watchObservedRunningTime="2026-03-12 16:53:23.703040245 +0000 UTC m=+146.244351584" Mar 12 16:53:26 crc kubenswrapper[5184]: E0312 16:53:26.039827 5184 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="33f536e52d195ff2fcade83b7dd83eab23beb4f297d841b0ca600526a0ae7614" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 12 16:53:26 crc kubenswrapper[5184]: E0312 16:53:26.047940 5184 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="33f536e52d195ff2fcade83b7dd83eab23beb4f297d841b0ca600526a0ae7614" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 12 16:53:26 crc kubenswrapper[5184]: E0312 16:53:26.056008 5184 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="33f536e52d195ff2fcade83b7dd83eab23beb4f297d841b0ca600526a0ae7614" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 12 16:53:26 crc kubenswrapper[5184]: E0312 16:53:26.056093 5184 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-fm2vq" 
podUID="4840f833-3dce-444b-8cad-3a7374af30e7" containerName="kube-multus-additional-cni-plugins" probeResult="unknown" Mar 12 16:53:27 crc kubenswrapper[5184]: I0312 16:53:27.156959 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-f99dz" Mar 12 16:53:27 crc kubenswrapper[5184]: I0312 16:53:27.205787 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-fhkjl" Mar 12 16:53:28 crc kubenswrapper[5184]: I0312 16:53:28.189432 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nk4rr" Mar 12 16:53:28 crc kubenswrapper[5184]: I0312 16:53:28.189542 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/community-operators-nk4rr" Mar 12 16:53:28 crc kubenswrapper[5184]: I0312 16:53:28.550769 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/certified-operators-hwnbt" Mar 12 16:53:28 crc kubenswrapper[5184]: I0312 16:53:28.550841 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hwnbt" Mar 12 16:53:28 crc kubenswrapper[5184]: I0312 16:53:28.601507 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/community-operators-vqqp2" Mar 12 16:53:28 crc kubenswrapper[5184]: I0312 16:53:28.601566 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vqqp2" Mar 12 16:53:28 crc kubenswrapper[5184]: I0312 16:53:28.683694 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hwnbt" Mar 12 16:53:28 crc kubenswrapper[5184]: I0312 16:53:28.687978 5184 kubelet.go:2658] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/community-operators-vqqp2" Mar 12 16:53:28 crc kubenswrapper[5184]: I0312 16:53:28.688079 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nk4rr" Mar 12 16:53:28 crc kubenswrapper[5184]: I0312 16:53:28.724112 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nk4rr" Mar 12 16:53:28 crc kubenswrapper[5184]: I0312 16:53:28.729479 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hwnbt" Mar 12 16:53:28 crc kubenswrapper[5184]: I0312 16:53:28.746231 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vqqp2" Mar 12 16:53:28 crc kubenswrapper[5184]: I0312 16:53:28.891778 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/certified-operators-phh8l" Mar 12 16:53:28 crc kubenswrapper[5184]: I0312 16:53:28.891840 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-phh8l" Mar 12 16:53:28 crc kubenswrapper[5184]: I0312 16:53:28.940768 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-phh8l" Mar 12 16:53:29 crc kubenswrapper[5184]: E0312 16:53:29.261833 5184 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4840f833_3dce_444b_8cad_3a7374af30e7.slice/crio-33f536e52d195ff2fcade83b7dd83eab23beb4f297d841b0ca600526a0ae7614.scope\": RecentStats: unable to find data in memory cache]" Mar 12 16:53:29 crc kubenswrapper[5184]: I0312 16:53:29.682061 5184 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-fm2vq_4840f833-3dce-444b-8cad-3a7374af30e7/kube-multus-additional-cni-plugins/0.log" Mar 12 16:53:29 crc kubenswrapper[5184]: I0312 16:53:29.682112 5184 generic.go:358] "Generic (PLEG): container finished" podID="4840f833-3dce-444b-8cad-3a7374af30e7" containerID="33f536e52d195ff2fcade83b7dd83eab23beb4f297d841b0ca600526a0ae7614" exitCode=137 Mar 12 16:53:29 crc kubenswrapper[5184]: I0312 16:53:29.682230 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-fm2vq" event={"ID":"4840f833-3dce-444b-8cad-3a7374af30e7","Type":"ContainerDied","Data":"33f536e52d195ff2fcade83b7dd83eab23beb4f297d841b0ca600526a0ae7614"} Mar 12 16:53:29 crc kubenswrapper[5184]: I0312 16:53:29.736928 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-phh8l" Mar 12 16:53:30 crc kubenswrapper[5184]: I0312 16:53:30.195616 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-fm2vq_4840f833-3dce-444b-8cad-3a7374af30e7/kube-multus-additional-cni-plugins/0.log" Mar 12 16:53:30 crc kubenswrapper[5184]: I0312 16:53:30.195694 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-fm2vq" Mar 12 16:53:30 crc kubenswrapper[5184]: I0312 16:53:30.273041 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/4840f833-3dce-444b-8cad-3a7374af30e7-ready\") pod \"4840f833-3dce-444b-8cad-3a7374af30e7\" (UID: \"4840f833-3dce-444b-8cad-3a7374af30e7\") " Mar 12 16:53:30 crc kubenswrapper[5184]: I0312 16:53:30.273154 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4840f833-3dce-444b-8cad-3a7374af30e7-tuning-conf-dir\") pod \"4840f833-3dce-444b-8cad-3a7374af30e7\" (UID: \"4840f833-3dce-444b-8cad-3a7374af30e7\") " Mar 12 16:53:30 crc kubenswrapper[5184]: I0312 16:53:30.273283 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsfb2\" (UniqueName: \"kubernetes.io/projected/4840f833-3dce-444b-8cad-3a7374af30e7-kube-api-access-nsfb2\") pod \"4840f833-3dce-444b-8cad-3a7374af30e7\" (UID: \"4840f833-3dce-444b-8cad-3a7374af30e7\") " Mar 12 16:53:30 crc kubenswrapper[5184]: I0312 16:53:30.273345 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4840f833-3dce-444b-8cad-3a7374af30e7-cni-sysctl-allowlist\") pod \"4840f833-3dce-444b-8cad-3a7374af30e7\" (UID: \"4840f833-3dce-444b-8cad-3a7374af30e7\") " Mar 12 16:53:30 crc kubenswrapper[5184]: I0312 16:53:30.273348 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4840f833-3dce-444b-8cad-3a7374af30e7-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "4840f833-3dce-444b-8cad-3a7374af30e7" (UID: "4840f833-3dce-444b-8cad-3a7374af30e7"). InnerVolumeSpecName "tuning-conf-dir". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 12 16:53:30 crc kubenswrapper[5184]: I0312 16:53:30.273759 5184 reconciler_common.go:299] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4840f833-3dce-444b-8cad-3a7374af30e7-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Mar 12 16:53:30 crc kubenswrapper[5184]: I0312 16:53:30.274210 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4840f833-3dce-444b-8cad-3a7374af30e7-ready" (OuterVolumeSpecName: "ready") pod "4840f833-3dce-444b-8cad-3a7374af30e7" (UID: "4840f833-3dce-444b-8cad-3a7374af30e7"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:53:30 crc kubenswrapper[5184]: I0312 16:53:30.274579 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4840f833-3dce-444b-8cad-3a7374af30e7-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "4840f833-3dce-444b-8cad-3a7374af30e7" (UID: "4840f833-3dce-444b-8cad-3a7374af30e7"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:53:30 crc kubenswrapper[5184]: I0312 16:53:30.282408 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4840f833-3dce-444b-8cad-3a7374af30e7-kube-api-access-nsfb2" (OuterVolumeSpecName: "kube-api-access-nsfb2") pod "4840f833-3dce-444b-8cad-3a7374af30e7" (UID: "4840f833-3dce-444b-8cad-3a7374af30e7"). InnerVolumeSpecName "kube-api-access-nsfb2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:53:30 crc kubenswrapper[5184]: I0312 16:53:30.375074 5184 reconciler_common.go:299] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/4840f833-3dce-444b-8cad-3a7374af30e7-ready\") on node \"crc\" DevicePath \"\"" Mar 12 16:53:30 crc kubenswrapper[5184]: I0312 16:53:30.375115 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nsfb2\" (UniqueName: \"kubernetes.io/projected/4840f833-3dce-444b-8cad-3a7374af30e7-kube-api-access-nsfb2\") on node \"crc\" DevicePath \"\"" Mar 12 16:53:30 crc kubenswrapper[5184]: I0312 16:53:30.375131 5184 reconciler_common.go:299] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4840f833-3dce-444b-8cad-3a7374af30e7-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 12 16:53:30 crc kubenswrapper[5184]: I0312 16:53:30.510899 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vqqp2"] Mar 12 16:53:30 crc kubenswrapper[5184]: I0312 16:53:30.519052 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h6mx8" Mar 12 16:53:30 crc kubenswrapper[5184]: I0312 16:53:30.519115 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-marketplace-h6mx8" Mar 12 16:53:30 crc kubenswrapper[5184]: I0312 16:53:30.562325 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h6mx8" Mar 12 16:53:30 crc kubenswrapper[5184]: I0312 16:53:30.595634 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zt24v" Mar 12 16:53:30 crc kubenswrapper[5184]: I0312 16:53:30.595700 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-marketplace/redhat-marketplace-zt24v" Mar 12 16:53:30 crc kubenswrapper[5184]: I0312 16:53:30.626397 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zt24v" Mar 12 16:53:30 crc kubenswrapper[5184]: I0312 16:53:30.691784 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-fm2vq_4840f833-3dce-444b-8cad-3a7374af30e7/kube-multus-additional-cni-plugins/0.log" Mar 12 16:53:30 crc kubenswrapper[5184]: I0312 16:53:30.692327 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-fm2vq" Mar 12 16:53:30 crc kubenswrapper[5184]: I0312 16:53:30.692343 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-fm2vq" event={"ID":"4840f833-3dce-444b-8cad-3a7374af30e7","Type":"ContainerDied","Data":"0b978b30c0e1acb5a82743bafd0074aa28b88e7d42a2293ca045f786f56d7865"} Mar 12 16:53:30 crc kubenswrapper[5184]: I0312 16:53:30.692443 5184 scope.go:117] "RemoveContainer" containerID="33f536e52d195ff2fcade83b7dd83eab23beb4f297d841b0ca600526a0ae7614" Mar 12 16:53:30 crc kubenswrapper[5184]: I0312 16:53:30.693785 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vqqp2" podUID="d91a31bb-cc85-4866-bb81-a3e9b0cc9362" containerName="registry-server" containerID="cri-o://dd843d432e4b2ba3b720b6af22bf377a9531fd17026ea1970fa29aba50e8d535" gracePeriod=2 Mar 12 16:53:30 crc kubenswrapper[5184]: I0312 16:53:30.716946 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-fm2vq"] Mar 12 16:53:30 crc kubenswrapper[5184]: I0312 16:53:30.719969 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-fm2vq"] Mar 12 16:53:30 crc kubenswrapper[5184]: I0312 16:53:30.731918 5184 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h6mx8" Mar 12 16:53:30 crc kubenswrapper[5184]: I0312 16:53:30.739513 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zt24v" Mar 12 16:53:31 crc kubenswrapper[5184]: I0312 16:53:31.113786 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-phh8l"] Mar 12 16:53:31 crc kubenswrapper[5184]: I0312 16:53:31.292827 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vqqp2" Mar 12 16:53:31 crc kubenswrapper[5184]: I0312 16:53:31.390312 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d91a31bb-cc85-4866-bb81-a3e9b0cc9362-catalog-content\") pod \"d91a31bb-cc85-4866-bb81-a3e9b0cc9362\" (UID: \"d91a31bb-cc85-4866-bb81-a3e9b0cc9362\") " Mar 12 16:53:31 crc kubenswrapper[5184]: I0312 16:53:31.390380 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d91a31bb-cc85-4866-bb81-a3e9b0cc9362-utilities\") pod \"d91a31bb-cc85-4866-bb81-a3e9b0cc9362\" (UID: \"d91a31bb-cc85-4866-bb81-a3e9b0cc9362\") " Mar 12 16:53:31 crc kubenswrapper[5184]: I0312 16:53:31.390419 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rc5j6\" (UniqueName: \"kubernetes.io/projected/d91a31bb-cc85-4866-bb81-a3e9b0cc9362-kube-api-access-rc5j6\") pod \"d91a31bb-cc85-4866-bb81-a3e9b0cc9362\" (UID: \"d91a31bb-cc85-4866-bb81-a3e9b0cc9362\") " Mar 12 16:53:31 crc kubenswrapper[5184]: I0312 16:53:31.391485 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d91a31bb-cc85-4866-bb81-a3e9b0cc9362-utilities" (OuterVolumeSpecName: "utilities") pod 
"d91a31bb-cc85-4866-bb81-a3e9b0cc9362" (UID: "d91a31bb-cc85-4866-bb81-a3e9b0cc9362"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:53:31 crc kubenswrapper[5184]: I0312 16:53:31.391657 5184 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d91a31bb-cc85-4866-bb81-a3e9b0cc9362-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 16:53:31 crc kubenswrapper[5184]: I0312 16:53:31.396269 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d91a31bb-cc85-4866-bb81-a3e9b0cc9362-kube-api-access-rc5j6" (OuterVolumeSpecName: "kube-api-access-rc5j6") pod "d91a31bb-cc85-4866-bb81-a3e9b0cc9362" (UID: "d91a31bb-cc85-4866-bb81-a3e9b0cc9362"). InnerVolumeSpecName "kube-api-access-rc5j6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:53:31 crc kubenswrapper[5184]: I0312 16:53:31.399569 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cx6mk" Mar 12 16:53:31 crc kubenswrapper[5184]: I0312 16:53:31.400242 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-operators-cx6mk" Mar 12 16:53:31 crc kubenswrapper[5184]: I0312 16:53:31.440852 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cx6mk" Mar 12 16:53:31 crc kubenswrapper[5184]: I0312 16:53:31.472256 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d91a31bb-cc85-4866-bb81-a3e9b0cc9362-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d91a31bb-cc85-4866-bb81-a3e9b0cc9362" (UID: "d91a31bb-cc85-4866-bb81-a3e9b0cc9362"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:53:31 crc kubenswrapper[5184]: I0312 16:53:31.493128 5184 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d91a31bb-cc85-4866-bb81-a3e9b0cc9362-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 16:53:31 crc kubenswrapper[5184]: I0312 16:53:31.493158 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rc5j6\" (UniqueName: \"kubernetes.io/projected/d91a31bb-cc85-4866-bb81-a3e9b0cc9362-kube-api-access-rc5j6\") on node \"crc\" DevicePath \"\"" Mar 12 16:53:31 crc kubenswrapper[5184]: I0312 16:53:31.699679 5184 generic.go:358] "Generic (PLEG): container finished" podID="d91a31bb-cc85-4866-bb81-a3e9b0cc9362" containerID="dd843d432e4b2ba3b720b6af22bf377a9531fd17026ea1970fa29aba50e8d535" exitCode=0 Mar 12 16:53:31 crc kubenswrapper[5184]: I0312 16:53:31.699814 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqqp2" event={"ID":"d91a31bb-cc85-4866-bb81-a3e9b0cc9362","Type":"ContainerDied","Data":"dd843d432e4b2ba3b720b6af22bf377a9531fd17026ea1970fa29aba50e8d535"} Mar 12 16:53:31 crc kubenswrapper[5184]: I0312 16:53:31.699861 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqqp2" event={"ID":"d91a31bb-cc85-4866-bb81-a3e9b0cc9362","Type":"ContainerDied","Data":"3027bc4bf9533295fde235c2075bfb962e84b9d71304eb80da844b84c486fab2"} Mar 12 16:53:31 crc kubenswrapper[5184]: I0312 16:53:31.699882 5184 scope.go:117] "RemoveContainer" containerID="dd843d432e4b2ba3b720b6af22bf377a9531fd17026ea1970fa29aba50e8d535" Mar 12 16:53:31 crc kubenswrapper[5184]: I0312 16:53:31.699996 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vqqp2" Mar 12 16:53:31 crc kubenswrapper[5184]: I0312 16:53:31.700274 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-phh8l" podUID="57daa144-a296-461d-8a95-4a0266b3a6b8" containerName="registry-server" containerID="cri-o://3c7401f47ea634a2cbc65b1ac6e32f619437d929de3b42d2ad2273d7f229dc7c" gracePeriod=2 Mar 12 16:53:31 crc kubenswrapper[5184]: I0312 16:53:31.723759 5184 scope.go:117] "RemoveContainer" containerID="f6cbf978e8fae74ddc55b5f87b919218867e1763a970c9e745969a7bc7cb1dc7" Mar 12 16:53:31 crc kubenswrapper[5184]: I0312 16:53:31.748335 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cx6mk" Mar 12 16:53:31 crc kubenswrapper[5184]: I0312 16:53:31.757788 5184 scope.go:117] "RemoveContainer" containerID="b6dd89060db2c945f1de1a3aa3a675a7ece2cc13ab6bc77479b805d6987c5048" Mar 12 16:53:31 crc kubenswrapper[5184]: I0312 16:53:31.834476 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-operators-zhkzs" Mar 12 16:53:31 crc kubenswrapper[5184]: I0312 16:53:31.834533 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zhkzs" Mar 12 16:53:31 crc kubenswrapper[5184]: I0312 16:53:31.858738 5184 scope.go:117] "RemoveContainer" containerID="dd843d432e4b2ba3b720b6af22bf377a9531fd17026ea1970fa29aba50e8d535" Mar 12 16:53:31 crc kubenswrapper[5184]: E0312 16:53:31.859245 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd843d432e4b2ba3b720b6af22bf377a9531fd17026ea1970fa29aba50e8d535\": container with ID starting with dd843d432e4b2ba3b720b6af22bf377a9531fd17026ea1970fa29aba50e8d535 not found: ID does not exist" 
containerID="dd843d432e4b2ba3b720b6af22bf377a9531fd17026ea1970fa29aba50e8d535" Mar 12 16:53:31 crc kubenswrapper[5184]: I0312 16:53:31.859285 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd843d432e4b2ba3b720b6af22bf377a9531fd17026ea1970fa29aba50e8d535"} err="failed to get container status \"dd843d432e4b2ba3b720b6af22bf377a9531fd17026ea1970fa29aba50e8d535\": rpc error: code = NotFound desc = could not find container \"dd843d432e4b2ba3b720b6af22bf377a9531fd17026ea1970fa29aba50e8d535\": container with ID starting with dd843d432e4b2ba3b720b6af22bf377a9531fd17026ea1970fa29aba50e8d535 not found: ID does not exist" Mar 12 16:53:31 crc kubenswrapper[5184]: I0312 16:53:31.859322 5184 scope.go:117] "RemoveContainer" containerID="f6cbf978e8fae74ddc55b5f87b919218867e1763a970c9e745969a7bc7cb1dc7" Mar 12 16:53:31 crc kubenswrapper[5184]: E0312 16:53:31.859646 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6cbf978e8fae74ddc55b5f87b919218867e1763a970c9e745969a7bc7cb1dc7\": container with ID starting with f6cbf978e8fae74ddc55b5f87b919218867e1763a970c9e745969a7bc7cb1dc7 not found: ID does not exist" containerID="f6cbf978e8fae74ddc55b5f87b919218867e1763a970c9e745969a7bc7cb1dc7" Mar 12 16:53:31 crc kubenswrapper[5184]: I0312 16:53:31.859689 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6cbf978e8fae74ddc55b5f87b919218867e1763a970c9e745969a7bc7cb1dc7"} err="failed to get container status \"f6cbf978e8fae74ddc55b5f87b919218867e1763a970c9e745969a7bc7cb1dc7\": rpc error: code = NotFound desc = could not find container \"f6cbf978e8fae74ddc55b5f87b919218867e1763a970c9e745969a7bc7cb1dc7\": container with ID starting with f6cbf978e8fae74ddc55b5f87b919218867e1763a970c9e745969a7bc7cb1dc7 not found: ID does not exist" Mar 12 16:53:31 crc kubenswrapper[5184]: I0312 16:53:31.859726 5184 scope.go:117] 
"RemoveContainer" containerID="b6dd89060db2c945f1de1a3aa3a675a7ece2cc13ab6bc77479b805d6987c5048" Mar 12 16:53:31 crc kubenswrapper[5184]: E0312 16:53:31.859996 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6dd89060db2c945f1de1a3aa3a675a7ece2cc13ab6bc77479b805d6987c5048\": container with ID starting with b6dd89060db2c945f1de1a3aa3a675a7ece2cc13ab6bc77479b805d6987c5048 not found: ID does not exist" containerID="b6dd89060db2c945f1de1a3aa3a675a7ece2cc13ab6bc77479b805d6987c5048" Mar 12 16:53:31 crc kubenswrapper[5184]: I0312 16:53:31.860022 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6dd89060db2c945f1de1a3aa3a675a7ece2cc13ab6bc77479b805d6987c5048"} err="failed to get container status \"b6dd89060db2c945f1de1a3aa3a675a7ece2cc13ab6bc77479b805d6987c5048\": rpc error: code = NotFound desc = could not find container \"b6dd89060db2c945f1de1a3aa3a675a7ece2cc13ab6bc77479b805d6987c5048\": container with ID starting with b6dd89060db2c945f1de1a3aa3a675a7ece2cc13ab6bc77479b805d6987c5048 not found: ID does not exist" Mar 12 16:53:31 crc kubenswrapper[5184]: I0312 16:53:31.881908 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zhkzs" Mar 12 16:53:31 crc kubenswrapper[5184]: I0312 16:53:31.911822 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vqqp2"] Mar 12 16:53:31 crc kubenswrapper[5184]: I0312 16:53:31.914340 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vqqp2"] Mar 12 16:53:32 crc kubenswrapper[5184]: I0312 16:53:32.119827 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-phh8l" Mar 12 16:53:32 crc kubenswrapper[5184]: I0312 16:53:32.202203 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57daa144-a296-461d-8a95-4a0266b3a6b8-catalog-content\") pod \"57daa144-a296-461d-8a95-4a0266b3a6b8\" (UID: \"57daa144-a296-461d-8a95-4a0266b3a6b8\") " Mar 12 16:53:32 crc kubenswrapper[5184]: I0312 16:53:32.202263 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtz7r\" (UniqueName: \"kubernetes.io/projected/57daa144-a296-461d-8a95-4a0266b3a6b8-kube-api-access-mtz7r\") pod \"57daa144-a296-461d-8a95-4a0266b3a6b8\" (UID: \"57daa144-a296-461d-8a95-4a0266b3a6b8\") " Mar 12 16:53:32 crc kubenswrapper[5184]: I0312 16:53:32.202290 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57daa144-a296-461d-8a95-4a0266b3a6b8-utilities\") pod \"57daa144-a296-461d-8a95-4a0266b3a6b8\" (UID: \"57daa144-a296-461d-8a95-4a0266b3a6b8\") " Mar 12 16:53:32 crc kubenswrapper[5184]: I0312 16:53:32.203799 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57daa144-a296-461d-8a95-4a0266b3a6b8-utilities" (OuterVolumeSpecName: "utilities") pod "57daa144-a296-461d-8a95-4a0266b3a6b8" (UID: "57daa144-a296-461d-8a95-4a0266b3a6b8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:53:32 crc kubenswrapper[5184]: I0312 16:53:32.209077 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57daa144-a296-461d-8a95-4a0266b3a6b8-kube-api-access-mtz7r" (OuterVolumeSpecName: "kube-api-access-mtz7r") pod "57daa144-a296-461d-8a95-4a0266b3a6b8" (UID: "57daa144-a296-461d-8a95-4a0266b3a6b8"). InnerVolumeSpecName "kube-api-access-mtz7r". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:53:32 crc kubenswrapper[5184]: I0312 16:53:32.249248 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57daa144-a296-461d-8a95-4a0266b3a6b8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57daa144-a296-461d-8a95-4a0266b3a6b8" (UID: "57daa144-a296-461d-8a95-4a0266b3a6b8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:53:32 crc kubenswrapper[5184]: I0312 16:53:32.303304 5184 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57daa144-a296-461d-8a95-4a0266b3a6b8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 16:53:32 crc kubenswrapper[5184]: I0312 16:53:32.303338 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mtz7r\" (UniqueName: \"kubernetes.io/projected/57daa144-a296-461d-8a95-4a0266b3a6b8-kube-api-access-mtz7r\") on node \"crc\" DevicePath \"\"" Mar 12 16:53:32 crc kubenswrapper[5184]: I0312 16:53:32.303349 5184 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57daa144-a296-461d-8a95-4a0266b3a6b8-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 16:53:32 crc kubenswrapper[5184]: I0312 16:53:32.416210 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4840f833-3dce-444b-8cad-3a7374af30e7" path="/var/lib/kubelet/pods/4840f833-3dce-444b-8cad-3a7374af30e7/volumes" Mar 12 16:53:32 crc kubenswrapper[5184]: I0312 16:53:32.417482 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d91a31bb-cc85-4866-bb81-a3e9b0cc9362" path="/var/lib/kubelet/pods/d91a31bb-cc85-4866-bb81-a3e9b0cc9362/volumes" Mar 12 16:53:32 crc kubenswrapper[5184]: I0312 16:53:32.708625 5184 generic.go:358] "Generic (PLEG): container finished" podID="57daa144-a296-461d-8a95-4a0266b3a6b8" 
containerID="3c7401f47ea634a2cbc65b1ac6e32f619437d929de3b42d2ad2273d7f229dc7c" exitCode=0 Mar 12 16:53:32 crc kubenswrapper[5184]: I0312 16:53:32.708738 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-phh8l" event={"ID":"57daa144-a296-461d-8a95-4a0266b3a6b8","Type":"ContainerDied","Data":"3c7401f47ea634a2cbc65b1ac6e32f619437d929de3b42d2ad2273d7f229dc7c"} Mar 12 16:53:32 crc kubenswrapper[5184]: I0312 16:53:32.709244 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-phh8l" event={"ID":"57daa144-a296-461d-8a95-4a0266b3a6b8","Type":"ContainerDied","Data":"a6c957889346f91943a60fd0888203d11ea4dcf7ca80b7eb4f7d22f3d9881ceb"} Mar 12 16:53:32 crc kubenswrapper[5184]: I0312 16:53:32.708763 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-phh8l" Mar 12 16:53:32 crc kubenswrapper[5184]: I0312 16:53:32.709281 5184 scope.go:117] "RemoveContainer" containerID="3c7401f47ea634a2cbc65b1ac6e32f619437d929de3b42d2ad2273d7f229dc7c" Mar 12 16:53:32 crc kubenswrapper[5184]: I0312 16:53:32.738783 5184 scope.go:117] "RemoveContainer" containerID="5c44083e9d2a4325811209495247ede4eefeb55de43964bd2bdabd759d650df9" Mar 12 16:53:32 crc kubenswrapper[5184]: I0312 16:53:32.744865 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-phh8l"] Mar 12 16:53:32 crc kubenswrapper[5184]: I0312 16:53:32.759327 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-phh8l"] Mar 12 16:53:32 crc kubenswrapper[5184]: I0312 16:53:32.768077 5184 scope.go:117] "RemoveContainer" containerID="8071491e4eee92723c79f3374fd36c2457c9a8699e55e82941211c6862d697d4" Mar 12 16:53:32 crc kubenswrapper[5184]: I0312 16:53:32.783505 5184 scope.go:117] "RemoveContainer" containerID="3c7401f47ea634a2cbc65b1ac6e32f619437d929de3b42d2ad2273d7f229dc7c" Mar 12 
16:53:32 crc kubenswrapper[5184]: I0312 16:53:32.784302 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zhkzs"
Mar 12 16:53:32 crc kubenswrapper[5184]: E0312 16:53:32.785859 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c7401f47ea634a2cbc65b1ac6e32f619437d929de3b42d2ad2273d7f229dc7c\": container with ID starting with 3c7401f47ea634a2cbc65b1ac6e32f619437d929de3b42d2ad2273d7f229dc7c not found: ID does not exist" containerID="3c7401f47ea634a2cbc65b1ac6e32f619437d929de3b42d2ad2273d7f229dc7c"
Mar 12 16:53:32 crc kubenswrapper[5184]: I0312 16:53:32.785912 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c7401f47ea634a2cbc65b1ac6e32f619437d929de3b42d2ad2273d7f229dc7c"} err="failed to get container status \"3c7401f47ea634a2cbc65b1ac6e32f619437d929de3b42d2ad2273d7f229dc7c\": rpc error: code = NotFound desc = could not find container \"3c7401f47ea634a2cbc65b1ac6e32f619437d929de3b42d2ad2273d7f229dc7c\": container with ID starting with 3c7401f47ea634a2cbc65b1ac6e32f619437d929de3b42d2ad2273d7f229dc7c not found: ID does not exist"
Mar 12 16:53:32 crc kubenswrapper[5184]: I0312 16:53:32.785943 5184 scope.go:117] "RemoveContainer" containerID="5c44083e9d2a4325811209495247ede4eefeb55de43964bd2bdabd759d650df9"
Mar 12 16:53:32 crc kubenswrapper[5184]: E0312 16:53:32.786898 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c44083e9d2a4325811209495247ede4eefeb55de43964bd2bdabd759d650df9\": container with ID starting with 5c44083e9d2a4325811209495247ede4eefeb55de43964bd2bdabd759d650df9 not found: ID does not exist" containerID="5c44083e9d2a4325811209495247ede4eefeb55de43964bd2bdabd759d650df9"
Mar 12 16:53:32 crc kubenswrapper[5184]: I0312 16:53:32.786928 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c44083e9d2a4325811209495247ede4eefeb55de43964bd2bdabd759d650df9"} err="failed to get container status \"5c44083e9d2a4325811209495247ede4eefeb55de43964bd2bdabd759d650df9\": rpc error: code = NotFound desc = could not find container \"5c44083e9d2a4325811209495247ede4eefeb55de43964bd2bdabd759d650df9\": container with ID starting with 5c44083e9d2a4325811209495247ede4eefeb55de43964bd2bdabd759d650df9 not found: ID does not exist"
Mar 12 16:53:32 crc kubenswrapper[5184]: I0312 16:53:32.787042 5184 scope.go:117] "RemoveContainer" containerID="8071491e4eee92723c79f3374fd36c2457c9a8699e55e82941211c6862d697d4"
Mar 12 16:53:32 crc kubenswrapper[5184]: E0312 16:53:32.791816 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8071491e4eee92723c79f3374fd36c2457c9a8699e55e82941211c6862d697d4\": container with ID starting with 8071491e4eee92723c79f3374fd36c2457c9a8699e55e82941211c6862d697d4 not found: ID does not exist" containerID="8071491e4eee92723c79f3374fd36c2457c9a8699e55e82941211c6862d697d4"
Mar 12 16:53:32 crc kubenswrapper[5184]: I0312 16:53:32.791871 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8071491e4eee92723c79f3374fd36c2457c9a8699e55e82941211c6862d697d4"} err="failed to get container status \"8071491e4eee92723c79f3374fd36c2457c9a8699e55e82941211c6862d697d4\": rpc error: code = NotFound desc = could not find container \"8071491e4eee92723c79f3374fd36c2457c9a8699e55e82941211c6862d697d4\": container with ID starting with 8071491e4eee92723c79f3374fd36c2457c9a8699e55e82941211c6862d697d4 not found: ID does not exist"
Mar 12 16:53:32 crc kubenswrapper[5184]: I0312 16:53:32.905316 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zt24v"]
Mar 12 16:53:32 crc kubenswrapper[5184]: I0312 16:53:32.905686 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zt24v" podUID="4ed8bc0c-5406-4893-9949-1342c8eb210c" containerName="registry-server" containerID="cri-o://ef1c22c19f8e78edd161093e1ea6b4c0a067ca0e7a22e3e00b845f632911bfbc" gracePeriod=2
Mar 12 16:53:33 crc kubenswrapper[5184]: I0312 16:53:33.383364 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zt24v"
Mar 12 16:53:33 crc kubenswrapper[5184]: I0312 16:53:33.541618 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ed8bc0c-5406-4893-9949-1342c8eb210c-utilities\") pod \"4ed8bc0c-5406-4893-9949-1342c8eb210c\" (UID: \"4ed8bc0c-5406-4893-9949-1342c8eb210c\") "
Mar 12 16:53:33 crc kubenswrapper[5184]: I0312 16:53:33.541730 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7dpt\" (UniqueName: \"kubernetes.io/projected/4ed8bc0c-5406-4893-9949-1342c8eb210c-kube-api-access-n7dpt\") pod \"4ed8bc0c-5406-4893-9949-1342c8eb210c\" (UID: \"4ed8bc0c-5406-4893-9949-1342c8eb210c\") "
Mar 12 16:53:33 crc kubenswrapper[5184]: I0312 16:53:33.541863 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ed8bc0c-5406-4893-9949-1342c8eb210c-catalog-content\") pod \"4ed8bc0c-5406-4893-9949-1342c8eb210c\" (UID: \"4ed8bc0c-5406-4893-9949-1342c8eb210c\") "
Mar 12 16:53:33 crc kubenswrapper[5184]: I0312 16:53:33.542397 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ed8bc0c-5406-4893-9949-1342c8eb210c-utilities" (OuterVolumeSpecName: "utilities") pod "4ed8bc0c-5406-4893-9949-1342c8eb210c" (UID: "4ed8bc0c-5406-4893-9949-1342c8eb210c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 12 16:53:33 crc kubenswrapper[5184]: I0312 16:53:33.549351 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ed8bc0c-5406-4893-9949-1342c8eb210c-kube-api-access-n7dpt" (OuterVolumeSpecName: "kube-api-access-n7dpt") pod "4ed8bc0c-5406-4893-9949-1342c8eb210c" (UID: "4ed8bc0c-5406-4893-9949-1342c8eb210c"). InnerVolumeSpecName "kube-api-access-n7dpt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 16:53:33 crc kubenswrapper[5184]: I0312 16:53:33.572853 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ed8bc0c-5406-4893-9949-1342c8eb210c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4ed8bc0c-5406-4893-9949-1342c8eb210c" (UID: "4ed8bc0c-5406-4893-9949-1342c8eb210c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 12 16:53:33 crc kubenswrapper[5184]: I0312 16:53:33.648265 5184 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ed8bc0c-5406-4893-9949-1342c8eb210c-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 12 16:53:33 crc kubenswrapper[5184]: I0312 16:53:33.648321 5184 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ed8bc0c-5406-4893-9949-1342c8eb210c-utilities\") on node \"crc\" DevicePath \"\""
Mar 12 16:53:33 crc kubenswrapper[5184]: I0312 16:53:33.648339 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n7dpt\" (UniqueName: \"kubernetes.io/projected/4ed8bc0c-5406-4893-9949-1342c8eb210c-kube-api-access-n7dpt\") on node \"crc\" DevicePath \"\""
Mar 12 16:53:33 crc kubenswrapper[5184]: I0312 16:53:33.719403 5184 generic.go:358] "Generic (PLEG): container finished" podID="4ed8bc0c-5406-4893-9949-1342c8eb210c" containerID="ef1c22c19f8e78edd161093e1ea6b4c0a067ca0e7a22e3e00b845f632911bfbc" exitCode=0
Mar 12 16:53:33 crc kubenswrapper[5184]: I0312 16:53:33.719500 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zt24v" event={"ID":"4ed8bc0c-5406-4893-9949-1342c8eb210c","Type":"ContainerDied","Data":"ef1c22c19f8e78edd161093e1ea6b4c0a067ca0e7a22e3e00b845f632911bfbc"}
Mar 12 16:53:33 crc kubenswrapper[5184]: I0312 16:53:33.719546 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zt24v" event={"ID":"4ed8bc0c-5406-4893-9949-1342c8eb210c","Type":"ContainerDied","Data":"1ba617cd16c7237854758e169428ac6bf1949387a43cc08a6b9f2694ce8716f7"}
Mar 12 16:53:33 crc kubenswrapper[5184]: I0312 16:53:33.719566 5184 scope.go:117] "RemoveContainer" containerID="ef1c22c19f8e78edd161093e1ea6b4c0a067ca0e7a22e3e00b845f632911bfbc"
Mar 12 16:53:33 crc kubenswrapper[5184]: I0312 16:53:33.719594 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zt24v"
Mar 12 16:53:33 crc kubenswrapper[5184]: I0312 16:53:33.736076 5184 scope.go:117] "RemoveContainer" containerID="2697e0eb39fd6ca4e03ba55839aff521169c36c9d6bfbaf1a51667c594003760"
Mar 12 16:53:33 crc kubenswrapper[5184]: I0312 16:53:33.751508 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zt24v"]
Mar 12 16:53:33 crc kubenswrapper[5184]: I0312 16:53:33.753749 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zt24v"]
Mar 12 16:53:33 crc kubenswrapper[5184]: I0312 16:53:33.775244 5184 scope.go:117] "RemoveContainer" containerID="9aa194baa8ab62dfc9c80df2abf156c1ba66b50d9469d7736c07bb95e454c737"
Mar 12 16:53:33 crc kubenswrapper[5184]: I0312 16:53:33.800115 5184 scope.go:117] "RemoveContainer" containerID="ef1c22c19f8e78edd161093e1ea6b4c0a067ca0e7a22e3e00b845f632911bfbc"
Mar 12 16:53:33 crc kubenswrapper[5184]: E0312 16:53:33.800545 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef1c22c19f8e78edd161093e1ea6b4c0a067ca0e7a22e3e00b845f632911bfbc\": container with ID starting with ef1c22c19f8e78edd161093e1ea6b4c0a067ca0e7a22e3e00b845f632911bfbc not found: ID does not exist" containerID="ef1c22c19f8e78edd161093e1ea6b4c0a067ca0e7a22e3e00b845f632911bfbc"
Mar 12 16:53:33 crc kubenswrapper[5184]: I0312 16:53:33.800580 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef1c22c19f8e78edd161093e1ea6b4c0a067ca0e7a22e3e00b845f632911bfbc"} err="failed to get container status \"ef1c22c19f8e78edd161093e1ea6b4c0a067ca0e7a22e3e00b845f632911bfbc\": rpc error: code = NotFound desc = could not find container \"ef1c22c19f8e78edd161093e1ea6b4c0a067ca0e7a22e3e00b845f632911bfbc\": container with ID starting with ef1c22c19f8e78edd161093e1ea6b4c0a067ca0e7a22e3e00b845f632911bfbc not found: ID does not exist"
Mar 12 16:53:33 crc kubenswrapper[5184]: I0312 16:53:33.800601 5184 scope.go:117] "RemoveContainer" containerID="2697e0eb39fd6ca4e03ba55839aff521169c36c9d6bfbaf1a51667c594003760"
Mar 12 16:53:33 crc kubenswrapper[5184]: E0312 16:53:33.800871 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2697e0eb39fd6ca4e03ba55839aff521169c36c9d6bfbaf1a51667c594003760\": container with ID starting with 2697e0eb39fd6ca4e03ba55839aff521169c36c9d6bfbaf1a51667c594003760 not found: ID does not exist" containerID="2697e0eb39fd6ca4e03ba55839aff521169c36c9d6bfbaf1a51667c594003760"
Mar 12 16:53:33 crc kubenswrapper[5184]: I0312 16:53:33.800891 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2697e0eb39fd6ca4e03ba55839aff521169c36c9d6bfbaf1a51667c594003760"} err="failed to get container status \"2697e0eb39fd6ca4e03ba55839aff521169c36c9d6bfbaf1a51667c594003760\": rpc error: code = NotFound desc = could not find container \"2697e0eb39fd6ca4e03ba55839aff521169c36c9d6bfbaf1a51667c594003760\": container with ID starting with 2697e0eb39fd6ca4e03ba55839aff521169c36c9d6bfbaf1a51667c594003760 not found: ID does not exist"
Mar 12 16:53:33 crc kubenswrapper[5184]: I0312 16:53:33.800903 5184 scope.go:117] "RemoveContainer" containerID="9aa194baa8ab62dfc9c80df2abf156c1ba66b50d9469d7736c07bb95e454c737"
Mar 12 16:53:33 crc kubenswrapper[5184]: E0312 16:53:33.801140 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9aa194baa8ab62dfc9c80df2abf156c1ba66b50d9469d7736c07bb95e454c737\": container with ID starting with 9aa194baa8ab62dfc9c80df2abf156c1ba66b50d9469d7736c07bb95e454c737 not found: ID does not exist" containerID="9aa194baa8ab62dfc9c80df2abf156c1ba66b50d9469d7736c07bb95e454c737"
Mar 12 16:53:33 crc kubenswrapper[5184]: I0312 16:53:33.801173 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9aa194baa8ab62dfc9c80df2abf156c1ba66b50d9469d7736c07bb95e454c737"} err="failed to get container status \"9aa194baa8ab62dfc9c80df2abf156c1ba66b50d9469d7736c07bb95e454c737\": rpc error: code = NotFound desc = could not find container \"9aa194baa8ab62dfc9c80df2abf156c1ba66b50d9469d7736c07bb95e454c737\": container with ID starting with 9aa194baa8ab62dfc9c80df2abf156c1ba66b50d9469d7736c07bb95e454c737 not found: ID does not exist"
Mar 12 16:53:34 crc kubenswrapper[5184]: I0312 16:53:34.405826 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ed8bc0c-5406-4893-9949-1342c8eb210c" path="/var/lib/kubelet/pods/4ed8bc0c-5406-4893-9949-1342c8eb210c/volumes"
Mar 12 16:53:34 crc kubenswrapper[5184]: I0312 16:53:34.407151 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57daa144-a296-461d-8a95-4a0266b3a6b8" path="/var/lib/kubelet/pods/57daa144-a296-461d-8a95-4a0266b3a6b8/volumes"
Mar 12 16:53:35 crc kubenswrapper[5184]: I0312 16:53:35.504608 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zhkzs"]
Mar 12 16:53:35 crc kubenswrapper[5184]: I0312 16:53:35.505238 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zhkzs" podUID="f2536f10-73e6-480a-9abb-2fd7a7e1a235" containerName="registry-server" containerID="cri-o://c023a41054bca562b40b232e212b3e2f72c9d1715cddb0ff41dd961f845adabe" gracePeriod=2
Mar 12 16:53:36 crc kubenswrapper[5184]: I0312 16:53:36.604369 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-565b84484f-wf8fr"]
Mar 12 16:53:36 crc kubenswrapper[5184]: I0312 16:53:36.604785 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-565b84484f-wf8fr" podUID="3e775526-099c-4134-b128-2af393f0b3e9" containerName="controller-manager" containerID="cri-o://4ff698d56096c9c9fa5afc5ae18ccab705d0883dfb57f9151077948b331f1f0b" gracePeriod=30
Mar 12 16:53:36 crc kubenswrapper[5184]: I0312 16:53:36.633027 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64dd76cf5b-twbhz"]
Mar 12 16:53:36 crc kubenswrapper[5184]: I0312 16:53:36.633328 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-64dd76cf5b-twbhz" podUID="2cb07ce8-1cf0-4b3f-a935-64190b256410" containerName="route-controller-manager" containerID="cri-o://bf61f08ec042f9f4129fd00e4e72b547e103414a518efac4ee14962a8f47aad3" gracePeriod=30
Mar 12 16:53:36 crc kubenswrapper[5184]: I0312 16:53:36.643350 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-12-crc"]
Mar 12 16:53:36 crc kubenswrapper[5184]: I0312 16:53:36.643918 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="57daa144-a296-461d-8a95-4a0266b3a6b8" containerName="extract-utilities"
Mar 12 16:53:36 crc kubenswrapper[5184]: I0312 16:53:36.643938 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="57daa144-a296-461d-8a95-4a0266b3a6b8" containerName="extract-utilities"
Mar 12 16:53:36 crc kubenswrapper[5184]: I0312 16:53:36.643951 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ed8bc0c-5406-4893-9949-1342c8eb210c" containerName="extract-utilities"
Mar 12 16:53:36 crc kubenswrapper[5184]: I0312 16:53:36.643959 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ed8bc0c-5406-4893-9949-1342c8eb210c" containerName="extract-utilities"
Mar 12 16:53:36 crc kubenswrapper[5184]: I0312 16:53:36.643967 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="57daa144-a296-461d-8a95-4a0266b3a6b8" containerName="registry-server"
Mar 12 16:53:36 crc kubenswrapper[5184]: I0312 16:53:36.643973 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="57daa144-a296-461d-8a95-4a0266b3a6b8" containerName="registry-server"
Mar 12 16:53:36 crc kubenswrapper[5184]: I0312 16:53:36.643986 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4840f833-3dce-444b-8cad-3a7374af30e7" containerName="kube-multus-additional-cni-plugins"
Mar 12 16:53:36 crc kubenswrapper[5184]: I0312 16:53:36.643993 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="4840f833-3dce-444b-8cad-3a7374af30e7" containerName="kube-multus-additional-cni-plugins"
Mar 12 16:53:36 crc kubenswrapper[5184]: I0312 16:53:36.644001 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d91a31bb-cc85-4866-bb81-a3e9b0cc9362" containerName="registry-server"
Mar 12 16:53:36 crc kubenswrapper[5184]: I0312 16:53:36.644009 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="d91a31bb-cc85-4866-bb81-a3e9b0cc9362" containerName="registry-server"
Mar 12 16:53:36 crc kubenswrapper[5184]: I0312 16:53:36.644019 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="57daa144-a296-461d-8a95-4a0266b3a6b8" containerName="extract-content"
Mar 12 16:53:36 crc kubenswrapper[5184]: I0312 16:53:36.644026 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="57daa144-a296-461d-8a95-4a0266b3a6b8" containerName="extract-content"
Mar 12 16:53:36 crc kubenswrapper[5184]: I0312 16:53:36.644033 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d91a31bb-cc85-4866-bb81-a3e9b0cc9362" containerName="extract-content"
Mar 12 16:53:36 crc kubenswrapper[5184]: I0312 16:53:36.644040 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="d91a31bb-cc85-4866-bb81-a3e9b0cc9362" containerName="extract-content"
Mar 12 16:53:36 crc kubenswrapper[5184]: I0312 16:53:36.644048 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ed8bc0c-5406-4893-9949-1342c8eb210c" containerName="extract-content"
Mar 12 16:53:36 crc kubenswrapper[5184]: I0312 16:53:36.644055 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ed8bc0c-5406-4893-9949-1342c8eb210c" containerName="extract-content"
Mar 12 16:53:36 crc kubenswrapper[5184]: I0312 16:53:36.644073 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ed8bc0c-5406-4893-9949-1342c8eb210c" containerName="registry-server"
Mar 12 16:53:36 crc kubenswrapper[5184]: I0312 16:53:36.644082 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ed8bc0c-5406-4893-9949-1342c8eb210c" containerName="registry-server"
Mar 12 16:53:36 crc kubenswrapper[5184]: I0312 16:53:36.644098 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d91a31bb-cc85-4866-bb81-a3e9b0cc9362" containerName="extract-utilities"
Mar 12 16:53:36 crc kubenswrapper[5184]: I0312 16:53:36.644105 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="d91a31bb-cc85-4866-bb81-a3e9b0cc9362" containerName="extract-utilities"
Mar 12 16:53:36 crc kubenswrapper[5184]: I0312 16:53:36.644216 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="4840f833-3dce-444b-8cad-3a7374af30e7" containerName="kube-multus-additional-cni-plugins"
Mar 12 16:53:36 crc kubenswrapper[5184]: I0312 16:53:36.644228 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="57daa144-a296-461d-8a95-4a0266b3a6b8" containerName="registry-server"
Mar 12 16:53:36 crc kubenswrapper[5184]: I0312 16:53:36.644234 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="d91a31bb-cc85-4866-bb81-a3e9b0cc9362" containerName="registry-server"
Mar 12 16:53:36 crc kubenswrapper[5184]: I0312 16:53:36.644247 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ed8bc0c-5406-4893-9949-1342c8eb210c" containerName="registry-server"
Mar 12 16:53:36 crc kubenswrapper[5184]: I0312 16:53:36.687638 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-12-crc"]
Mar 12 16:53:36 crc kubenswrapper[5184]: I0312 16:53:36.687785 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-12-crc"
Mar 12 16:53:36 crc kubenswrapper[5184]: I0312 16:53:36.690349 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-apiserver\"/\"installer-sa-dockercfg-bqqnb\""
Mar 12 16:53:36 crc kubenswrapper[5184]: I0312 16:53:36.691648 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-apiserver\"/\"kube-root-ca.crt\""
Mar 12 16:53:36 crc kubenswrapper[5184]: I0312 16:53:36.749699 5184 generic.go:358] "Generic (PLEG): container finished" podID="f2536f10-73e6-480a-9abb-2fd7a7e1a235" containerID="c023a41054bca562b40b232e212b3e2f72c9d1715cddb0ff41dd961f845adabe" exitCode=0
Mar 12 16:53:36 crc kubenswrapper[5184]: I0312 16:53:36.750067 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zhkzs" event={"ID":"f2536f10-73e6-480a-9abb-2fd7a7e1a235","Type":"ContainerDied","Data":"c023a41054bca562b40b232e212b3e2f72c9d1715cddb0ff41dd961f845adabe"}
Mar 12 16:53:36 crc kubenswrapper[5184]: I0312 16:53:36.788535 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1aafc06d-e1d9-4e41-a45d-6fece1c21edb-kubelet-dir\") pod \"revision-pruner-12-crc\" (UID: \"1aafc06d-e1d9-4e41-a45d-6fece1c21edb\") " pod="openshift-kube-apiserver/revision-pruner-12-crc"
Mar 12 16:53:36 crc kubenswrapper[5184]: I0312 16:53:36.788624 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1aafc06d-e1d9-4e41-a45d-6fece1c21edb-kube-api-access\") pod \"revision-pruner-12-crc\" (UID: \"1aafc06d-e1d9-4e41-a45d-6fece1c21edb\") " pod="openshift-kube-apiserver/revision-pruner-12-crc"
Mar 12 16:53:36 crc kubenswrapper[5184]: I0312 16:53:36.889612 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1aafc06d-e1d9-4e41-a45d-6fece1c21edb-kube-api-access\") pod \"revision-pruner-12-crc\" (UID: \"1aafc06d-e1d9-4e41-a45d-6fece1c21edb\") " pod="openshift-kube-apiserver/revision-pruner-12-crc"
Mar 12 16:53:36 crc kubenswrapper[5184]: I0312 16:53:36.889696 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1aafc06d-e1d9-4e41-a45d-6fece1c21edb-kubelet-dir\") pod \"revision-pruner-12-crc\" (UID: \"1aafc06d-e1d9-4e41-a45d-6fece1c21edb\") " pod="openshift-kube-apiserver/revision-pruner-12-crc"
Mar 12 16:53:36 crc kubenswrapper[5184]: I0312 16:53:36.889811 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1aafc06d-e1d9-4e41-a45d-6fece1c21edb-kubelet-dir\") pod \"revision-pruner-12-crc\" (UID: \"1aafc06d-e1d9-4e41-a45d-6fece1c21edb\") " pod="openshift-kube-apiserver/revision-pruner-12-crc"
Mar 12 16:53:36 crc kubenswrapper[5184]: I0312 16:53:36.908886 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1aafc06d-e1d9-4e41-a45d-6fece1c21edb-kube-api-access\") pod \"revision-pruner-12-crc\" (UID: \"1aafc06d-e1d9-4e41-a45d-6fece1c21edb\") " pod="openshift-kube-apiserver/revision-pruner-12-crc"
Mar 12 16:53:37 crc kubenswrapper[5184]: I0312 16:53:37.021071 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-12-crc"
Mar 12 16:53:37 crc kubenswrapper[5184]: I0312 16:53:37.411264 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-12-crc"]
Mar 12 16:53:37 crc kubenswrapper[5184]: W0312 16:53:37.424566 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1aafc06d_e1d9_4e41_a45d_6fece1c21edb.slice/crio-43db61a374dfa763cd38e2a875a743a3939a02cc5956f5a51480e08d2e065abd WatchSource:0}: Error finding container 43db61a374dfa763cd38e2a875a743a3939a02cc5956f5a51480e08d2e065abd: Status 404 returned error can't find the container with id 43db61a374dfa763cd38e2a875a743a3939a02cc5956f5a51480e08d2e065abd
Mar 12 16:53:37 crc kubenswrapper[5184]: I0312 16:53:37.596376 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zhkzs"
Mar 12 16:53:37 crc kubenswrapper[5184]: I0312 16:53:37.653932 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-64dd76cf5b-twbhz"
Mar 12 16:53:37 crc kubenswrapper[5184]: I0312 16:53:37.679986 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f8f9698-g8bfc"]
Mar 12 16:53:37 crc kubenswrapper[5184]: I0312 16:53:37.680620 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2cb07ce8-1cf0-4b3f-a935-64190b256410" containerName="route-controller-manager"
Mar 12 16:53:37 crc kubenswrapper[5184]: I0312 16:53:37.680634 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cb07ce8-1cf0-4b3f-a935-64190b256410" containerName="route-controller-manager"
Mar 12 16:53:37 crc kubenswrapper[5184]: I0312 16:53:37.680647 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f2536f10-73e6-480a-9abb-2fd7a7e1a235" containerName="extract-utilities"
Mar 12 16:53:37 crc kubenswrapper[5184]: I0312 16:53:37.680653 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2536f10-73e6-480a-9abb-2fd7a7e1a235" containerName="extract-utilities"
Mar 12 16:53:37 crc kubenswrapper[5184]: I0312 16:53:37.680663 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f2536f10-73e6-480a-9abb-2fd7a7e1a235" containerName="registry-server"
Mar 12 16:53:37 crc kubenswrapper[5184]: I0312 16:53:37.680669 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2536f10-73e6-480a-9abb-2fd7a7e1a235" containerName="registry-server"
Mar 12 16:53:37 crc kubenswrapper[5184]: I0312 16:53:37.680692 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f2536f10-73e6-480a-9abb-2fd7a7e1a235" containerName="extract-content"
Mar 12 16:53:37 crc kubenswrapper[5184]: I0312 16:53:37.680698 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2536f10-73e6-480a-9abb-2fd7a7e1a235" containerName="extract-content"
Mar 12 16:53:37 crc kubenswrapper[5184]: I0312 16:53:37.680792 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="f2536f10-73e6-480a-9abb-2fd7a7e1a235" containerName="registry-server"
Mar 12 16:53:37 crc kubenswrapper[5184]: I0312 16:53:37.680813 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="2cb07ce8-1cf0-4b3f-a935-64190b256410" containerName="route-controller-manager"
Mar 12 16:53:37 crc kubenswrapper[5184]: I0312 16:53:37.689328 5184 ???:1] "http: TLS handshake error from 192.168.126.11:56250: no serving certificate available for the kubelet"
Mar 12 16:53:37 crc kubenswrapper[5184]: I0312 16:53:37.699784 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2536f10-73e6-480a-9abb-2fd7a7e1a235-utilities\") pod \"f2536f10-73e6-480a-9abb-2fd7a7e1a235\" (UID: \"f2536f10-73e6-480a-9abb-2fd7a7e1a235\") "
Mar 12 16:53:37 crc kubenswrapper[5184]: I0312 16:53:37.699832 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cb07ce8-1cf0-4b3f-a935-64190b256410-serving-cert\") pod \"2cb07ce8-1cf0-4b3f-a935-64190b256410\" (UID: \"2cb07ce8-1cf0-4b3f-a935-64190b256410\") "
Mar 12 16:53:37 crc kubenswrapper[5184]: I0312 16:53:37.699871 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cb07ce8-1cf0-4b3f-a935-64190b256410-config\") pod \"2cb07ce8-1cf0-4b3f-a935-64190b256410\" (UID: \"2cb07ce8-1cf0-4b3f-a935-64190b256410\") "
Mar 12 16:53:37 crc kubenswrapper[5184]: I0312 16:53:37.699915 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nmkf\" (UniqueName: \"kubernetes.io/projected/f2536f10-73e6-480a-9abb-2fd7a7e1a235-kube-api-access-5nmkf\") pod \"f2536f10-73e6-480a-9abb-2fd7a7e1a235\" (UID: \"f2536f10-73e6-480a-9abb-2fd7a7e1a235\") "
Mar 12 16:53:37 crc kubenswrapper[5184]: I0312 16:53:37.699942 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7fj8\" (UniqueName: \"kubernetes.io/projected/2cb07ce8-1cf0-4b3f-a935-64190b256410-kube-api-access-t7fj8\") pod \"2cb07ce8-1cf0-4b3f-a935-64190b256410\" (UID: \"2cb07ce8-1cf0-4b3f-a935-64190b256410\") "
Mar 12 16:53:37 crc kubenswrapper[5184]: I0312 16:53:37.699969 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2cb07ce8-1cf0-4b3f-a935-64190b256410-client-ca\") pod \"2cb07ce8-1cf0-4b3f-a935-64190b256410\" (UID: \"2cb07ce8-1cf0-4b3f-a935-64190b256410\") "
Mar 12 16:53:37 crc kubenswrapper[5184]: I0312 16:53:37.700030 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2cb07ce8-1cf0-4b3f-a935-64190b256410-tmp\") pod \"2cb07ce8-1cf0-4b3f-a935-64190b256410\" (UID: \"2cb07ce8-1cf0-4b3f-a935-64190b256410\") "
Mar 12 16:53:37 crc kubenswrapper[5184]: I0312 16:53:37.700049 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2536f10-73e6-480a-9abb-2fd7a7e1a235-catalog-content\") pod \"f2536f10-73e6-480a-9abb-2fd7a7e1a235\" (UID: \"f2536f10-73e6-480a-9abb-2fd7a7e1a235\") "
Mar 12 16:53:37 crc kubenswrapper[5184]: I0312 16:53:37.700912 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2536f10-73e6-480a-9abb-2fd7a7e1a235-utilities" (OuterVolumeSpecName: "utilities") pod "f2536f10-73e6-480a-9abb-2fd7a7e1a235" (UID: "f2536f10-73e6-480a-9abb-2fd7a7e1a235"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 12 16:53:37 crc kubenswrapper[5184]: I0312 16:53:37.701119 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cb07ce8-1cf0-4b3f-a935-64190b256410-tmp" (OuterVolumeSpecName: "tmp") pod "2cb07ce8-1cf0-4b3f-a935-64190b256410" (UID: "2cb07ce8-1cf0-4b3f-a935-64190b256410"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 12 16:53:37 crc kubenswrapper[5184]: I0312 16:53:37.701334 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cb07ce8-1cf0-4b3f-a935-64190b256410-client-ca" (OuterVolumeSpecName: "client-ca") pod "2cb07ce8-1cf0-4b3f-a935-64190b256410" (UID: "2cb07ce8-1cf0-4b3f-a935-64190b256410"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 16:53:37 crc kubenswrapper[5184]: I0312 16:53:37.701346 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cb07ce8-1cf0-4b3f-a935-64190b256410-config" (OuterVolumeSpecName: "config") pod "2cb07ce8-1cf0-4b3f-a935-64190b256410" (UID: "2cb07ce8-1cf0-4b3f-a935-64190b256410"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 16:53:37 crc kubenswrapper[5184]: I0312 16:53:37.705811 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cb07ce8-1cf0-4b3f-a935-64190b256410-kube-api-access-t7fj8" (OuterVolumeSpecName: "kube-api-access-t7fj8") pod "2cb07ce8-1cf0-4b3f-a935-64190b256410" (UID: "2cb07ce8-1cf0-4b3f-a935-64190b256410"). InnerVolumeSpecName "kube-api-access-t7fj8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 16:53:37 crc kubenswrapper[5184]: I0312 16:53:37.706248 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2536f10-73e6-480a-9abb-2fd7a7e1a235-kube-api-access-5nmkf" (OuterVolumeSpecName: "kube-api-access-5nmkf") pod "f2536f10-73e6-480a-9abb-2fd7a7e1a235" (UID: "f2536f10-73e6-480a-9abb-2fd7a7e1a235"). InnerVolumeSpecName "kube-api-access-5nmkf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 16:53:37 crc kubenswrapper[5184]: I0312 16:53:37.706489 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cb07ce8-1cf0-4b3f-a935-64190b256410-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2cb07ce8-1cf0-4b3f-a935-64190b256410" (UID: "2cb07ce8-1cf0-4b3f-a935-64190b256410"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 16:53:37 crc kubenswrapper[5184]: I0312 16:53:37.757537 5184 generic.go:358] "Generic (PLEG): container finished" podID="2cb07ce8-1cf0-4b3f-a935-64190b256410" containerID="bf61f08ec042f9f4129fd00e4e72b547e103414a518efac4ee14962a8f47aad3" exitCode=0
Mar 12 16:53:37 crc kubenswrapper[5184]: I0312 16:53:37.759645 5184 generic.go:358] "Generic (PLEG): container finished" podID="3e775526-099c-4134-b128-2af393f0b3e9" containerID="4ff698d56096c9c9fa5afc5ae18ccab705d0883dfb57f9151077948b331f1f0b" exitCode=0
Mar 12 16:53:37 crc kubenswrapper[5184]: I0312 16:53:37.801352 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5nmkf\" (UniqueName: \"kubernetes.io/projected/f2536f10-73e6-480a-9abb-2fd7a7e1a235-kube-api-access-5nmkf\") on node \"crc\" DevicePath \"\""
Mar 12 16:53:37 crc kubenswrapper[5184]: I0312 16:53:37.801400 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t7fj8\" (UniqueName: \"kubernetes.io/projected/2cb07ce8-1cf0-4b3f-a935-64190b256410-kube-api-access-t7fj8\") on node \"crc\" DevicePath \"\""
Mar 12 16:53:37 crc kubenswrapper[5184]: I0312 16:53:37.801411 5184 reconciler_common.go:299] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2cb07ce8-1cf0-4b3f-a935-64190b256410-client-ca\") on node \"crc\" DevicePath \"\""
Mar 12 16:53:37 crc kubenswrapper[5184]: I0312 16:53:37.801422 5184 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2cb07ce8-1cf0-4b3f-a935-64190b256410-tmp\") on node \"crc\" DevicePath \"\""
Mar 12 16:53:37 crc kubenswrapper[5184]: I0312 16:53:37.801431 5184 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2536f10-73e6-480a-9abb-2fd7a7e1a235-utilities\") on node \"crc\" DevicePath \"\""
Mar 12 16:53:37 crc kubenswrapper[5184]: I0312 16:53:37.801439 5184 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cb07ce8-1cf0-4b3f-a935-64190b256410-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 12 16:53:37 crc kubenswrapper[5184]: I0312 16:53:37.801448 5184 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cb07ce8-1cf0-4b3f-a935-64190b256410-config\") on node \"crc\" DevicePath \"\""
Mar 12 16:53:37 crc kubenswrapper[5184]: I0312 16:53:37.965667 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-565b84484f-wf8fr"
Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.003980 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e775526-099c-4134-b128-2af393f0b3e9-config\") pod \"3e775526-099c-4134-b128-2af393f0b3e9\" (UID: \"3e775526-099c-4134-b128-2af393f0b3e9\") "
Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.004269 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3e775526-099c-4134-b128-2af393f0b3e9-client-ca\") pod \"3e775526-099c-4134-b128-2af393f0b3e9\" (UID: \"3e775526-099c-4134-b128-2af393f0b3e9\") "
Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.004396 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3e775526-099c-4134-b128-2af393f0b3e9-tmp\") pod \"3e775526-099c-4134-b128-2af393f0b3e9\" (UID: \"3e775526-099c-4134-b128-2af393f0b3e9\") "
Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.004505 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e775526-099c-4134-b128-2af393f0b3e9-serving-cert\") pod \"3e775526-099c-4134-b128-2af393f0b3e9\" (UID: \"3e775526-099c-4134-b128-2af393f0b3e9\") "
Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.004579 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fbw7\" (UniqueName: \"kubernetes.io/projected/3e775526-099c-4134-b128-2af393f0b3e9-kube-api-access-9fbw7\") pod \"3e775526-099c-4134-b128-2af393f0b3e9\" (UID: \"3e775526-099c-4134-b128-2af393f0b3e9\") "
Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.004661 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3e775526-099c-4134-b128-2af393f0b3e9-proxy-ca-bundles\") pod \"3e775526-099c-4134-b128-2af393f0b3e9\" (UID: \"3e775526-099c-4134-b128-2af393f0b3e9\") "
Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.004707 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e775526-099c-4134-b128-2af393f0b3e9-tmp" (OuterVolumeSpecName: "tmp") pod "3e775526-099c-4134-b128-2af393f0b3e9" (UID: "3e775526-099c-4134-b128-2af393f0b3e9"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.004930 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e775526-099c-4134-b128-2af393f0b3e9-client-ca" (OuterVolumeSpecName: "client-ca") pod "3e775526-099c-4134-b128-2af393f0b3e9" (UID: "3e775526-099c-4134-b128-2af393f0b3e9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.005050 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e775526-099c-4134-b128-2af393f0b3e9-config" (OuterVolumeSpecName: "config") pod "3e775526-099c-4134-b128-2af393f0b3e9" (UID: "3e775526-099c-4134-b128-2af393f0b3e9"). InnerVolumeSpecName "config".
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.005323 5184 reconciler_common.go:299] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3e775526-099c-4134-b128-2af393f0b3e9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.005420 5184 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3e775526-099c-4134-b128-2af393f0b3e9-tmp\") on node \"crc\" DevicePath \"\"" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.005495 5184 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e775526-099c-4134-b128-2af393f0b3e9-config\") on node \"crc\" DevicePath \"\"" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.005455 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e775526-099c-4134-b128-2af393f0b3e9-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "3e775526-099c-4134-b128-2af393f0b3e9" (UID: "3e775526-099c-4134-b128-2af393f0b3e9"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.009190 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e775526-099c-4134-b128-2af393f0b3e9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3e775526-099c-4134-b128-2af393f0b3e9" (UID: "3e775526-099c-4134-b128-2af393f0b3e9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.009514 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e775526-099c-4134-b128-2af393f0b3e9-kube-api-access-9fbw7" (OuterVolumeSpecName: "kube-api-access-9fbw7") pod "3e775526-099c-4134-b128-2af393f0b3e9" (UID: "3e775526-099c-4134-b128-2af393f0b3e9"). InnerVolumeSpecName "kube-api-access-9fbw7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.092072 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f8f9698-g8bfc"] Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.092121 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-64dd76cf5b-twbhz" event={"ID":"2cb07ce8-1cf0-4b3f-a935-64190b256410","Type":"ContainerDied","Data":"bf61f08ec042f9f4129fd00e4e72b547e103414a518efac4ee14962a8f47aad3"} Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.092143 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-64dd76cf5b-twbhz" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.092165 5184 scope.go:117] "RemoveContainer" containerID="bf61f08ec042f9f4129fd00e4e72b547e103414a518efac4ee14962a8f47aad3" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.092182 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zhkzs" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.092290 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f8f9698-g8bfc" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.092154 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-64dd76cf5b-twbhz" event={"ID":"2cb07ce8-1cf0-4b3f-a935-64190b256410","Type":"ContainerDied","Data":"7e3dbc4923d8dd6b2de4c84444e7f28f80f688f20decb004ae434c4eecb70017"} Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.092313 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-565b84484f-wf8fr" event={"ID":"3e775526-099c-4134-b128-2af393f0b3e9","Type":"ContainerDied","Data":"4ff698d56096c9c9fa5afc5ae18ccab705d0883dfb57f9151077948b331f1f0b"} Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.092326 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6577658f54-49577"] Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.092828 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3e775526-099c-4134-b128-2af393f0b3e9" containerName="controller-manager" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.092842 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e775526-099c-4134-b128-2af393f0b3e9" containerName="controller-manager" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.092930 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="3e775526-099c-4134-b128-2af393f0b3e9" containerName="controller-manager" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.106953 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6460ada9-b824-4600-9b8d-f3ec7a51d85e-tmp\") pod \"route-controller-manager-f8f9698-g8bfc\" (UID: \"6460ada9-b824-4600-9b8d-f3ec7a51d85e\") " 
pod="openshift-route-controller-manager/route-controller-manager-f8f9698-g8bfc" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.107015 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfv72\" (UniqueName: \"kubernetes.io/projected/6460ada9-b824-4600-9b8d-f3ec7a51d85e-kube-api-access-dfv72\") pod \"route-controller-manager-f8f9698-g8bfc\" (UID: \"6460ada9-b824-4600-9b8d-f3ec7a51d85e\") " pod="openshift-route-controller-manager/route-controller-manager-f8f9698-g8bfc" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.107260 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6460ada9-b824-4600-9b8d-f3ec7a51d85e-config\") pod \"route-controller-manager-f8f9698-g8bfc\" (UID: \"6460ada9-b824-4600-9b8d-f3ec7a51d85e\") " pod="openshift-route-controller-manager/route-controller-manager-f8f9698-g8bfc" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.107351 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6460ada9-b824-4600-9b8d-f3ec7a51d85e-client-ca\") pod \"route-controller-manager-f8f9698-g8bfc\" (UID: \"6460ada9-b824-4600-9b8d-f3ec7a51d85e\") " pod="openshift-route-controller-manager/route-controller-manager-f8f9698-g8bfc" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.107435 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6460ada9-b824-4600-9b8d-f3ec7a51d85e-serving-cert\") pod \"route-controller-manager-f8f9698-g8bfc\" (UID: \"6460ada9-b824-4600-9b8d-f3ec7a51d85e\") " pod="openshift-route-controller-manager/route-controller-manager-f8f9698-g8bfc" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.107520 5184 reconciler_common.go:299] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e775526-099c-4134-b128-2af393f0b3e9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.107536 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9fbw7\" (UniqueName: \"kubernetes.io/projected/3e775526-099c-4134-b128-2af393f0b3e9-kube-api-access-9fbw7\") on node \"crc\" DevicePath \"\"" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.107547 5184 reconciler_common.go:299] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3e775526-099c-4134-b128-2af393f0b3e9-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.113469 5184 scope.go:117] "RemoveContainer" containerID="bf61f08ec042f9f4129fd00e4e72b547e103414a518efac4ee14962a8f47aad3" Mar 12 16:53:38 crc kubenswrapper[5184]: E0312 16:53:38.114226 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf61f08ec042f9f4129fd00e4e72b547e103414a518efac4ee14962a8f47aad3\": container with ID starting with bf61f08ec042f9f4129fd00e4e72b547e103414a518efac4ee14962a8f47aad3 not found: ID does not exist" containerID="bf61f08ec042f9f4129fd00e4e72b547e103414a518efac4ee14962a8f47aad3" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.114276 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf61f08ec042f9f4129fd00e4e72b547e103414a518efac4ee14962a8f47aad3"} err="failed to get container status \"bf61f08ec042f9f4129fd00e4e72b547e103414a518efac4ee14962a8f47aad3\": rpc error: code = NotFound desc = could not find container \"bf61f08ec042f9f4129fd00e4e72b547e103414a518efac4ee14962a8f47aad3\": container with ID starting with bf61f08ec042f9f4129fd00e4e72b547e103414a518efac4ee14962a8f47aad3 not found: ID does not exist" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.114309 5184 
scope.go:117] "RemoveContainer" containerID="4ff698d56096c9c9fa5afc5ae18ccab705d0883dfb57f9151077948b331f1f0b" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.192768 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6577658f54-49577"] Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.192811 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-12-crc" event={"ID":"1aafc06d-e1d9-4e41-a45d-6fece1c21edb","Type":"ContainerStarted","Data":"43db61a374dfa763cd38e2a875a743a3939a02cc5956f5a51480e08d2e065abd"} Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.192845 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64dd76cf5b-twbhz"] Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.192865 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64dd76cf5b-twbhz"] Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.192885 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zhkzs" event={"ID":"f2536f10-73e6-480a-9abb-2fd7a7e1a235","Type":"ContainerDied","Data":"caeb2e5fe1e893a88da6f83e36df6a85da9fa914f65edaf21a6de81d965e659f"} Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.192920 5184 scope.go:117] "RemoveContainer" containerID="c023a41054bca562b40b232e212b3e2f72c9d1715cddb0ff41dd961f845adabe" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.192960 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6577658f54-49577" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.208806 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f315761f-9336-48f0-ac72-2955d5f6d79c-serving-cert\") pod \"controller-manager-6577658f54-49577\" (UID: \"f315761f-9336-48f0-ac72-2955d5f6d79c\") " pod="openshift-controller-manager/controller-manager-6577658f54-49577" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.208854 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlhzx\" (UniqueName: \"kubernetes.io/projected/f315761f-9336-48f0-ac72-2955d5f6d79c-kube-api-access-jlhzx\") pod \"controller-manager-6577658f54-49577\" (UID: \"f315761f-9336-48f0-ac72-2955d5f6d79c\") " pod="openshift-controller-manager/controller-manager-6577658f54-49577" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.208973 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dfv72\" (UniqueName: \"kubernetes.io/projected/6460ada9-b824-4600-9b8d-f3ec7a51d85e-kube-api-access-dfv72\") pod \"route-controller-manager-f8f9698-g8bfc\" (UID: \"6460ada9-b824-4600-9b8d-f3ec7a51d85e\") " pod="openshift-route-controller-manager/route-controller-manager-f8f9698-g8bfc" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.209028 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f315761f-9336-48f0-ac72-2955d5f6d79c-config\") pod \"controller-manager-6577658f54-49577\" (UID: \"f315761f-9336-48f0-ac72-2955d5f6d79c\") " pod="openshift-controller-manager/controller-manager-6577658f54-49577" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.209058 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/6460ada9-b824-4600-9b8d-f3ec7a51d85e-config\") pod \"route-controller-manager-f8f9698-g8bfc\" (UID: \"6460ada9-b824-4600-9b8d-f3ec7a51d85e\") " pod="openshift-route-controller-manager/route-controller-manager-f8f9698-g8bfc" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.209085 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f315761f-9336-48f0-ac72-2955d5f6d79c-client-ca\") pod \"controller-manager-6577658f54-49577\" (UID: \"f315761f-9336-48f0-ac72-2955d5f6d79c\") " pod="openshift-controller-manager/controller-manager-6577658f54-49577" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.209117 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6460ada9-b824-4600-9b8d-f3ec7a51d85e-client-ca\") pod \"route-controller-manager-f8f9698-g8bfc\" (UID: \"6460ada9-b824-4600-9b8d-f3ec7a51d85e\") " pod="openshift-route-controller-manager/route-controller-manager-f8f9698-g8bfc" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.209150 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f315761f-9336-48f0-ac72-2955d5f6d79c-proxy-ca-bundles\") pod \"controller-manager-6577658f54-49577\" (UID: \"f315761f-9336-48f0-ac72-2955d5f6d79c\") " pod="openshift-controller-manager/controller-manager-6577658f54-49577" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.209174 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f315761f-9336-48f0-ac72-2955d5f6d79c-tmp\") pod \"controller-manager-6577658f54-49577\" (UID: \"f315761f-9336-48f0-ac72-2955d5f6d79c\") " pod="openshift-controller-manager/controller-manager-6577658f54-49577" 
Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.209197 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6460ada9-b824-4600-9b8d-f3ec7a51d85e-serving-cert\") pod \"route-controller-manager-f8f9698-g8bfc\" (UID: \"6460ada9-b824-4600-9b8d-f3ec7a51d85e\") " pod="openshift-route-controller-manager/route-controller-manager-f8f9698-g8bfc" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.209256 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6460ada9-b824-4600-9b8d-f3ec7a51d85e-tmp\") pod \"route-controller-manager-f8f9698-g8bfc\" (UID: \"6460ada9-b824-4600-9b8d-f3ec7a51d85e\") " pod="openshift-route-controller-manager/route-controller-manager-f8f9698-g8bfc" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.210320 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6460ada9-b824-4600-9b8d-f3ec7a51d85e-tmp\") pod \"route-controller-manager-f8f9698-g8bfc\" (UID: \"6460ada9-b824-4600-9b8d-f3ec7a51d85e\") " pod="openshift-route-controller-manager/route-controller-manager-f8f9698-g8bfc" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.210807 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6460ada9-b824-4600-9b8d-f3ec7a51d85e-client-ca\") pod \"route-controller-manager-f8f9698-g8bfc\" (UID: \"6460ada9-b824-4600-9b8d-f3ec7a51d85e\") " pod="openshift-route-controller-manager/route-controller-manager-f8f9698-g8bfc" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.210949 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6460ada9-b824-4600-9b8d-f3ec7a51d85e-config\") pod \"route-controller-manager-f8f9698-g8bfc\" (UID: \"6460ada9-b824-4600-9b8d-f3ec7a51d85e\") " 
pod="openshift-route-controller-manager/route-controller-manager-f8f9698-g8bfc" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.211799 5184 scope.go:117] "RemoveContainer" containerID="53d1a5bf920f82602c6621dc71551ebb20dd97b43060a9d0e884935bc9f86b83" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.214202 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6460ada9-b824-4600-9b8d-f3ec7a51d85e-serving-cert\") pod \"route-controller-manager-f8f9698-g8bfc\" (UID: \"6460ada9-b824-4600-9b8d-f3ec7a51d85e\") " pod="openshift-route-controller-manager/route-controller-manager-f8f9698-g8bfc" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.228806 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfv72\" (UniqueName: \"kubernetes.io/projected/6460ada9-b824-4600-9b8d-f3ec7a51d85e-kube-api-access-dfv72\") pod \"route-controller-manager-f8f9698-g8bfc\" (UID: \"6460ada9-b824-4600-9b8d-f3ec7a51d85e\") " pod="openshift-route-controller-manager/route-controller-manager-f8f9698-g8bfc" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.241615 5184 scope.go:117] "RemoveContainer" containerID="bbd73e6b8e94baf2d0c945fb35b4bd2fd4f2e636ad902ebc523e1d05cfb10d5f" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.310874 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f315761f-9336-48f0-ac72-2955d5f6d79c-config\") pod \"controller-manager-6577658f54-49577\" (UID: \"f315761f-9336-48f0-ac72-2955d5f6d79c\") " pod="openshift-controller-manager/controller-manager-6577658f54-49577" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.311274 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f315761f-9336-48f0-ac72-2955d5f6d79c-client-ca\") pod \"controller-manager-6577658f54-49577\" (UID: 
\"f315761f-9336-48f0-ac72-2955d5f6d79c\") " pod="openshift-controller-manager/controller-manager-6577658f54-49577" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.311320 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f315761f-9336-48f0-ac72-2955d5f6d79c-proxy-ca-bundles\") pod \"controller-manager-6577658f54-49577\" (UID: \"f315761f-9336-48f0-ac72-2955d5f6d79c\") " pod="openshift-controller-manager/controller-manager-6577658f54-49577" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.311343 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f315761f-9336-48f0-ac72-2955d5f6d79c-tmp\") pod \"controller-manager-6577658f54-49577\" (UID: \"f315761f-9336-48f0-ac72-2955d5f6d79c\") " pod="openshift-controller-manager/controller-manager-6577658f54-49577" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.311429 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f315761f-9336-48f0-ac72-2955d5f6d79c-serving-cert\") pod \"controller-manager-6577658f54-49577\" (UID: \"f315761f-9336-48f0-ac72-2955d5f6d79c\") " pod="openshift-controller-manager/controller-manager-6577658f54-49577" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.311453 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jlhzx\" (UniqueName: \"kubernetes.io/projected/f315761f-9336-48f0-ac72-2955d5f6d79c-kube-api-access-jlhzx\") pod \"controller-manager-6577658f54-49577\" (UID: \"f315761f-9336-48f0-ac72-2955d5f6d79c\") " pod="openshift-controller-manager/controller-manager-6577658f54-49577" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.312272 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f315761f-9336-48f0-ac72-2955d5f6d79c-config\") pod \"controller-manager-6577658f54-49577\" (UID: \"f315761f-9336-48f0-ac72-2955d5f6d79c\") " pod="openshift-controller-manager/controller-manager-6577658f54-49577" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.312292 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f315761f-9336-48f0-ac72-2955d5f6d79c-tmp\") pod \"controller-manager-6577658f54-49577\" (UID: \"f315761f-9336-48f0-ac72-2955d5f6d79c\") " pod="openshift-controller-manager/controller-manager-6577658f54-49577" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.312520 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f315761f-9336-48f0-ac72-2955d5f6d79c-client-ca\") pod \"controller-manager-6577658f54-49577\" (UID: \"f315761f-9336-48f0-ac72-2955d5f6d79c\") " pod="openshift-controller-manager/controller-manager-6577658f54-49577" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.312800 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f315761f-9336-48f0-ac72-2955d5f6d79c-proxy-ca-bundles\") pod \"controller-manager-6577658f54-49577\" (UID: \"f315761f-9336-48f0-ac72-2955d5f6d79c\") " pod="openshift-controller-manager/controller-manager-6577658f54-49577" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.316823 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f315761f-9336-48f0-ac72-2955d5f6d79c-serving-cert\") pod \"controller-manager-6577658f54-49577\" (UID: \"f315761f-9336-48f0-ac72-2955d5f6d79c\") " pod="openshift-controller-manager/controller-manager-6577658f54-49577" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.328652 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-jlhzx\" (UniqueName: \"kubernetes.io/projected/f315761f-9336-48f0-ac72-2955d5f6d79c-kube-api-access-jlhzx\") pod \"controller-manager-6577658f54-49577\" (UID: \"f315761f-9336-48f0-ac72-2955d5f6d79c\") " pod="openshift-controller-manager/controller-manager-6577658f54-49577" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.409092 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cb07ce8-1cf0-4b3f-a935-64190b256410" path="/var/lib/kubelet/pods/2cb07ce8-1cf0-4b3f-a935-64190b256410/volumes" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.415596 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f8f9698-g8bfc" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.453702 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2536f10-73e6-480a-9abb-2fd7a7e1a235-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f2536f10-73e6-480a-9abb-2fd7a7e1a235" (UID: "f2536f10-73e6-480a-9abb-2fd7a7e1a235"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.511825 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6577658f54-49577" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.513492 5184 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2536f10-73e6-480a-9abb-2fd7a7e1a235-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.622877 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f8f9698-g8bfc"] Mar 12 16:53:38 crc kubenswrapper[5184]: W0312 16:53:38.633162 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6460ada9_b824_4600_9b8d_f3ec7a51d85e.slice/crio-5961d035b8dc4566c9e0e83c028c678da46741a283f651b3f77a153eff2db94c WatchSource:0}: Error finding container 5961d035b8dc4566c9e0e83c028c678da46741a283f651b3f77a153eff2db94c: Status 404 returned error can't find the container with id 5961d035b8dc4566c9e0e83c028c678da46741a283f651b3f77a153eff2db94c Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.721739 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6577658f54-49577"] Mar 12 16:53:38 crc kubenswrapper[5184]: W0312 16:53:38.741470 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf315761f_9336_48f0_ac72_2955d5f6d79c.slice/crio-2d791ea364ac9b484f8d20419c4e501be91cb831a94e2295b0ede9e83f435466 WatchSource:0}: Error finding container 2d791ea364ac9b484f8d20419c4e501be91cb831a94e2295b0ede9e83f435466: Status 404 returned error can't find the container with id 2d791ea364ac9b484f8d20419c4e501be91cb831a94e2295b0ede9e83f435466 Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.777805 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-6577658f54-49577" event={"ID":"f315761f-9336-48f0-ac72-2955d5f6d79c","Type":"ContainerStarted","Data":"2d791ea364ac9b484f8d20419c4e501be91cb831a94e2295b0ede9e83f435466"}
Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.780637 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-565b84484f-wf8fr" event={"ID":"3e775526-099c-4134-b128-2af393f0b3e9","Type":"ContainerDied","Data":"f68b99a438ca61a2e233a3843d91581cafbcd85c3ba38b2bcf7e3fa4655076ff"}
Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.780835 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-565b84484f-wf8fr"
Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.784229 5184 generic.go:358] "Generic (PLEG): container finished" podID="1aafc06d-e1d9-4e41-a45d-6fece1c21edb" containerID="7038f4128fc977b8e7e10abb8091e4c8bf7a9fba290f1f111a9acdef880fc189" exitCode=0
Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.784309 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-12-crc" event={"ID":"1aafc06d-e1d9-4e41-a45d-6fece1c21edb","Type":"ContainerDied","Data":"7038f4128fc977b8e7e10abb8091e4c8bf7a9fba290f1f111a9acdef880fc189"}
Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.784754 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zhkzs"]
Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.786807 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f8f9698-g8bfc" event={"ID":"6460ada9-b824-4600-9b8d-f3ec7a51d85e","Type":"ContainerStarted","Data":"5961d035b8dc4566c9e0e83c028c678da46741a283f651b3f77a153eff2db94c"}
Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.788932 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zhkzs"]
Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.827976 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-565b84484f-wf8fr"]
Mar 12 16:53:38 crc kubenswrapper[5184]: I0312 16:53:38.831729 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-565b84484f-wf8fr"]
Mar 12 16:53:39 crc kubenswrapper[5184]: I0312 16:53:39.794117 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f8f9698-g8bfc" event={"ID":"6460ada9-b824-4600-9b8d-f3ec7a51d85e","Type":"ContainerStarted","Data":"6a6100a97d67b8a6c8664c63df555cce1dbb27522c94908f16e60f5b60979519"}
Mar 12 16:53:39 crc kubenswrapper[5184]: I0312 16:53:39.794897 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-route-controller-manager/route-controller-manager-f8f9698-g8bfc"
Mar 12 16:53:39 crc kubenswrapper[5184]: I0312 16:53:39.795612 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6577658f54-49577" event={"ID":"f315761f-9336-48f0-ac72-2955d5f6d79c","Type":"ContainerStarted","Data":"13789a922929ecf99af0fabc2e8ca4c6faf723d3245394ed7946f41e15eca20d"}
Mar 12 16:53:39 crc kubenswrapper[5184]: I0312 16:53:39.800455 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-f8f9698-g8bfc"
Mar 12 16:53:39 crc kubenswrapper[5184]: I0312 16:53:39.814685 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-f8f9698-g8bfc" podStartSLOduration=3.814664502 podStartE2EDuration="3.814664502s" podCreationTimestamp="2026-03-12 16:53:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:53:39.81144829 +0000 UTC m=+162.352759649" watchObservedRunningTime="2026-03-12 16:53:39.814664502 +0000 UTC m=+162.355975851"
Mar 12 16:53:39 crc kubenswrapper[5184]: I0312 16:53:39.831684 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6577658f54-49577" podStartSLOduration=3.8316620930000003 podStartE2EDuration="3.831662093s" podCreationTimestamp="2026-03-12 16:53:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:53:39.829862016 +0000 UTC m=+162.371173365" watchObservedRunningTime="2026-03-12 16:53:39.831662093 +0000 UTC m=+162.372973432"
Mar 12 16:53:39 crc kubenswrapper[5184]: I0312 16:53:39.951412 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-66458b6674-qmxgv"]
Mar 12 16:53:40 crc kubenswrapper[5184]: I0312 16:53:40.071055 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-12-crc"
Mar 12 16:53:40 crc kubenswrapper[5184]: I0312 16:53:40.143110 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1aafc06d-e1d9-4e41-a45d-6fece1c21edb-kube-api-access\") pod \"1aafc06d-e1d9-4e41-a45d-6fece1c21edb\" (UID: \"1aafc06d-e1d9-4e41-a45d-6fece1c21edb\") "
Mar 12 16:53:40 crc kubenswrapper[5184]: I0312 16:53:40.143270 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1aafc06d-e1d9-4e41-a45d-6fece1c21edb-kubelet-dir\") pod \"1aafc06d-e1d9-4e41-a45d-6fece1c21edb\" (UID: \"1aafc06d-e1d9-4e41-a45d-6fece1c21edb\") "
Mar 12 16:53:40 crc kubenswrapper[5184]: I0312 16:53:40.143434 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1aafc06d-e1d9-4e41-a45d-6fece1c21edb-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1aafc06d-e1d9-4e41-a45d-6fece1c21edb" (UID: "1aafc06d-e1d9-4e41-a45d-6fece1c21edb"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 12 16:53:40 crc kubenswrapper[5184]: I0312 16:53:40.143687 5184 reconciler_common.go:299] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1aafc06d-e1d9-4e41-a45d-6fece1c21edb-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 12 16:53:40 crc kubenswrapper[5184]: I0312 16:53:40.149322 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aafc06d-e1d9-4e41-a45d-6fece1c21edb-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1aafc06d-e1d9-4e41-a45d-6fece1c21edb" (UID: "1aafc06d-e1d9-4e41-a45d-6fece1c21edb"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 16:53:40 crc kubenswrapper[5184]: I0312 16:53:40.244746 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1aafc06d-e1d9-4e41-a45d-6fece1c21edb-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 12 16:53:40 crc kubenswrapper[5184]: I0312 16:53:40.406047 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e775526-099c-4134-b128-2af393f0b3e9" path="/var/lib/kubelet/pods/3e775526-099c-4134-b128-2af393f0b3e9/volumes"
Mar 12 16:53:40 crc kubenswrapper[5184]: I0312 16:53:40.406794 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2536f10-73e6-480a-9abb-2fd7a7e1a235" path="/var/lib/kubelet/pods/f2536f10-73e6-480a-9abb-2fd7a7e1a235/volumes"
Mar 12 16:53:40 crc kubenswrapper[5184]: I0312 16:53:40.801571 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-12-crc"
Mar 12 16:53:40 crc kubenswrapper[5184]: I0312 16:53:40.801593 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-12-crc" event={"ID":"1aafc06d-e1d9-4e41-a45d-6fece1c21edb","Type":"ContainerDied","Data":"43db61a374dfa763cd38e2a875a743a3939a02cc5956f5a51480e08d2e065abd"}
Mar 12 16:53:40 crc kubenswrapper[5184]: I0312 16:53:40.801642 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43db61a374dfa763cd38e2a875a743a3939a02cc5956f5a51480e08d2e065abd"
Mar 12 16:53:40 crc kubenswrapper[5184]: I0312 16:53:40.802262 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-controller-manager/controller-manager-6577658f54-49577"
Mar 12 16:53:40 crc kubenswrapper[5184]: I0312 16:53:40.809658 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6577658f54-49577"
Mar 12 16:53:44 crc kubenswrapper[5184]: I0312 16:53:44.034626 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-12-crc"]
Mar 12 16:53:44 crc kubenswrapper[5184]: I0312 16:53:44.035635 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1aafc06d-e1d9-4e41-a45d-6fece1c21edb" containerName="pruner"
Mar 12 16:53:44 crc kubenswrapper[5184]: I0312 16:53:44.035659 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aafc06d-e1d9-4e41-a45d-6fece1c21edb" containerName="pruner"
Mar 12 16:53:44 crc kubenswrapper[5184]: I0312 16:53:44.035798 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="1aafc06d-e1d9-4e41-a45d-6fece1c21edb" containerName="pruner"
Mar 12 16:53:44 crc kubenswrapper[5184]: I0312 16:53:44.286863 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-12-crc"]
Mar 12 16:53:44 crc kubenswrapper[5184]: I0312 16:53:44.287015 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-12-crc"
Mar 12 16:53:44 crc kubenswrapper[5184]: I0312 16:53:44.288982 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-apiserver\"/\"kube-root-ca.crt\""
Mar 12 16:53:44 crc kubenswrapper[5184]: I0312 16:53:44.289241 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-apiserver\"/\"installer-sa-dockercfg-bqqnb\""
Mar 12 16:53:44 crc kubenswrapper[5184]: I0312 16:53:44.398020 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/05d9779c-971f-40a0-83e9-b21a6e9e9d2a-var-lock\") pod \"installer-12-crc\" (UID: \"05d9779c-971f-40a0-83e9-b21a6e9e9d2a\") " pod="openshift-kube-apiserver/installer-12-crc"
Mar 12 16:53:44 crc kubenswrapper[5184]: I0312 16:53:44.398063 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/05d9779c-971f-40a0-83e9-b21a6e9e9d2a-kubelet-dir\") pod \"installer-12-crc\" (UID: \"05d9779c-971f-40a0-83e9-b21a6e9e9d2a\") " pod="openshift-kube-apiserver/installer-12-crc"
Mar 12 16:53:44 crc kubenswrapper[5184]: I0312 16:53:44.398086 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/05d9779c-971f-40a0-83e9-b21a6e9e9d2a-kube-api-access\") pod \"installer-12-crc\" (UID: \"05d9779c-971f-40a0-83e9-b21a6e9e9d2a\") " pod="openshift-kube-apiserver/installer-12-crc"
Mar 12 16:53:44 crc kubenswrapper[5184]: I0312 16:53:44.499654 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/05d9779c-971f-40a0-83e9-b21a6e9e9d2a-var-lock\") pod \"installer-12-crc\" (UID: \"05d9779c-971f-40a0-83e9-b21a6e9e9d2a\") " pod="openshift-kube-apiserver/installer-12-crc"
Mar 12 16:53:44 crc kubenswrapper[5184]: I0312 16:53:44.499696 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/05d9779c-971f-40a0-83e9-b21a6e9e9d2a-kubelet-dir\") pod \"installer-12-crc\" (UID: \"05d9779c-971f-40a0-83e9-b21a6e9e9d2a\") " pod="openshift-kube-apiserver/installer-12-crc"
Mar 12 16:53:44 crc kubenswrapper[5184]: I0312 16:53:44.499712 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/05d9779c-971f-40a0-83e9-b21a6e9e9d2a-kube-api-access\") pod \"installer-12-crc\" (UID: \"05d9779c-971f-40a0-83e9-b21a6e9e9d2a\") " pod="openshift-kube-apiserver/installer-12-crc"
Mar 12 16:53:44 crc kubenswrapper[5184]: I0312 16:53:44.499824 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/05d9779c-971f-40a0-83e9-b21a6e9e9d2a-kubelet-dir\") pod \"installer-12-crc\" (UID: \"05d9779c-971f-40a0-83e9-b21a6e9e9d2a\") " pod="openshift-kube-apiserver/installer-12-crc"
Mar 12 16:53:44 crc kubenswrapper[5184]: I0312 16:53:44.499886 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/05d9779c-971f-40a0-83e9-b21a6e9e9d2a-var-lock\") pod \"installer-12-crc\" (UID: \"05d9779c-971f-40a0-83e9-b21a6e9e9d2a\") " pod="openshift-kube-apiserver/installer-12-crc"
Mar 12 16:53:44 crc kubenswrapper[5184]: I0312 16:53:44.523245 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/05d9779c-971f-40a0-83e9-b21a6e9e9d2a-kube-api-access\") pod \"installer-12-crc\" (UID: \"05d9779c-971f-40a0-83e9-b21a6e9e9d2a\") " pod="openshift-kube-apiserver/installer-12-crc"
Mar 12 16:53:44 crc kubenswrapper[5184]: I0312 16:53:44.602697 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-12-crc"
Mar 12 16:53:44 crc kubenswrapper[5184]: I0312 16:53:44.839553 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-12-crc"]
Mar 12 16:53:45 crc kubenswrapper[5184]: I0312 16:53:45.827161 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-12-crc" event={"ID":"05d9779c-971f-40a0-83e9-b21a6e9e9d2a","Type":"ContainerStarted","Data":"0a586e444b7d60924e5b5e2194d19edeeb09e71674e1dd9839867838b661cca0"}
Mar 12 16:53:45 crc kubenswrapper[5184]: I0312 16:53:45.827526 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-12-crc" event={"ID":"05d9779c-971f-40a0-83e9-b21a6e9e9d2a","Type":"ContainerStarted","Data":"4e37ea05dcad2152b233f30eec956e9d2f6aa07bbc38b196bc705c942d85fbd0"}
Mar 12 16:53:45 crc kubenswrapper[5184]: I0312 16:53:45.849099 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-12-crc" podStartSLOduration=1.849081404 podStartE2EDuration="1.849081404s" podCreationTimestamp="2026-03-12 16:53:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:53:45.848172815 +0000 UTC m=+168.389484154" watchObservedRunningTime="2026-03-12 16:53:45.849081404 +0000 UTC m=+168.390392733"
Mar 12 16:53:56 crc kubenswrapper[5184]: I0312 16:53:56.614500 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6577658f54-49577"]
Mar 12 16:53:56 crc kubenswrapper[5184]: I0312 16:53:56.615556 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6577658f54-49577" podUID="f315761f-9336-48f0-ac72-2955d5f6d79c" containerName="controller-manager" containerID="cri-o://13789a922929ecf99af0fabc2e8ca4c6faf723d3245394ed7946f41e15eca20d" gracePeriod=30
Mar 12 16:53:56 crc kubenswrapper[5184]: I0312 16:53:56.626961 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f8f9698-g8bfc"]
Mar 12 16:53:56 crc kubenswrapper[5184]: I0312 16:53:56.627394 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-f8f9698-g8bfc" podUID="6460ada9-b824-4600-9b8d-f3ec7a51d85e" containerName="route-controller-manager" containerID="cri-o://6a6100a97d67b8a6c8664c63df555cce1dbb27522c94908f16e60f5b60979519" gracePeriod=30
Mar 12 16:53:56 crc kubenswrapper[5184]: I0312 16:53:56.890114 5184 generic.go:358] "Generic (PLEG): container finished" podID="f315761f-9336-48f0-ac72-2955d5f6d79c" containerID="13789a922929ecf99af0fabc2e8ca4c6faf723d3245394ed7946f41e15eca20d" exitCode=0
Mar 12 16:53:56 crc kubenswrapper[5184]: I0312 16:53:56.890166 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6577658f54-49577" event={"ID":"f315761f-9336-48f0-ac72-2955d5f6d79c","Type":"ContainerDied","Data":"13789a922929ecf99af0fabc2e8ca4c6faf723d3245394ed7946f41e15eca20d"}
Mar 12 16:53:56 crc kubenswrapper[5184]: I0312 16:53:56.892552 5184 generic.go:358] "Generic (PLEG): container finished" podID="6460ada9-b824-4600-9b8d-f3ec7a51d85e" containerID="6a6100a97d67b8a6c8664c63df555cce1dbb27522c94908f16e60f5b60979519" exitCode=0
Mar 12 16:53:56 crc kubenswrapper[5184]: I0312 16:53:56.892635 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f8f9698-g8bfc" event={"ID":"6460ada9-b824-4600-9b8d-f3ec7a51d85e","Type":"ContainerDied","Data":"6a6100a97d67b8a6c8664c63df555cce1dbb27522c94908f16e60f5b60979519"}
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.128219 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f8f9698-g8bfc"
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.151920 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-786c764d56-lj85w"]
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.153201 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6460ada9-b824-4600-9b8d-f3ec7a51d85e" containerName="route-controller-manager"
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.153224 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="6460ada9-b824-4600-9b8d-f3ec7a51d85e" containerName="route-controller-manager"
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.153367 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="6460ada9-b824-4600-9b8d-f3ec7a51d85e" containerName="route-controller-manager"
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.161479 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-786c764d56-lj85w"
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.178476 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-786c764d56-lj85w"]
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.263172 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6460ada9-b824-4600-9b8d-f3ec7a51d85e-tmp\") pod \"6460ada9-b824-4600-9b8d-f3ec7a51d85e\" (UID: \"6460ada9-b824-4600-9b8d-f3ec7a51d85e\") "
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.263255 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6460ada9-b824-4600-9b8d-f3ec7a51d85e-client-ca\") pod \"6460ada9-b824-4600-9b8d-f3ec7a51d85e\" (UID: \"6460ada9-b824-4600-9b8d-f3ec7a51d85e\") "
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.263292 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6460ada9-b824-4600-9b8d-f3ec7a51d85e-config\") pod \"6460ada9-b824-4600-9b8d-f3ec7a51d85e\" (UID: \"6460ada9-b824-4600-9b8d-f3ec7a51d85e\") "
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.263363 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfv72\" (UniqueName: \"kubernetes.io/projected/6460ada9-b824-4600-9b8d-f3ec7a51d85e-kube-api-access-dfv72\") pod \"6460ada9-b824-4600-9b8d-f3ec7a51d85e\" (UID: \"6460ada9-b824-4600-9b8d-f3ec7a51d85e\") "
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.263423 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6460ada9-b824-4600-9b8d-f3ec7a51d85e-serving-cert\") pod \"6460ada9-b824-4600-9b8d-f3ec7a51d85e\" (UID: \"6460ada9-b824-4600-9b8d-f3ec7a51d85e\") "
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.263565 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c44f2d1-0bc2-4117-aeda-d802809862b0-config\") pod \"route-controller-manager-786c764d56-lj85w\" (UID: \"8c44f2d1-0bc2-4117-aeda-d802809862b0\") " pod="openshift-route-controller-manager/route-controller-manager-786c764d56-lj85w"
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.263614 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8c44f2d1-0bc2-4117-aeda-d802809862b0-tmp\") pod \"route-controller-manager-786c764d56-lj85w\" (UID: \"8c44f2d1-0bc2-4117-aeda-d802809862b0\") " pod="openshift-route-controller-manager/route-controller-manager-786c764d56-lj85w"
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.263636 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8mc5\" (UniqueName: \"kubernetes.io/projected/8c44f2d1-0bc2-4117-aeda-d802809862b0-kube-api-access-l8mc5\") pod \"route-controller-manager-786c764d56-lj85w\" (UID: \"8c44f2d1-0bc2-4117-aeda-d802809862b0\") " pod="openshift-route-controller-manager/route-controller-manager-786c764d56-lj85w"
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.263682 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8c44f2d1-0bc2-4117-aeda-d802809862b0-client-ca\") pod \"route-controller-manager-786c764d56-lj85w\" (UID: \"8c44f2d1-0bc2-4117-aeda-d802809862b0\") " pod="openshift-route-controller-manager/route-controller-manager-786c764d56-lj85w"
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.263758 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6460ada9-b824-4600-9b8d-f3ec7a51d85e-tmp" (OuterVolumeSpecName: "tmp") pod "6460ada9-b824-4600-9b8d-f3ec7a51d85e" (UID: "6460ada9-b824-4600-9b8d-f3ec7a51d85e"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.263790 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c44f2d1-0bc2-4117-aeda-d802809862b0-serving-cert\") pod \"route-controller-manager-786c764d56-lj85w\" (UID: \"8c44f2d1-0bc2-4117-aeda-d802809862b0\") " pod="openshift-route-controller-manager/route-controller-manager-786c764d56-lj85w"
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.263955 5184 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6460ada9-b824-4600-9b8d-f3ec7a51d85e-tmp\") on node \"crc\" DevicePath \"\""
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.264456 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6460ada9-b824-4600-9b8d-f3ec7a51d85e-config" (OuterVolumeSpecName: "config") pod "6460ada9-b824-4600-9b8d-f3ec7a51d85e" (UID: "6460ada9-b824-4600-9b8d-f3ec7a51d85e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.265338 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6460ada9-b824-4600-9b8d-f3ec7a51d85e-client-ca" (OuterVolumeSpecName: "client-ca") pod "6460ada9-b824-4600-9b8d-f3ec7a51d85e" (UID: "6460ada9-b824-4600-9b8d-f3ec7a51d85e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.270050 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6460ada9-b824-4600-9b8d-f3ec7a51d85e-kube-api-access-dfv72" (OuterVolumeSpecName: "kube-api-access-dfv72") pod "6460ada9-b824-4600-9b8d-f3ec7a51d85e" (UID: "6460ada9-b824-4600-9b8d-f3ec7a51d85e"). InnerVolumeSpecName "kube-api-access-dfv72". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.275201 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6460ada9-b824-4600-9b8d-f3ec7a51d85e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6460ada9-b824-4600-9b8d-f3ec7a51d85e" (UID: "6460ada9-b824-4600-9b8d-f3ec7a51d85e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.352749 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6577658f54-49577"
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.365365 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c44f2d1-0bc2-4117-aeda-d802809862b0-serving-cert\") pod \"route-controller-manager-786c764d56-lj85w\" (UID: \"8c44f2d1-0bc2-4117-aeda-d802809862b0\") " pod="openshift-route-controller-manager/route-controller-manager-786c764d56-lj85w"
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.365539 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c44f2d1-0bc2-4117-aeda-d802809862b0-config\") pod \"route-controller-manager-786c764d56-lj85w\" (UID: \"8c44f2d1-0bc2-4117-aeda-d802809862b0\") " pod="openshift-route-controller-manager/route-controller-manager-786c764d56-lj85w"
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.365597 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8c44f2d1-0bc2-4117-aeda-d802809862b0-tmp\") pod \"route-controller-manager-786c764d56-lj85w\" (UID: \"8c44f2d1-0bc2-4117-aeda-d802809862b0\") " pod="openshift-route-controller-manager/route-controller-manager-786c764d56-lj85w"
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.365633 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8mc5\" (UniqueName: \"kubernetes.io/projected/8c44f2d1-0bc2-4117-aeda-d802809862b0-kube-api-access-l8mc5\") pod \"route-controller-manager-786c764d56-lj85w\" (UID: \"8c44f2d1-0bc2-4117-aeda-d802809862b0\") " pod="openshift-route-controller-manager/route-controller-manager-786c764d56-lj85w"
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.365695 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8c44f2d1-0bc2-4117-aeda-d802809862b0-client-ca\") pod \"route-controller-manager-786c764d56-lj85w\" (UID: \"8c44f2d1-0bc2-4117-aeda-d802809862b0\") " pod="openshift-route-controller-manager/route-controller-manager-786c764d56-lj85w"
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.365808 5184 reconciler_common.go:299] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6460ada9-b824-4600-9b8d-f3ec7a51d85e-client-ca\") on node \"crc\" DevicePath \"\""
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.365841 5184 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6460ada9-b824-4600-9b8d-f3ec7a51d85e-config\") on node \"crc\" DevicePath \"\""
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.365862 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dfv72\" (UniqueName: \"kubernetes.io/projected/6460ada9-b824-4600-9b8d-f3ec7a51d85e-kube-api-access-dfv72\") on node \"crc\" DevicePath \"\""
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.365879 5184 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6460ada9-b824-4600-9b8d-f3ec7a51d85e-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.366738 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8c44f2d1-0bc2-4117-aeda-d802809862b0-tmp\") pod \"route-controller-manager-786c764d56-lj85w\" (UID: \"8c44f2d1-0bc2-4117-aeda-d802809862b0\") " pod="openshift-route-controller-manager/route-controller-manager-786c764d56-lj85w"
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.366770 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c44f2d1-0bc2-4117-aeda-d802809862b0-config\") pod \"route-controller-manager-786c764d56-lj85w\" (UID: \"8c44f2d1-0bc2-4117-aeda-d802809862b0\") " pod="openshift-route-controller-manager/route-controller-manager-786c764d56-lj85w"
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.367085 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8c44f2d1-0bc2-4117-aeda-d802809862b0-client-ca\") pod \"route-controller-manager-786c764d56-lj85w\" (UID: \"8c44f2d1-0bc2-4117-aeda-d802809862b0\") " pod="openshift-route-controller-manager/route-controller-manager-786c764d56-lj85w"
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.370478 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c44f2d1-0bc2-4117-aeda-d802809862b0-serving-cert\") pod \"route-controller-manager-786c764d56-lj85w\" (UID: \"8c44f2d1-0bc2-4117-aeda-d802809862b0\") " pod="openshift-route-controller-manager/route-controller-manager-786c764d56-lj85w"
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.386940 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7b6c7767d4-rcl8n"]
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.388249 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f315761f-9336-48f0-ac72-2955d5f6d79c" containerName="controller-manager"
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.388293 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="f315761f-9336-48f0-ac72-2955d5f6d79c" containerName="controller-manager"
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.388248 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8mc5\" (UniqueName: \"kubernetes.io/projected/8c44f2d1-0bc2-4117-aeda-d802809862b0-kube-api-access-l8mc5\") pod \"route-controller-manager-786c764d56-lj85w\" (UID: \"8c44f2d1-0bc2-4117-aeda-d802809862b0\") " pod="openshift-route-controller-manager/route-controller-manager-786c764d56-lj85w"
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.388511 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="f315761f-9336-48f0-ac72-2955d5f6d79c" containerName="controller-manager"
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.395748 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b6c7767d4-rcl8n"
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.398794 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b6c7767d4-rcl8n"]
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.466547 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlhzx\" (UniqueName: \"kubernetes.io/projected/f315761f-9336-48f0-ac72-2955d5f6d79c-kube-api-access-jlhzx\") pod \"f315761f-9336-48f0-ac72-2955d5f6d79c\" (UID: \"f315761f-9336-48f0-ac72-2955d5f6d79c\") "
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.466661 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f315761f-9336-48f0-ac72-2955d5f6d79c-serving-cert\") pod \"f315761f-9336-48f0-ac72-2955d5f6d79c\" (UID: \"f315761f-9336-48f0-ac72-2955d5f6d79c\") "
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.466696 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f315761f-9336-48f0-ac72-2955d5f6d79c-config\") pod \"f315761f-9336-48f0-ac72-2955d5f6d79c\" (UID: \"f315761f-9336-48f0-ac72-2955d5f6d79c\") "
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.466724 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f315761f-9336-48f0-ac72-2955d5f6d79c-tmp\") pod \"f315761f-9336-48f0-ac72-2955d5f6d79c\" (UID: \"f315761f-9336-48f0-ac72-2955d5f6d79c\") "
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.466770 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f315761f-9336-48f0-ac72-2955d5f6d79c-client-ca\") pod \"f315761f-9336-48f0-ac72-2955d5f6d79c\" (UID: \"f315761f-9336-48f0-ac72-2955d5f6d79c\") "
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.466812 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f315761f-9336-48f0-ac72-2955d5f6d79c-proxy-ca-bundles\") pod \"f315761f-9336-48f0-ac72-2955d5f6d79c\" (UID: \"f315761f-9336-48f0-ac72-2955d5f6d79c\") "
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.466908 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8d636cd5-682c-40f9-904c-994939248966-client-ca\") pod \"controller-manager-7b6c7767d4-rcl8n\" (UID: \"8d636cd5-682c-40f9-904c-994939248966\") " pod="openshift-controller-manager/controller-manager-7b6c7767d4-rcl8n"
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.466933 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bch97\" (UniqueName: \"kubernetes.io/projected/8d636cd5-682c-40f9-904c-994939248966-kube-api-access-bch97\") pod \"controller-manager-7b6c7767d4-rcl8n\" (UID: \"8d636cd5-682c-40f9-904c-994939248966\") " pod="openshift-controller-manager/controller-manager-7b6c7767d4-rcl8n"
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.466954 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8d636cd5-682c-40f9-904c-994939248966-tmp\") pod \"controller-manager-7b6c7767d4-rcl8n\" (UID: \"8d636cd5-682c-40f9-904c-994939248966\") " pod="openshift-controller-manager/controller-manager-7b6c7767d4-rcl8n"
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.466971 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d636cd5-682c-40f9-904c-994939248966-config\") pod \"controller-manager-7b6c7767d4-rcl8n\" (UID: \"8d636cd5-682c-40f9-904c-994939248966\") " pod="openshift-controller-manager/controller-manager-7b6c7767d4-rcl8n"
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.467024 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8d636cd5-682c-40f9-904c-994939248966-proxy-ca-bundles\") pod \"controller-manager-7b6c7767d4-rcl8n\" (UID: \"8d636cd5-682c-40f9-904c-994939248966\") " pod="openshift-controller-manager/controller-manager-7b6c7767d4-rcl8n"
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.467284 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f315761f-9336-48f0-ac72-2955d5f6d79c-tmp" (OuterVolumeSpecName: "tmp") pod "f315761f-9336-48f0-ac72-2955d5f6d79c" (UID: "f315761f-9336-48f0-ac72-2955d5f6d79c"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.467604 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d636cd5-682c-40f9-904c-994939248966-serving-cert\") pod \"controller-manager-7b6c7767d4-rcl8n\" (UID: \"8d636cd5-682c-40f9-904c-994939248966\") " pod="openshift-controller-manager/controller-manager-7b6c7767d4-rcl8n"
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.467636 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f315761f-9336-48f0-ac72-2955d5f6d79c-client-ca" (OuterVolumeSpecName: "client-ca") pod "f315761f-9336-48f0-ac72-2955d5f6d79c" (UID: "f315761f-9336-48f0-ac72-2955d5f6d79c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.467762 5184 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f315761f-9336-48f0-ac72-2955d5f6d79c-tmp\") on node \"crc\" DevicePath \"\""
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.467775 5184 reconciler_common.go:299] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f315761f-9336-48f0-ac72-2955d5f6d79c-client-ca\") on node \"crc\" DevicePath \"\""
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.467775 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f315761f-9336-48f0-ac72-2955d5f6d79c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f315761f-9336-48f0-ac72-2955d5f6d79c" (UID: "f315761f-9336-48f0-ac72-2955d5f6d79c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.467872 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f315761f-9336-48f0-ac72-2955d5f6d79c-config" (OuterVolumeSpecName: "config") pod "f315761f-9336-48f0-ac72-2955d5f6d79c" (UID: "f315761f-9336-48f0-ac72-2955d5f6d79c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.469836 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f315761f-9336-48f0-ac72-2955d5f6d79c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f315761f-9336-48f0-ac72-2955d5f6d79c" (UID: "f315761f-9336-48f0-ac72-2955d5f6d79c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.470442 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f315761f-9336-48f0-ac72-2955d5f6d79c-kube-api-access-jlhzx" (OuterVolumeSpecName: "kube-api-access-jlhzx") pod "f315761f-9336-48f0-ac72-2955d5f6d79c" (UID: "f315761f-9336-48f0-ac72-2955d5f6d79c"). InnerVolumeSpecName "kube-api-access-jlhzx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.474281 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-786c764d56-lj85w" Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.569293 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8d636cd5-682c-40f9-904c-994939248966-client-ca\") pod \"controller-manager-7b6c7767d4-rcl8n\" (UID: \"8d636cd5-682c-40f9-904c-994939248966\") " pod="openshift-controller-manager/controller-manager-7b6c7767d4-rcl8n" Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.569582 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bch97\" (UniqueName: \"kubernetes.io/projected/8d636cd5-682c-40f9-904c-994939248966-kube-api-access-bch97\") pod \"controller-manager-7b6c7767d4-rcl8n\" (UID: \"8d636cd5-682c-40f9-904c-994939248966\") " pod="openshift-controller-manager/controller-manager-7b6c7767d4-rcl8n" Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.569605 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8d636cd5-682c-40f9-904c-994939248966-tmp\") pod \"controller-manager-7b6c7767d4-rcl8n\" (UID: \"8d636cd5-682c-40f9-904c-994939248966\") " pod="openshift-controller-manager/controller-manager-7b6c7767d4-rcl8n" Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.569623 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d636cd5-682c-40f9-904c-994939248966-config\") pod \"controller-manager-7b6c7767d4-rcl8n\" (UID: \"8d636cd5-682c-40f9-904c-994939248966\") " pod="openshift-controller-manager/controller-manager-7b6c7767d4-rcl8n" Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.569660 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/8d636cd5-682c-40f9-904c-994939248966-proxy-ca-bundles\") pod \"controller-manager-7b6c7767d4-rcl8n\" (UID: \"8d636cd5-682c-40f9-904c-994939248966\") " pod="openshift-controller-manager/controller-manager-7b6c7767d4-rcl8n" Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.569690 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d636cd5-682c-40f9-904c-994939248966-serving-cert\") pod \"controller-manager-7b6c7767d4-rcl8n\" (UID: \"8d636cd5-682c-40f9-904c-994939248966\") " pod="openshift-controller-manager/controller-manager-7b6c7767d4-rcl8n" Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.569740 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jlhzx\" (UniqueName: \"kubernetes.io/projected/f315761f-9336-48f0-ac72-2955d5f6d79c-kube-api-access-jlhzx\") on node \"crc\" DevicePath \"\"" Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.569750 5184 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f315761f-9336-48f0-ac72-2955d5f6d79c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.569759 5184 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f315761f-9336-48f0-ac72-2955d5f6d79c-config\") on node \"crc\" DevicePath \"\"" Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.569767 5184 reconciler_common.go:299] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f315761f-9336-48f0-ac72-2955d5f6d79c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.573181 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8d636cd5-682c-40f9-904c-994939248966-client-ca\") pod 
\"controller-manager-7b6c7767d4-rcl8n\" (UID: \"8d636cd5-682c-40f9-904c-994939248966\") " pod="openshift-controller-manager/controller-manager-7b6c7767d4-rcl8n" Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.573693 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8d636cd5-682c-40f9-904c-994939248966-proxy-ca-bundles\") pod \"controller-manager-7b6c7767d4-rcl8n\" (UID: \"8d636cd5-682c-40f9-904c-994939248966\") " pod="openshift-controller-manager/controller-manager-7b6c7767d4-rcl8n" Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.575166 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d636cd5-682c-40f9-904c-994939248966-config\") pod \"controller-manager-7b6c7767d4-rcl8n\" (UID: \"8d636cd5-682c-40f9-904c-994939248966\") " pod="openshift-controller-manager/controller-manager-7b6c7767d4-rcl8n" Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.575812 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d636cd5-682c-40f9-904c-994939248966-serving-cert\") pod \"controller-manager-7b6c7767d4-rcl8n\" (UID: \"8d636cd5-682c-40f9-904c-994939248966\") " pod="openshift-controller-manager/controller-manager-7b6c7767d4-rcl8n" Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.575936 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8d636cd5-682c-40f9-904c-994939248966-tmp\") pod \"controller-manager-7b6c7767d4-rcl8n\" (UID: \"8d636cd5-682c-40f9-904c-994939248966\") " pod="openshift-controller-manager/controller-manager-7b6c7767d4-rcl8n" Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.591588 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bch97\" (UniqueName: 
\"kubernetes.io/projected/8d636cd5-682c-40f9-904c-994939248966-kube-api-access-bch97\") pod \"controller-manager-7b6c7767d4-rcl8n\" (UID: \"8d636cd5-682c-40f9-904c-994939248966\") " pod="openshift-controller-manager/controller-manager-7b6c7767d4-rcl8n" Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.716673 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b6c7767d4-rcl8n" Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.872222 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-786c764d56-lj85w"] Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.898804 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-786c764d56-lj85w" event={"ID":"8c44f2d1-0bc2-4117-aeda-d802809862b0","Type":"ContainerStarted","Data":"53516226b5cabc44cae548e01933b641d0b9186cb3d00da4ed2d600e9165dfec"} Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.900086 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6577658f54-49577" event={"ID":"f315761f-9336-48f0-ac72-2955d5f6d79c","Type":"ContainerDied","Data":"2d791ea364ac9b484f8d20419c4e501be91cb831a94e2295b0ede9e83f435466"} Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.900106 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6577658f54-49577" Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.900116 5184 scope.go:117] "RemoveContainer" containerID="13789a922929ecf99af0fabc2e8ca4c6faf723d3245394ed7946f41e15eca20d" Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.901216 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f8f9698-g8bfc" event={"ID":"6460ada9-b824-4600-9b8d-f3ec7a51d85e","Type":"ContainerDied","Data":"5961d035b8dc4566c9e0e83c028c678da46741a283f651b3f77a153eff2db94c"} Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.901267 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f8f9698-g8bfc" Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.923311 5184 scope.go:117] "RemoveContainer" containerID="6a6100a97d67b8a6c8664c63df555cce1dbb27522c94908f16e60f5b60979519" Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.935321 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f8f9698-g8bfc"] Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.937755 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f8f9698-g8bfc"] Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.955549 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b6c7767d4-rcl8n"] Mar 12 16:53:57 crc kubenswrapper[5184]: W0312 16:53:57.966116 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d636cd5_682c_40f9_904c_994939248966.slice/crio-6363eaf9b7ac8a3642b1fee87447c9080bceee5150a469605f9a34f7191a672a WatchSource:0}: Error finding container 
6363eaf9b7ac8a3642b1fee87447c9080bceee5150a469605f9a34f7191a672a: Status 404 returned error can't find the container with id 6363eaf9b7ac8a3642b1fee87447c9080bceee5150a469605f9a34f7191a672a Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.972939 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6577658f54-49577"] Mar 12 16:53:57 crc kubenswrapper[5184]: I0312 16:53:57.974421 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6577658f54-49577"] Mar 12 16:53:58 crc kubenswrapper[5184]: I0312 16:53:58.450791 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6460ada9-b824-4600-9b8d-f3ec7a51d85e" path="/var/lib/kubelet/pods/6460ada9-b824-4600-9b8d-f3ec7a51d85e/volumes" Mar 12 16:53:58 crc kubenswrapper[5184]: I0312 16:53:58.455703 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f315761f-9336-48f0-ac72-2955d5f6d79c" path="/var/lib/kubelet/pods/f315761f-9336-48f0-ac72-2955d5f6d79c/volumes" Mar 12 16:53:58 crc kubenswrapper[5184]: I0312 16:53:58.911602 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-786c764d56-lj85w" event={"ID":"8c44f2d1-0bc2-4117-aeda-d802809862b0","Type":"ContainerStarted","Data":"51fab5c5c6ed2e20063ef9e243981196cd21a6869c9e15c7f70896001f68278a"} Mar 12 16:53:58 crc kubenswrapper[5184]: I0312 16:53:58.912041 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-route-controller-manager/route-controller-manager-786c764d56-lj85w" Mar 12 16:53:58 crc kubenswrapper[5184]: I0312 16:53:58.913827 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b6c7767d4-rcl8n" event={"ID":"8d636cd5-682c-40f9-904c-994939248966","Type":"ContainerStarted","Data":"779779aab04ac5d4e006d42b375ae15fca1cbb6627b9bcc796e514381b90bc6d"} Mar 12 
16:53:58 crc kubenswrapper[5184]: I0312 16:53:58.914348 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-controller-manager/controller-manager-7b6c7767d4-rcl8n" Mar 12 16:53:58 crc kubenswrapper[5184]: I0312 16:53:58.914417 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b6c7767d4-rcl8n" event={"ID":"8d636cd5-682c-40f9-904c-994939248966","Type":"ContainerStarted","Data":"6363eaf9b7ac8a3642b1fee87447c9080bceee5150a469605f9a34f7191a672a"} Mar 12 16:53:58 crc kubenswrapper[5184]: I0312 16:53:58.921850 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-786c764d56-lj85w" Mar 12 16:53:58 crc kubenswrapper[5184]: I0312 16:53:58.927953 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7b6c7767d4-rcl8n" Mar 12 16:53:58 crc kubenswrapper[5184]: I0312 16:53:58.936162 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-786c764d56-lj85w" podStartSLOduration=2.936143108 podStartE2EDuration="2.936143108s" podCreationTimestamp="2026-03-12 16:53:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:53:58.934136224 +0000 UTC m=+181.475447583" watchObservedRunningTime="2026-03-12 16:53:58.936143108 +0000 UTC m=+181.477454457" Mar 12 16:53:58 crc kubenswrapper[5184]: I0312 16:53:58.988683 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7b6c7767d4-rcl8n" podStartSLOduration=2.988666657 podStartE2EDuration="2.988666657s" podCreationTimestamp="2026-03-12 16:53:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-03-12 16:53:58.986773257 +0000 UTC m=+181.528084606" watchObservedRunningTime="2026-03-12 16:53:58.988666657 +0000 UTC m=+181.529978016" Mar 12 16:54:00 crc kubenswrapper[5184]: I0312 16:54:00.140357 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555574-n2v5k"] Mar 12 16:54:00 crc kubenswrapper[5184]: I0312 16:54:00.153701 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555574-n2v5k" Mar 12 16:54:00 crc kubenswrapper[5184]: I0312 16:54:00.154053 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555574-n2v5k"] Mar 12 16:54:00 crc kubenswrapper[5184]: I0312 16:54:00.156811 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\"" Mar 12 16:54:00 crc kubenswrapper[5184]: I0312 16:54:00.157440 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-f4gpz\"" Mar 12 16:54:00 crc kubenswrapper[5184]: I0312 16:54:00.157741 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\"" Mar 12 16:54:00 crc kubenswrapper[5184]: I0312 16:54:00.309297 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndbn9\" (UniqueName: \"kubernetes.io/projected/6301ddeb-2d36-4015-b622-c1fc9acaeac4-kube-api-access-ndbn9\") pod \"auto-csr-approver-29555574-n2v5k\" (UID: \"6301ddeb-2d36-4015-b622-c1fc9acaeac4\") " pod="openshift-infra/auto-csr-approver-29555574-n2v5k" Mar 12 16:54:00 crc kubenswrapper[5184]: I0312 16:54:00.410493 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ndbn9\" (UniqueName: \"kubernetes.io/projected/6301ddeb-2d36-4015-b622-c1fc9acaeac4-kube-api-access-ndbn9\") 
pod \"auto-csr-approver-29555574-n2v5k\" (UID: \"6301ddeb-2d36-4015-b622-c1fc9acaeac4\") " pod="openshift-infra/auto-csr-approver-29555574-n2v5k" Mar 12 16:54:00 crc kubenswrapper[5184]: I0312 16:54:00.432703 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndbn9\" (UniqueName: \"kubernetes.io/projected/6301ddeb-2d36-4015-b622-c1fc9acaeac4-kube-api-access-ndbn9\") pod \"auto-csr-approver-29555574-n2v5k\" (UID: \"6301ddeb-2d36-4015-b622-c1fc9acaeac4\") " pod="openshift-infra/auto-csr-approver-29555574-n2v5k" Mar 12 16:54:00 crc kubenswrapper[5184]: I0312 16:54:00.498637 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555574-n2v5k" Mar 12 16:54:00 crc kubenswrapper[5184]: I0312 16:54:00.912390 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555574-n2v5k"] Mar 12 16:54:00 crc kubenswrapper[5184]: W0312 16:54:00.924212 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6301ddeb_2d36_4015_b622_c1fc9acaeac4.slice/crio-fda935ea87dfd9b5526f72e01e3437df5eae36db8ccb99f7a9eb166ed9c157c5 WatchSource:0}: Error finding container fda935ea87dfd9b5526f72e01e3437df5eae36db8ccb99f7a9eb166ed9c157c5: Status 404 returned error can't find the container with id fda935ea87dfd9b5526f72e01e3437df5eae36db8ccb99f7a9eb166ed9c157c5 Mar 12 16:54:01 crc kubenswrapper[5184]: I0312 16:54:01.940025 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555574-n2v5k" event={"ID":"6301ddeb-2d36-4015-b622-c1fc9acaeac4","Type":"ContainerStarted","Data":"fda935ea87dfd9b5526f72e01e3437df5eae36db8ccb99f7a9eb166ed9c157c5"} Mar 12 16:54:04 crc kubenswrapper[5184]: I0312 16:54:04.371356 5184 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-wh7wq" Mar 12 
16:54:04 crc kubenswrapper[5184]: I0312 16:54:04.376348 5184 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-wh7wq" Mar 12 16:54:04 crc kubenswrapper[5184]: I0312 16:54:04.964300 5184 generic.go:358] "Generic (PLEG): container finished" podID="6301ddeb-2d36-4015-b622-c1fc9acaeac4" containerID="a1457d37fd42b339caff33900bb0fc56e005536745fa570b93a0435c8f9b4f8b" exitCode=0 Mar 12 16:54:04 crc kubenswrapper[5184]: I0312 16:54:04.964466 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555574-n2v5k" event={"ID":"6301ddeb-2d36-4015-b622-c1fc9acaeac4","Type":"ContainerDied","Data":"a1457d37fd42b339caff33900bb0fc56e005536745fa570b93a0435c8f9b4f8b"} Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.012557 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" podUID="c0f44cf7-b9cd-4bff-91f1-89b99d8627fc" containerName="oauth-openshift" containerID="cri-o://8970f0896718cd710c2762d889d2c85baa1fec512eefac4216709518c4a467e3" gracePeriod=15 Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.378176 5184 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2026-04-11 16:49:04 +0000 UTC" deadline="2026-04-07 21:14:14.652892552 +0000 UTC" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.378684 5184 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="628h20m9.274213138s" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.441140 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.477369 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-db5986f99-jp5mt"] Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.478322 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0f44cf7-b9cd-4bff-91f1-89b99d8627fc" containerName="oauth-openshift" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.478350 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0f44cf7-b9cd-4bff-91f1-89b99d8627fc" containerName="oauth-openshift" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.478544 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="c0f44cf7-b9cd-4bff-91f1-89b99d8627fc" containerName="oauth-openshift" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.487696 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-db5986f99-jp5mt"] Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.487898 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.588618 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-system-service-ca\") pod \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.588702 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-system-trusted-ca-bundle\") pod \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.588724 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-user-idp-0-file-data\") pod \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.588746 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-user-template-login\") pod \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.588776 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-system-cliconfig\") pod \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\" (UID: 
\"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.588796 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-system-session\") pod \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.588949 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-audit-dir\") pod \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.589012 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-user-template-provider-selection\") pod \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.589035 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mj6b\" (UniqueName: \"kubernetes.io/projected/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-kube-api-access-5mj6b\") pod \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.589064 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-user-template-error\") pod \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.589107 5184 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-audit-policies\") pod \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.589113 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "c0f44cf7-b9cd-4bff-91f1-89b99d8627fc" (UID: "c0f44cf7-b9cd-4bff-91f1-89b99d8627fc"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.589254 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-system-serving-cert\") pod \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.589308 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-system-router-certs\") pod \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.589351 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-system-ocp-branding-template\") pod \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\" (UID: \"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc\") " Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.589579 5184 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7b4b6f68-67a6-4052-a01f-e8b20bc162bb-v4-0-config-system-router-certs\") pod \"oauth-openshift-db5986f99-jp5mt\" (UID: \"7b4b6f68-67a6-4052-a01f-e8b20bc162bb\") " pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.589671 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmch4\" (UniqueName: \"kubernetes.io/projected/7b4b6f68-67a6-4052-a01f-e8b20bc162bb-kube-api-access-cmch4\") pod \"oauth-openshift-db5986f99-jp5mt\" (UID: \"7b4b6f68-67a6-4052-a01f-e8b20bc162bb\") " pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.589704 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b4b6f68-67a6-4052-a01f-e8b20bc162bb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-db5986f99-jp5mt\" (UID: \"7b4b6f68-67a6-4052-a01f-e8b20bc162bb\") " pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.589827 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7b4b6f68-67a6-4052-a01f-e8b20bc162bb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-db5986f99-jp5mt\" (UID: \"7b4b6f68-67a6-4052-a01f-e8b20bc162bb\") " pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.589873 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/7b4b6f68-67a6-4052-a01f-e8b20bc162bb-v4-0-config-user-template-login\") pod \"oauth-openshift-db5986f99-jp5mt\" (UID: \"7b4b6f68-67a6-4052-a01f-e8b20bc162bb\") " pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.589937 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7b4b6f68-67a6-4052-a01f-e8b20bc162bb-v4-0-config-system-service-ca\") pod \"oauth-openshift-db5986f99-jp5mt\" (UID: \"7b4b6f68-67a6-4052-a01f-e8b20bc162bb\") " pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.590002 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7b4b6f68-67a6-4052-a01f-e8b20bc162bb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-db5986f99-jp5mt\" (UID: \"7b4b6f68-67a6-4052-a01f-e8b20bc162bb\") " pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.590066 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7b4b6f68-67a6-4052-a01f-e8b20bc162bb-v4-0-config-user-template-error\") pod \"oauth-openshift-db5986f99-jp5mt\" (UID: \"7b4b6f68-67a6-4052-a01f-e8b20bc162bb\") " pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.590115 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "c0f44cf7-b9cd-4bff-91f1-89b99d8627fc" 
(UID: "c0f44cf7-b9cd-4bff-91f1-89b99d8627fc"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.590128 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7b4b6f68-67a6-4052-a01f-e8b20bc162bb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-db5986f99-jp5mt\" (UID: \"7b4b6f68-67a6-4052-a01f-e8b20bc162bb\") " pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.590158 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7b4b6f68-67a6-4052-a01f-e8b20bc162bb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-db5986f99-jp5mt\" (UID: \"7b4b6f68-67a6-4052-a01f-e8b20bc162bb\") " pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.590204 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7b4b6f68-67a6-4052-a01f-e8b20bc162bb-audit-policies\") pod \"oauth-openshift-db5986f99-jp5mt\" (UID: \"7b4b6f68-67a6-4052-a01f-e8b20bc162bb\") " pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.590222 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "c0f44cf7-b9cd-4bff-91f1-89b99d8627fc" (UID: "c0f44cf7-b9cd-4bff-91f1-89b99d8627fc"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.590353 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b4b6f68-67a6-4052-a01f-e8b20bc162bb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-db5986f99-jp5mt\" (UID: \"7b4b6f68-67a6-4052-a01f-e8b20bc162bb\") " pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.590416 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7b4b6f68-67a6-4052-a01f-e8b20bc162bb-audit-dir\") pod \"oauth-openshift-db5986f99-jp5mt\" (UID: \"7b4b6f68-67a6-4052-a01f-e8b20bc162bb\") " pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.590451 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7b4b6f68-67a6-4052-a01f-e8b20bc162bb-v4-0-config-system-session\") pod \"oauth-openshift-db5986f99-jp5mt\" (UID: \"7b4b6f68-67a6-4052-a01f-e8b20bc162bb\") " pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.590492 5184 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.590503 5184 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath 
\"\"" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.590514 5184 reconciler_common.go:299] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.591602 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "c0f44cf7-b9cd-4bff-91f1-89b99d8627fc" (UID: "c0f44cf7-b9cd-4bff-91f1-89b99d8627fc"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.591682 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "c0f44cf7-b9cd-4bff-91f1-89b99d8627fc" (UID: "c0f44cf7-b9cd-4bff-91f1-89b99d8627fc"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.595594 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "c0f44cf7-b9cd-4bff-91f1-89b99d8627fc" (UID: "c0f44cf7-b9cd-4bff-91f1-89b99d8627fc"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.596132 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "c0f44cf7-b9cd-4bff-91f1-89b99d8627fc" (UID: "c0f44cf7-b9cd-4bff-91f1-89b99d8627fc"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.596776 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-kube-api-access-5mj6b" (OuterVolumeSpecName: "kube-api-access-5mj6b") pod "c0f44cf7-b9cd-4bff-91f1-89b99d8627fc" (UID: "c0f44cf7-b9cd-4bff-91f1-89b99d8627fc"). InnerVolumeSpecName "kube-api-access-5mj6b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.596849 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "c0f44cf7-b9cd-4bff-91f1-89b99d8627fc" (UID: "c0f44cf7-b9cd-4bff-91f1-89b99d8627fc"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.597084 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "c0f44cf7-b9cd-4bff-91f1-89b99d8627fc" (UID: "c0f44cf7-b9cd-4bff-91f1-89b99d8627fc"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.597701 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "c0f44cf7-b9cd-4bff-91f1-89b99d8627fc" (UID: "c0f44cf7-b9cd-4bff-91f1-89b99d8627fc"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.599170 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "c0f44cf7-b9cd-4bff-91f1-89b99d8627fc" (UID: "c0f44cf7-b9cd-4bff-91f1-89b99d8627fc"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.599700 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "c0f44cf7-b9cd-4bff-91f1-89b99d8627fc" (UID: "c0f44cf7-b9cd-4bff-91f1-89b99d8627fc"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.599868 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "c0f44cf7-b9cd-4bff-91f1-89b99d8627fc" (UID: "c0f44cf7-b9cd-4bff-91f1-89b99d8627fc"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.691926 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b4b6f68-67a6-4052-a01f-e8b20bc162bb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-db5986f99-jp5mt\" (UID: \"7b4b6f68-67a6-4052-a01f-e8b20bc162bb\") " pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.692007 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7b4b6f68-67a6-4052-a01f-e8b20bc162bb-audit-dir\") pod \"oauth-openshift-db5986f99-jp5mt\" (UID: \"7b4b6f68-67a6-4052-a01f-e8b20bc162bb\") " pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.692047 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7b4b6f68-67a6-4052-a01f-e8b20bc162bb-v4-0-config-system-session\") pod \"oauth-openshift-db5986f99-jp5mt\" (UID: \"7b4b6f68-67a6-4052-a01f-e8b20bc162bb\") " pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.692090 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7b4b6f68-67a6-4052-a01f-e8b20bc162bb-v4-0-config-system-router-certs\") pod \"oauth-openshift-db5986f99-jp5mt\" (UID: \"7b4b6f68-67a6-4052-a01f-e8b20bc162bb\") " pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.692133 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cmch4\" (UniqueName: 
\"kubernetes.io/projected/7b4b6f68-67a6-4052-a01f-e8b20bc162bb-kube-api-access-cmch4\") pod \"oauth-openshift-db5986f99-jp5mt\" (UID: \"7b4b6f68-67a6-4052-a01f-e8b20bc162bb\") " pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.692165 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b4b6f68-67a6-4052-a01f-e8b20bc162bb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-db5986f99-jp5mt\" (UID: \"7b4b6f68-67a6-4052-a01f-e8b20bc162bb\") " pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.692234 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7b4b6f68-67a6-4052-a01f-e8b20bc162bb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-db5986f99-jp5mt\" (UID: \"7b4b6f68-67a6-4052-a01f-e8b20bc162bb\") " pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.692278 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7b4b6f68-67a6-4052-a01f-e8b20bc162bb-v4-0-config-user-template-login\") pod \"oauth-openshift-db5986f99-jp5mt\" (UID: \"7b4b6f68-67a6-4052-a01f-e8b20bc162bb\") " pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.692268 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7b4b6f68-67a6-4052-a01f-e8b20bc162bb-audit-dir\") pod \"oauth-openshift-db5986f99-jp5mt\" (UID: \"7b4b6f68-67a6-4052-a01f-e8b20bc162bb\") " pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc 
kubenswrapper[5184]: I0312 16:54:05.692318 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7b4b6f68-67a6-4052-a01f-e8b20bc162bb-v4-0-config-system-service-ca\") pod \"oauth-openshift-db5986f99-jp5mt\" (UID: \"7b4b6f68-67a6-4052-a01f-e8b20bc162bb\") " pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.692523 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7b4b6f68-67a6-4052-a01f-e8b20bc162bb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-db5986f99-jp5mt\" (UID: \"7b4b6f68-67a6-4052-a01f-e8b20bc162bb\") " pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.692619 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7b4b6f68-67a6-4052-a01f-e8b20bc162bb-v4-0-config-user-template-error\") pod \"oauth-openshift-db5986f99-jp5mt\" (UID: \"7b4b6f68-67a6-4052-a01f-e8b20bc162bb\") " pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.692713 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7b4b6f68-67a6-4052-a01f-e8b20bc162bb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-db5986f99-jp5mt\" (UID: \"7b4b6f68-67a6-4052-a01f-e8b20bc162bb\") " pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.692762 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/7b4b6f68-67a6-4052-a01f-e8b20bc162bb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-db5986f99-jp5mt\" (UID: \"7b4b6f68-67a6-4052-a01f-e8b20bc162bb\") " pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.692825 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7b4b6f68-67a6-4052-a01f-e8b20bc162bb-audit-policies\") pod \"oauth-openshift-db5986f99-jp5mt\" (UID: \"7b4b6f68-67a6-4052-a01f-e8b20bc162bb\") " pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.692923 5184 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.692953 5184 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.692973 5184 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.692995 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5mj6b\" (UniqueName: \"kubernetes.io/projected/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-kube-api-access-5mj6b\") on node \"crc\" DevicePath \"\"" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.693013 5184 reconciler_common.go:299] "Volume detached for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.693031 5184 reconciler_common.go:299] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.693051 5184 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.693070 5184 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.693090 5184 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.693110 5184 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.693129 5184 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 12 16:54:05 crc kubenswrapper[5184]: 
I0312 16:54:05.693641 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7b4b6f68-67a6-4052-a01f-e8b20bc162bb-v4-0-config-system-service-ca\") pod \"oauth-openshift-db5986f99-jp5mt\" (UID: \"7b4b6f68-67a6-4052-a01f-e8b20bc162bb\") " pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.693823 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b4b6f68-67a6-4052-a01f-e8b20bc162bb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-db5986f99-jp5mt\" (UID: \"7b4b6f68-67a6-4052-a01f-e8b20bc162bb\") " pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.694214 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7b4b6f68-67a6-4052-a01f-e8b20bc162bb-audit-policies\") pod \"oauth-openshift-db5986f99-jp5mt\" (UID: \"7b4b6f68-67a6-4052-a01f-e8b20bc162bb\") " pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.694974 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7b4b6f68-67a6-4052-a01f-e8b20bc162bb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-db5986f99-jp5mt\" (UID: \"7b4b6f68-67a6-4052-a01f-e8b20bc162bb\") " pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.697658 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7b4b6f68-67a6-4052-a01f-e8b20bc162bb-v4-0-config-system-session\") pod \"oauth-openshift-db5986f99-jp5mt\" (UID: 
\"7b4b6f68-67a6-4052-a01f-e8b20bc162bb\") " pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.699438 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7b4b6f68-67a6-4052-a01f-e8b20bc162bb-v4-0-config-user-template-login\") pod \"oauth-openshift-db5986f99-jp5mt\" (UID: \"7b4b6f68-67a6-4052-a01f-e8b20bc162bb\") " pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.699688 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7b4b6f68-67a6-4052-a01f-e8b20bc162bb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-db5986f99-jp5mt\" (UID: \"7b4b6f68-67a6-4052-a01f-e8b20bc162bb\") " pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.700624 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b4b6f68-67a6-4052-a01f-e8b20bc162bb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-db5986f99-jp5mt\" (UID: \"7b4b6f68-67a6-4052-a01f-e8b20bc162bb\") " pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.700815 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7b4b6f68-67a6-4052-a01f-e8b20bc162bb-v4-0-config-user-template-error\") pod \"oauth-openshift-db5986f99-jp5mt\" (UID: \"7b4b6f68-67a6-4052-a01f-e8b20bc162bb\") " pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.702099 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7b4b6f68-67a6-4052-a01f-e8b20bc162bb-v4-0-config-system-router-certs\") pod \"oauth-openshift-db5986f99-jp5mt\" (UID: \"7b4b6f68-67a6-4052-a01f-e8b20bc162bb\") " pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.702412 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7b4b6f68-67a6-4052-a01f-e8b20bc162bb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-db5986f99-jp5mt\" (UID: \"7b4b6f68-67a6-4052-a01f-e8b20bc162bb\") " pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.703545 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7b4b6f68-67a6-4052-a01f-e8b20bc162bb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-db5986f99-jp5mt\" (UID: \"7b4b6f68-67a6-4052-a01f-e8b20bc162bb\") " pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.716204 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmch4\" (UniqueName: \"kubernetes.io/projected/7b4b6f68-67a6-4052-a01f-e8b20bc162bb-kube-api-access-cmch4\") pod \"oauth-openshift-db5986f99-jp5mt\" (UID: \"7b4b6f68-67a6-4052-a01f-e8b20bc162bb\") " pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.825825 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.996828 5184 generic.go:358] "Generic (PLEG): container finished" podID="c0f44cf7-b9cd-4bff-91f1-89b99d8627fc" containerID="8970f0896718cd710c2762d889d2c85baa1fec512eefac4216709518c4a467e3" exitCode=0 Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.997460 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.997790 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" event={"ID":"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc","Type":"ContainerDied","Data":"8970f0896718cd710c2762d889d2c85baa1fec512eefac4216709518c4a467e3"} Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.997853 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66458b6674-qmxgv" event={"ID":"c0f44cf7-b9cd-4bff-91f1-89b99d8627fc","Type":"ContainerDied","Data":"b51b95881e509972968ee6a2ac0ba7b59179d74e6db578c1f976730bcb85b110"} Mar 12 16:54:05 crc kubenswrapper[5184]: I0312 16:54:05.997882 5184 scope.go:117] "RemoveContainer" containerID="8970f0896718cd710c2762d889d2c85baa1fec512eefac4216709518c4a467e3" Mar 12 16:54:06 crc kubenswrapper[5184]: I0312 16:54:06.067685 5184 scope.go:117] "RemoveContainer" containerID="8970f0896718cd710c2762d889d2c85baa1fec512eefac4216709518c4a467e3" Mar 12 16:54:06 crc kubenswrapper[5184]: E0312 16:54:06.070014 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8970f0896718cd710c2762d889d2c85baa1fec512eefac4216709518c4a467e3\": container with ID starting with 8970f0896718cd710c2762d889d2c85baa1fec512eefac4216709518c4a467e3 not found: ID does not exist" 
containerID="8970f0896718cd710c2762d889d2c85baa1fec512eefac4216709518c4a467e3" Mar 12 16:54:06 crc kubenswrapper[5184]: I0312 16:54:06.070061 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8970f0896718cd710c2762d889d2c85baa1fec512eefac4216709518c4a467e3"} err="failed to get container status \"8970f0896718cd710c2762d889d2c85baa1fec512eefac4216709518c4a467e3\": rpc error: code = NotFound desc = could not find container \"8970f0896718cd710c2762d889d2c85baa1fec512eefac4216709518c4a467e3\": container with ID starting with 8970f0896718cd710c2762d889d2c85baa1fec512eefac4216709518c4a467e3 not found: ID does not exist" Mar 12 16:54:06 crc kubenswrapper[5184]: I0312 16:54:06.090318 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-66458b6674-qmxgv"] Mar 12 16:54:06 crc kubenswrapper[5184]: I0312 16:54:06.098387 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-66458b6674-qmxgv"] Mar 12 16:54:06 crc kubenswrapper[5184]: I0312 16:54:06.308363 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-db5986f99-jp5mt"] Mar 12 16:54:06 crc kubenswrapper[5184]: I0312 16:54:06.379365 5184 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2026-04-11 16:49:04 +0000 UTC" deadline="2026-04-07 05:22:12.248501248 +0000 UTC" Mar 12 16:54:06 crc kubenswrapper[5184]: I0312 16:54:06.379416 5184 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="612h28m5.86908873s" Mar 12 16:54:06 crc kubenswrapper[5184]: I0312 16:54:06.382516 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555574-n2v5k" Mar 12 16:54:06 crc kubenswrapper[5184]: I0312 16:54:06.431951 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0f44cf7-b9cd-4bff-91f1-89b99d8627fc" path="/var/lib/kubelet/pods/c0f44cf7-b9cd-4bff-91f1-89b99d8627fc/volumes" Mar 12 16:54:06 crc kubenswrapper[5184]: I0312 16:54:06.506555 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndbn9\" (UniqueName: \"kubernetes.io/projected/6301ddeb-2d36-4015-b622-c1fc9acaeac4-kube-api-access-ndbn9\") pod \"6301ddeb-2d36-4015-b622-c1fc9acaeac4\" (UID: \"6301ddeb-2d36-4015-b622-c1fc9acaeac4\") " Mar 12 16:54:06 crc kubenswrapper[5184]: I0312 16:54:06.518131 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6301ddeb-2d36-4015-b622-c1fc9acaeac4-kube-api-access-ndbn9" (OuterVolumeSpecName: "kube-api-access-ndbn9") pod "6301ddeb-2d36-4015-b622-c1fc9acaeac4" (UID: "6301ddeb-2d36-4015-b622-c1fc9acaeac4"). InnerVolumeSpecName "kube-api-access-ndbn9". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:54:06 crc kubenswrapper[5184]: I0312 16:54:06.608488 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ndbn9\" (UniqueName: \"kubernetes.io/projected/6301ddeb-2d36-4015-b622-c1fc9acaeac4-kube-api-access-ndbn9\") on node \"crc\" DevicePath \"\"" Mar 12 16:54:07 crc kubenswrapper[5184]: I0312 16:54:07.005394 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" event={"ID":"7b4b6f68-67a6-4052-a01f-e8b20bc162bb","Type":"ContainerStarted","Data":"d688e784310da0cd4a348dcbff0b966aa5ab526169959881f171f352898a2976"} Mar 12 16:54:07 crc kubenswrapper[5184]: I0312 16:54:07.005457 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" event={"ID":"7b4b6f68-67a6-4052-a01f-e8b20bc162bb","Type":"ContainerStarted","Data":"6dd2ad6fc50fdc44025f6cab2a33dd9a396c08218edff69a3e85b6e105f2fbdc"} Mar 12 16:54:07 crc kubenswrapper[5184]: I0312 16:54:07.005598 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:07 crc kubenswrapper[5184]: I0312 16:54:07.007474 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555574-n2v5k" Mar 12 16:54:07 crc kubenswrapper[5184]: I0312 16:54:07.007531 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555574-n2v5k" event={"ID":"6301ddeb-2d36-4015-b622-c1fc9acaeac4","Type":"ContainerDied","Data":"fda935ea87dfd9b5526f72e01e3437df5eae36db8ccb99f7a9eb166ed9c157c5"} Mar 12 16:54:07 crc kubenswrapper[5184]: I0312 16:54:07.007561 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fda935ea87dfd9b5526f72e01e3437df5eae36db8ccb99f7a9eb166ed9c157c5" Mar 12 16:54:07 crc kubenswrapper[5184]: I0312 16:54:07.035474 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" podStartSLOduration=28.035451465 podStartE2EDuration="28.035451465s" podCreationTimestamp="2026-03-12 16:53:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:54:07.029003139 +0000 UTC m=+189.570314508" watchObservedRunningTime="2026-03-12 16:54:07.035451465 +0000 UTC m=+189.576762814" Mar 12 16:54:07 crc kubenswrapper[5184]: I0312 16:54:07.306897 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-db5986f99-jp5mt" Mar 12 16:54:16 crc kubenswrapper[5184]: I0312 16:54:16.589243 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7b6c7767d4-rcl8n"] Mar 12 16:54:16 crc kubenswrapper[5184]: I0312 16:54:16.589920 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7b6c7767d4-rcl8n" podUID="8d636cd5-682c-40f9-904c-994939248966" containerName="controller-manager" containerID="cri-o://779779aab04ac5d4e006d42b375ae15fca1cbb6627b9bcc796e514381b90bc6d" gracePeriod=30 Mar 12 
16:54:16 crc kubenswrapper[5184]: I0312 16:54:16.613348 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-786c764d56-lj85w"] Mar 12 16:54:16 crc kubenswrapper[5184]: I0312 16:54:16.613885 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-786c764d56-lj85w" podUID="8c44f2d1-0bc2-4117-aeda-d802809862b0" containerName="route-controller-manager" containerID="cri-o://51fab5c5c6ed2e20063ef9e243981196cd21a6869c9e15c7f70896001f68278a" gracePeriod=30 Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.066210 5184 generic.go:358] "Generic (PLEG): container finished" podID="8d636cd5-682c-40f9-904c-994939248966" containerID="779779aab04ac5d4e006d42b375ae15fca1cbb6627b9bcc796e514381b90bc6d" exitCode=0 Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.066283 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b6c7767d4-rcl8n" event={"ID":"8d636cd5-682c-40f9-904c-994939248966","Type":"ContainerDied","Data":"779779aab04ac5d4e006d42b375ae15fca1cbb6627b9bcc796e514381b90bc6d"} Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.068159 5184 generic.go:358] "Generic (PLEG): container finished" podID="8c44f2d1-0bc2-4117-aeda-d802809862b0" containerID="51fab5c5c6ed2e20063ef9e243981196cd21a6869c9e15c7f70896001f68278a" exitCode=0 Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.068190 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-786c764d56-lj85w" event={"ID":"8c44f2d1-0bc2-4117-aeda-d802809862b0","Type":"ContainerDied","Data":"51fab5c5c6ed2e20063ef9e243981196cd21a6869c9e15c7f70896001f68278a"} Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.101334 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-786c764d56-lj85w" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.129047 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fc79db845-7wnrv"] Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.129718 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6301ddeb-2d36-4015-b622-c1fc9acaeac4" containerName="oc" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.129738 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="6301ddeb-2d36-4015-b622-c1fc9acaeac4" containerName="oc" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.129778 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8c44f2d1-0bc2-4117-aeda-d802809862b0" containerName="route-controller-manager" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.129786 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c44f2d1-0bc2-4117-aeda-d802809862b0" containerName="route-controller-manager" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.129872 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="8c44f2d1-0bc2-4117-aeda-d802809862b0" containerName="route-controller-manager" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.129886 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="6301ddeb-2d36-4015-b622-c1fc9acaeac4" containerName="oc" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.135010 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fc79db845-7wnrv" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.140520 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fc79db845-7wnrv"] Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.170008 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c44f2d1-0bc2-4117-aeda-d802809862b0-serving-cert\") pod \"8c44f2d1-0bc2-4117-aeda-d802809862b0\" (UID: \"8c44f2d1-0bc2-4117-aeda-d802809862b0\") " Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.170198 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8c44f2d1-0bc2-4117-aeda-d802809862b0-client-ca\") pod \"8c44f2d1-0bc2-4117-aeda-d802809862b0\" (UID: \"8c44f2d1-0bc2-4117-aeda-d802809862b0\") " Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.170270 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8mc5\" (UniqueName: \"kubernetes.io/projected/8c44f2d1-0bc2-4117-aeda-d802809862b0-kube-api-access-l8mc5\") pod \"8c44f2d1-0bc2-4117-aeda-d802809862b0\" (UID: \"8c44f2d1-0bc2-4117-aeda-d802809862b0\") " Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.170338 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c44f2d1-0bc2-4117-aeda-d802809862b0-config\") pod \"8c44f2d1-0bc2-4117-aeda-d802809862b0\" (UID: \"8c44f2d1-0bc2-4117-aeda-d802809862b0\") " Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.170491 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8c44f2d1-0bc2-4117-aeda-d802809862b0-tmp\") pod \"8c44f2d1-0bc2-4117-aeda-d802809862b0\" (UID: 
\"8c44f2d1-0bc2-4117-aeda-d802809862b0\") " Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.171295 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c44f2d1-0bc2-4117-aeda-d802809862b0-tmp" (OuterVolumeSpecName: "tmp") pod "8c44f2d1-0bc2-4117-aeda-d802809862b0" (UID: "8c44f2d1-0bc2-4117-aeda-d802809862b0"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.172954 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c44f2d1-0bc2-4117-aeda-d802809862b0-client-ca" (OuterVolumeSpecName: "client-ca") pod "8c44f2d1-0bc2-4117-aeda-d802809862b0" (UID: "8c44f2d1-0bc2-4117-aeda-d802809862b0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.173093 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c44f2d1-0bc2-4117-aeda-d802809862b0-config" (OuterVolumeSpecName: "config") pod "8c44f2d1-0bc2-4117-aeda-d802809862b0" (UID: "8c44f2d1-0bc2-4117-aeda-d802809862b0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.178585 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c44f2d1-0bc2-4117-aeda-d802809862b0-kube-api-access-l8mc5" (OuterVolumeSpecName: "kube-api-access-l8mc5") pod "8c44f2d1-0bc2-4117-aeda-d802809862b0" (UID: "8c44f2d1-0bc2-4117-aeda-d802809862b0"). InnerVolumeSpecName "kube-api-access-l8mc5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.186908 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c44f2d1-0bc2-4117-aeda-d802809862b0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8c44f2d1-0bc2-4117-aeda-d802809862b0" (UID: "8c44f2d1-0bc2-4117-aeda-d802809862b0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.272035 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86ssd\" (UniqueName: \"kubernetes.io/projected/3e546bc5-20de-4fbf-a686-e60ee7bc4417-kube-api-access-86ssd\") pod \"route-controller-manager-7fc79db845-7wnrv\" (UID: \"3e546bc5-20de-4fbf-a686-e60ee7bc4417\") " pod="openshift-route-controller-manager/route-controller-manager-7fc79db845-7wnrv" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.272188 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3e546bc5-20de-4fbf-a686-e60ee7bc4417-tmp\") pod \"route-controller-manager-7fc79db845-7wnrv\" (UID: \"3e546bc5-20de-4fbf-a686-e60ee7bc4417\") " pod="openshift-route-controller-manager/route-controller-manager-7fc79db845-7wnrv" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.272243 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3e546bc5-20de-4fbf-a686-e60ee7bc4417-client-ca\") pod \"route-controller-manager-7fc79db845-7wnrv\" (UID: \"3e546bc5-20de-4fbf-a686-e60ee7bc4417\") " pod="openshift-route-controller-manager/route-controller-manager-7fc79db845-7wnrv" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.272276 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e546bc5-20de-4fbf-a686-e60ee7bc4417-serving-cert\") pod \"route-controller-manager-7fc79db845-7wnrv\" (UID: \"3e546bc5-20de-4fbf-a686-e60ee7bc4417\") " pod="openshift-route-controller-manager/route-controller-manager-7fc79db845-7wnrv" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.272348 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e546bc5-20de-4fbf-a686-e60ee7bc4417-config\") pod \"route-controller-manager-7fc79db845-7wnrv\" (UID: \"3e546bc5-20de-4fbf-a686-e60ee7bc4417\") " pod="openshift-route-controller-manager/route-controller-manager-7fc79db845-7wnrv" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.272459 5184 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c44f2d1-0bc2-4117-aeda-d802809862b0-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.272482 5184 reconciler_common.go:299] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8c44f2d1-0bc2-4117-aeda-d802809862b0-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.272501 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l8mc5\" (UniqueName: \"kubernetes.io/projected/8c44f2d1-0bc2-4117-aeda-d802809862b0-kube-api-access-l8mc5\") on node \"crc\" DevicePath \"\"" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.272518 5184 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c44f2d1-0bc2-4117-aeda-d802809862b0-config\") on node \"crc\" DevicePath \"\"" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.272535 5184 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/8c44f2d1-0bc2-4117-aeda-d802809862b0-tmp\") on node \"crc\" DevicePath \"\"" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.288751 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b6c7767d4-rcl8n" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.312082 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-76db88db48-sd57q"] Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.312732 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d636cd5-682c-40f9-904c-994939248966" containerName="controller-manager" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.312747 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d636cd5-682c-40f9-904c-994939248966" containerName="controller-manager" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.312881 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="8d636cd5-682c-40f9-904c-994939248966" containerName="controller-manager" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.319462 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-76db88db48-sd57q" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.327688 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-76db88db48-sd57q"] Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.373673 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8d636cd5-682c-40f9-904c-994939248966-client-ca\") pod \"8d636cd5-682c-40f9-904c-994939248966\" (UID: \"8d636cd5-682c-40f9-904c-994939248966\") " Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.373721 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8d636cd5-682c-40f9-904c-994939248966-tmp\") pod \"8d636cd5-682c-40f9-904c-994939248966\" (UID: \"8d636cd5-682c-40f9-904c-994939248966\") " Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.373762 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8d636cd5-682c-40f9-904c-994939248966-proxy-ca-bundles\") pod \"8d636cd5-682c-40f9-904c-994939248966\" (UID: \"8d636cd5-682c-40f9-904c-994939248966\") " Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.373874 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d636cd5-682c-40f9-904c-994939248966-config\") pod \"8d636cd5-682c-40f9-904c-994939248966\" (UID: \"8d636cd5-682c-40f9-904c-994939248966\") " Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.373996 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bch97\" (UniqueName: \"kubernetes.io/projected/8d636cd5-682c-40f9-904c-994939248966-kube-api-access-bch97\") pod \"8d636cd5-682c-40f9-904c-994939248966\" (UID: 
\"8d636cd5-682c-40f9-904c-994939248966\") " Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.374024 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d636cd5-682c-40f9-904c-994939248966-serving-cert\") pod \"8d636cd5-682c-40f9-904c-994939248966\" (UID: \"8d636cd5-682c-40f9-904c-994939248966\") " Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.374152 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d9d1636-a1ce-4e90-9c5c-17865caaf0bc-serving-cert\") pod \"controller-manager-76db88db48-sd57q\" (UID: \"8d9d1636-a1ce-4e90-9c5c-17865caaf0bc\") " pod="openshift-controller-manager/controller-manager-76db88db48-sd57q" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.374166 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d636cd5-682c-40f9-904c-994939248966-tmp" (OuterVolumeSpecName: "tmp") pod "8d636cd5-682c-40f9-904c-994939248966" (UID: "8d636cd5-682c-40f9-904c-994939248966"). InnerVolumeSpecName "tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.374185 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j55gm\" (UniqueName: \"kubernetes.io/projected/8d9d1636-a1ce-4e90-9c5c-17865caaf0bc-kube-api-access-j55gm\") pod \"controller-manager-76db88db48-sd57q\" (UID: \"8d9d1636-a1ce-4e90-9c5c-17865caaf0bc\") " pod="openshift-controller-manager/controller-manager-76db88db48-sd57q" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.374238 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8d9d1636-a1ce-4e90-9c5c-17865caaf0bc-proxy-ca-bundles\") pod \"controller-manager-76db88db48-sd57q\" (UID: \"8d9d1636-a1ce-4e90-9c5c-17865caaf0bc\") " pod="openshift-controller-manager/controller-manager-76db88db48-sd57q" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.374268 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8d9d1636-a1ce-4e90-9c5c-17865caaf0bc-tmp\") pod \"controller-manager-76db88db48-sd57q\" (UID: \"8d9d1636-a1ce-4e90-9c5c-17865caaf0bc\") " pod="openshift-controller-manager/controller-manager-76db88db48-sd57q" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.374299 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e546bc5-20de-4fbf-a686-e60ee7bc4417-config\") pod \"route-controller-manager-7fc79db845-7wnrv\" (UID: \"3e546bc5-20de-4fbf-a686-e60ee7bc4417\") " pod="openshift-route-controller-manager/route-controller-manager-7fc79db845-7wnrv" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.374343 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-86ssd\" (UniqueName: 
\"kubernetes.io/projected/3e546bc5-20de-4fbf-a686-e60ee7bc4417-kube-api-access-86ssd\") pod \"route-controller-manager-7fc79db845-7wnrv\" (UID: \"3e546bc5-20de-4fbf-a686-e60ee7bc4417\") " pod="openshift-route-controller-manager/route-controller-manager-7fc79db845-7wnrv" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.374444 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d636cd5-682c-40f9-904c-994939248966-client-ca" (OuterVolumeSpecName: "client-ca") pod "8d636cd5-682c-40f9-904c-994939248966" (UID: "8d636cd5-682c-40f9-904c-994939248966"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.374531 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8d9d1636-a1ce-4e90-9c5c-17865caaf0bc-client-ca\") pod \"controller-manager-76db88db48-sd57q\" (UID: \"8d9d1636-a1ce-4e90-9c5c-17865caaf0bc\") " pod="openshift-controller-manager/controller-manager-76db88db48-sd57q" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.375161 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3e546bc5-20de-4fbf-a686-e60ee7bc4417-tmp\") pod \"route-controller-manager-7fc79db845-7wnrv\" (UID: \"3e546bc5-20de-4fbf-a686-e60ee7bc4417\") " pod="openshift-route-controller-manager/route-controller-manager-7fc79db845-7wnrv" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.375577 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3e546bc5-20de-4fbf-a686-e60ee7bc4417-tmp\") pod \"route-controller-manager-7fc79db845-7wnrv\" (UID: \"3e546bc5-20de-4fbf-a686-e60ee7bc4417\") " pod="openshift-route-controller-manager/route-controller-manager-7fc79db845-7wnrv" Mar 12 16:54:17 crc kubenswrapper[5184]: 
I0312 16:54:17.375800 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3e546bc5-20de-4fbf-a686-e60ee7bc4417-client-ca\") pod \"route-controller-manager-7fc79db845-7wnrv\" (UID: \"3e546bc5-20de-4fbf-a686-e60ee7bc4417\") " pod="openshift-route-controller-manager/route-controller-manager-7fc79db845-7wnrv" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.375831 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e546bc5-20de-4fbf-a686-e60ee7bc4417-serving-cert\") pod \"route-controller-manager-7fc79db845-7wnrv\" (UID: \"3e546bc5-20de-4fbf-a686-e60ee7bc4417\") " pod="openshift-route-controller-manager/route-controller-manager-7fc79db845-7wnrv" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.375867 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d636cd5-682c-40f9-904c-994939248966-config" (OuterVolumeSpecName: "config") pod "8d636cd5-682c-40f9-904c-994939248966" (UID: "8d636cd5-682c-40f9-904c-994939248966"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.375921 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d9d1636-a1ce-4e90-9c5c-17865caaf0bc-config\") pod \"controller-manager-76db88db48-sd57q\" (UID: \"8d9d1636-a1ce-4e90-9c5c-17865caaf0bc\") " pod="openshift-controller-manager/controller-manager-76db88db48-sd57q" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.375955 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e546bc5-20de-4fbf-a686-e60ee7bc4417-config\") pod \"route-controller-manager-7fc79db845-7wnrv\" (UID: \"3e546bc5-20de-4fbf-a686-e60ee7bc4417\") " pod="openshift-route-controller-manager/route-controller-manager-7fc79db845-7wnrv" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.376041 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d636cd5-682c-40f9-904c-994939248966-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "8d636cd5-682c-40f9-904c-994939248966" (UID: "8d636cd5-682c-40f9-904c-994939248966"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.376115 5184 reconciler_common.go:299] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8d636cd5-682c-40f9-904c-994939248966-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.376136 5184 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8d636cd5-682c-40f9-904c-994939248966-tmp\") on node \"crc\" DevicePath \"\"" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.376148 5184 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d636cd5-682c-40f9-904c-994939248966-config\") on node \"crc\" DevicePath \"\"" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.376625 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3e546bc5-20de-4fbf-a686-e60ee7bc4417-client-ca\") pod \"route-controller-manager-7fc79db845-7wnrv\" (UID: \"3e546bc5-20de-4fbf-a686-e60ee7bc4417\") " pod="openshift-route-controller-manager/route-controller-manager-7fc79db845-7wnrv" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.383499 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d636cd5-682c-40f9-904c-994939248966-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8d636cd5-682c-40f9-904c-994939248966" (UID: "8d636cd5-682c-40f9-904c-994939248966"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.383591 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d636cd5-682c-40f9-904c-994939248966-kube-api-access-bch97" (OuterVolumeSpecName: "kube-api-access-bch97") pod "8d636cd5-682c-40f9-904c-994939248966" (UID: "8d636cd5-682c-40f9-904c-994939248966"). InnerVolumeSpecName "kube-api-access-bch97". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.384172 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e546bc5-20de-4fbf-a686-e60ee7bc4417-serving-cert\") pod \"route-controller-manager-7fc79db845-7wnrv\" (UID: \"3e546bc5-20de-4fbf-a686-e60ee7bc4417\") " pod="openshift-route-controller-manager/route-controller-manager-7fc79db845-7wnrv" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.394058 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-86ssd\" (UniqueName: \"kubernetes.io/projected/3e546bc5-20de-4fbf-a686-e60ee7bc4417-kube-api-access-86ssd\") pod \"route-controller-manager-7fc79db845-7wnrv\" (UID: \"3e546bc5-20de-4fbf-a686-e60ee7bc4417\") " pod="openshift-route-controller-manager/route-controller-manager-7fc79db845-7wnrv" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.477022 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8d9d1636-a1ce-4e90-9c5c-17865caaf0bc-client-ca\") pod \"controller-manager-76db88db48-sd57q\" (UID: \"8d9d1636-a1ce-4e90-9c5c-17865caaf0bc\") " pod="openshift-controller-manager/controller-manager-76db88db48-sd57q" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.477179 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8d9d1636-a1ce-4e90-9c5c-17865caaf0bc-config\") pod \"controller-manager-76db88db48-sd57q\" (UID: \"8d9d1636-a1ce-4e90-9c5c-17865caaf0bc\") " pod="openshift-controller-manager/controller-manager-76db88db48-sd57q" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.477248 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d9d1636-a1ce-4e90-9c5c-17865caaf0bc-serving-cert\") pod \"controller-manager-76db88db48-sd57q\" (UID: \"8d9d1636-a1ce-4e90-9c5c-17865caaf0bc\") " pod="openshift-controller-manager/controller-manager-76db88db48-sd57q" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.477312 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j55gm\" (UniqueName: \"kubernetes.io/projected/8d9d1636-a1ce-4e90-9c5c-17865caaf0bc-kube-api-access-j55gm\") pod \"controller-manager-76db88db48-sd57q\" (UID: \"8d9d1636-a1ce-4e90-9c5c-17865caaf0bc\") " pod="openshift-controller-manager/controller-manager-76db88db48-sd57q" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.477362 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8d9d1636-a1ce-4e90-9c5c-17865caaf0bc-proxy-ca-bundles\") pod \"controller-manager-76db88db48-sd57q\" (UID: \"8d9d1636-a1ce-4e90-9c5c-17865caaf0bc\") " pod="openshift-controller-manager/controller-manager-76db88db48-sd57q" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.477444 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8d9d1636-a1ce-4e90-9c5c-17865caaf0bc-tmp\") pod \"controller-manager-76db88db48-sd57q\" (UID: \"8d9d1636-a1ce-4e90-9c5c-17865caaf0bc\") " pod="openshift-controller-manager/controller-manager-76db88db48-sd57q" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.477621 5184 
reconciler_common.go:299] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8d636cd5-682c-40f9-904c-994939248966-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.477643 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bch97\" (UniqueName: \"kubernetes.io/projected/8d636cd5-682c-40f9-904c-994939248966-kube-api-access-bch97\") on node \"crc\" DevicePath \"\"" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.477663 5184 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d636cd5-682c-40f9-904c-994939248966-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.480047 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8d9d1636-a1ce-4e90-9c5c-17865caaf0bc-client-ca\") pod \"controller-manager-76db88db48-sd57q\" (UID: \"8d9d1636-a1ce-4e90-9c5c-17865caaf0bc\") " pod="openshift-controller-manager/controller-manager-76db88db48-sd57q" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.480563 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8d9d1636-a1ce-4e90-9c5c-17865caaf0bc-tmp\") pod \"controller-manager-76db88db48-sd57q\" (UID: \"8d9d1636-a1ce-4e90-9c5c-17865caaf0bc\") " pod="openshift-controller-manager/controller-manager-76db88db48-sd57q" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.481783 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d9d1636-a1ce-4e90-9c5c-17865caaf0bc-config\") pod \"controller-manager-76db88db48-sd57q\" (UID: \"8d9d1636-a1ce-4e90-9c5c-17865caaf0bc\") " pod="openshift-controller-manager/controller-manager-76db88db48-sd57q" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 
16:54:17.483943 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8d9d1636-a1ce-4e90-9c5c-17865caaf0bc-proxy-ca-bundles\") pod \"controller-manager-76db88db48-sd57q\" (UID: \"8d9d1636-a1ce-4e90-9c5c-17865caaf0bc\") " pod="openshift-controller-manager/controller-manager-76db88db48-sd57q" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.486476 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d9d1636-a1ce-4e90-9c5c-17865caaf0bc-serving-cert\") pod \"controller-manager-76db88db48-sd57q\" (UID: \"8d9d1636-a1ce-4e90-9c5c-17865caaf0bc\") " pod="openshift-controller-manager/controller-manager-76db88db48-sd57q" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.497713 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fc79db845-7wnrv" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.512665 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j55gm\" (UniqueName: \"kubernetes.io/projected/8d9d1636-a1ce-4e90-9c5c-17865caaf0bc-kube-api-access-j55gm\") pod \"controller-manager-76db88db48-sd57q\" (UID: \"8d9d1636-a1ce-4e90-9c5c-17865caaf0bc\") " pod="openshift-controller-manager/controller-manager-76db88db48-sd57q" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.643227 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-76db88db48-sd57q" Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.884306 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-76db88db48-sd57q"] Mar 12 16:54:17 crc kubenswrapper[5184]: I0312 16:54:17.993254 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fc79db845-7wnrv"] Mar 12 16:54:17 crc kubenswrapper[5184]: W0312 16:54:17.999488 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e546bc5_20de_4fbf_a686_e60ee7bc4417.slice/crio-feda00ab28dc3e38c0653a956bb439d208a177efaed10103dc33f23186909af1 WatchSource:0}: Error finding container feda00ab28dc3e38c0653a956bb439d208a177efaed10103dc33f23186909af1: Status 404 returned error can't find the container with id feda00ab28dc3e38c0653a956bb439d208a177efaed10103dc33f23186909af1 Mar 12 16:54:18 crc kubenswrapper[5184]: I0312 16:54:18.092528 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76db88db48-sd57q" event={"ID":"8d9d1636-a1ce-4e90-9c5c-17865caaf0bc","Type":"ContainerStarted","Data":"62fb02f24ac9e1c7f6506ee646c2f45f5249a467efa68715a59af3ad3f41deba"} Mar 12 16:54:18 crc kubenswrapper[5184]: I0312 16:54:18.112423 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7b6c7767d4-rcl8n" Mar 12 16:54:18 crc kubenswrapper[5184]: I0312 16:54:18.112448 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b6c7767d4-rcl8n" event={"ID":"8d636cd5-682c-40f9-904c-994939248966","Type":"ContainerDied","Data":"6363eaf9b7ac8a3642b1fee87447c9080bceee5150a469605f9a34f7191a672a"} Mar 12 16:54:18 crc kubenswrapper[5184]: I0312 16:54:18.112561 5184 scope.go:117] "RemoveContainer" containerID="779779aab04ac5d4e006d42b375ae15fca1cbb6627b9bcc796e514381b90bc6d" Mar 12 16:54:18 crc kubenswrapper[5184]: I0312 16:54:18.116821 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-786c764d56-lj85w" event={"ID":"8c44f2d1-0bc2-4117-aeda-d802809862b0","Type":"ContainerDied","Data":"53516226b5cabc44cae548e01933b641d0b9186cb3d00da4ed2d600e9165dfec"} Mar 12 16:54:18 crc kubenswrapper[5184]: I0312 16:54:18.117169 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-786c764d56-lj85w" Mar 12 16:54:18 crc kubenswrapper[5184]: I0312 16:54:18.118869 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fc79db845-7wnrv" event={"ID":"3e546bc5-20de-4fbf-a686-e60ee7bc4417","Type":"ContainerStarted","Data":"feda00ab28dc3e38c0653a956bb439d208a177efaed10103dc33f23186909af1"} Mar 12 16:54:18 crc kubenswrapper[5184]: I0312 16:54:18.154233 5184 scope.go:117] "RemoveContainer" containerID="51fab5c5c6ed2e20063ef9e243981196cd21a6869c9e15c7f70896001f68278a" Mar 12 16:54:18 crc kubenswrapper[5184]: I0312 16:54:18.195931 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7b6c7767d4-rcl8n"] Mar 12 16:54:18 crc kubenswrapper[5184]: I0312 16:54:18.198985 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7b6c7767d4-rcl8n"] Mar 12 16:54:18 crc kubenswrapper[5184]: I0312 16:54:18.209917 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-786c764d56-lj85w"] Mar 12 16:54:18 crc kubenswrapper[5184]: I0312 16:54:18.217452 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-786c764d56-lj85w"] Mar 12 16:54:18 crc kubenswrapper[5184]: I0312 16:54:18.409554 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c44f2d1-0bc2-4117-aeda-d802809862b0" path="/var/lib/kubelet/pods/8c44f2d1-0bc2-4117-aeda-d802809862b0/volumes" Mar 12 16:54:18 crc kubenswrapper[5184]: I0312 16:54:18.410086 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d636cd5-682c-40f9-904c-994939248966" path="/var/lib/kubelet/pods/8d636cd5-682c-40f9-904c-994939248966/volumes" Mar 12 16:54:19 crc kubenswrapper[5184]: I0312 16:54:19.130761 5184 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fc79db845-7wnrv" event={"ID":"3e546bc5-20de-4fbf-a686-e60ee7bc4417","Type":"ContainerStarted","Data":"bf4b1b38790ac48cd140d2c091d4fe87fca9bc852a7bdf262d97318c35386c18"} Mar 12 16:54:19 crc kubenswrapper[5184]: I0312 16:54:19.131140 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-route-controller-manager/route-controller-manager-7fc79db845-7wnrv" Mar 12 16:54:19 crc kubenswrapper[5184]: I0312 16:54:19.133157 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76db88db48-sd57q" event={"ID":"8d9d1636-a1ce-4e90-9c5c-17865caaf0bc","Type":"ContainerStarted","Data":"c25626ae135488e1d36460452318e81f8a96316c61b9554e1e7148348b8557e6"} Mar 12 16:54:19 crc kubenswrapper[5184]: I0312 16:54:19.133498 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-controller-manager/controller-manager-76db88db48-sd57q" Mar 12 16:54:19 crc kubenswrapper[5184]: I0312 16:54:19.136588 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7fc79db845-7wnrv" Mar 12 16:54:19 crc kubenswrapper[5184]: I0312 16:54:19.142579 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-76db88db48-sd57q" Mar 12 16:54:19 crc kubenswrapper[5184]: I0312 16:54:19.150547 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7fc79db845-7wnrv" podStartSLOduration=3.15053051 podStartE2EDuration="3.15053051s" podCreationTimestamp="2026-03-12 16:54:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:54:19.149364962 +0000 UTC 
m=+201.690676311" watchObservedRunningTime="2026-03-12 16:54:19.15053051 +0000 UTC m=+201.691841849" Mar 12 16:54:19 crc kubenswrapper[5184]: I0312 16:54:19.209025 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-76db88db48-sd57q" podStartSLOduration=3.209009419 podStartE2EDuration="3.209009419s" podCreationTimestamp="2026-03-12 16:54:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:54:19.195746617 +0000 UTC m=+201.737058016" watchObservedRunningTime="2026-03-12 16:54:19.209009419 +0000 UTC m=+201.750320758" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.047305 5184 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.061286 5184 kubelet.go:2547] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.061365 5184 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.061390 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.062196 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://ba6beb26fb249f80a1c0e7a6faa3577c82429ab6acd0c17cd141a795b06adba4" gracePeriod=15 Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.062215 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://965856e85472800697a7882409776407f3dcefaafd9ffc6d31ca6d51466d15f5" gracePeriod=15 Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.062269 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.062293 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.062305 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="setup" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.062312 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="setup" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.062333 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.062341 5184 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.062351 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-cert-regeneration-controller" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.062359 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-cert-regeneration-controller" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.062389 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.062399 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.062400 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" containerID="cri-o://7b990689319f7a0b7d657ca5213acc0b3d51bd26e692405cb3c2cf8e3f4de90c" gracePeriod=15 Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.062424 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.062434 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.062447 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a14caf222afb62aaabdc47808b6f944" 
containerName="kube-apiserver-cert-syncer" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.062455 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-cert-syncer" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.062408 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-cert-syncer" containerID="cri-o://c3a3261bd304f727996d3de0ec8e9372c0f24ee323171fc078f86a529dc3ae51" gracePeriod=15 Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.062467 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-insecure-readyz" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.062554 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-insecure-readyz" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.062682 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-insecure-readyz" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.062695 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-cert-syncer" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.062709 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.062719 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.062728 5184 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-cert-regeneration-controller" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.062737 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.062746 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.062851 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.062860 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.062876 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.062886 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.063009 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.063027 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.062520 5184 kuberuntime_container.go:858] "Killing container with a grace period" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver" containerID="cri-o://408cf1afe10e6c8bd0bdbe1cc632606b92ab152449ba7113c76692e36ac3f8e5" gracePeriod=15 Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.065671 5184 status_manager.go:905] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="3a14caf222afb62aaabdc47808b6f944" podUID="57755cc5f99000cc11e193051474d4e2" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.087425 5184 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.114714 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.168304 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-bundle-dir\" (UniqueName: \"kubernetes.io/empty-dir/57755cc5f99000cc11e193051474d4e2-ca-bundle-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.168364 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/57755cc5f99000cc11e193051474d4e2-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.168466 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.168494 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/57755cc5f99000cc11e193051474d4e2-tmp-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.168536 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.168558 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/57755cc5f99000cc11e193051474d4e2-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.168579 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.168606 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-pod-resource-dir\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.168637 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/57755cc5f99000cc11e193051474d4e2-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.168668 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.270858 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-bundle-dir\" (UniqueName: \"kubernetes.io/empty-dir/57755cc5f99000cc11e193051474d4e2-ca-bundle-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.270927 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/57755cc5f99000cc11e193051474d4e2-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.270964 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.271002 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/57755cc5f99000cc11e193051474d4e2-tmp-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.271130 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.271149 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/57755cc5f99000cc11e193051474d4e2-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.271167 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.271189 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.271210 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/57755cc5f99000cc11e193051474d4e2-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.271233 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.271474 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.271803 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.271889 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.271928 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/57755cc5f99000cc11e193051474d4e2-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.271951 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/57755cc5f99000cc11e193051474d4e2-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.272005 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.272007 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.272098 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/57755cc5f99000cc11e193051474d4e2-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 
16:54:23.272257 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-bundle-dir\" (UniqueName: \"kubernetes.io/empty-dir/57755cc5f99000cc11e193051474d4e2-ca-bundle-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.272413 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/57755cc5f99000cc11e193051474d4e2-tmp-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:54:23 crc kubenswrapper[5184]: I0312 16:54:23.411689 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 16:54:23 crc kubenswrapper[5184]: E0312 16:54:23.431856 5184 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.223:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189c264c88bbebcf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f7dbc7e1ee9c187a863ef9b473fad27b,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:54:23.430519759 +0000 UTC m=+205.971831098,LastTimestamp:2026-03-12 16:54:23.430519759 +0000 UTC m=+205.971831098,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 
+0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 16:54:24 crc kubenswrapper[5184]: I0312 16:54:24.165641 5184 generic.go:358] "Generic (PLEG): container finished" podID="05d9779c-971f-40a0-83e9-b21a6e9e9d2a" containerID="0a586e444b7d60924e5b5e2194d19edeeb09e71674e1dd9839867838b661cca0" exitCode=0
Mar 12 16:54:24 crc kubenswrapper[5184]: I0312 16:54:24.165737 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-12-crc" event={"ID":"05d9779c-971f-40a0-83e9-b21a6e9e9d2a","Type":"ContainerDied","Data":"0a586e444b7d60924e5b5e2194d19edeeb09e71674e1dd9839867838b661cca0"}
Mar 12 16:54:24 crc kubenswrapper[5184]: I0312 16:54:24.166891 5184 status_manager.go:895] "Failed to get status for pod" podUID="05d9779c-971f-40a0-83e9-b21a6e9e9d2a" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 12 16:54:24 crc kubenswrapper[5184]: I0312 16:54:24.167136 5184 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 12 16:54:24 crc kubenswrapper[5184]: I0312 16:54:24.169900 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/3.log"
Mar 12 16:54:24 crc kubenswrapper[5184]: I0312 16:54:24.171714 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-cert-syncer/0.log"
Mar 12 16:54:24 crc kubenswrapper[5184]: I0312 16:54:24.172843 5184 generic.go:358] "Generic (PLEG): container finished" podID="3a14caf222afb62aaabdc47808b6f944" containerID="7b990689319f7a0b7d657ca5213acc0b3d51bd26e692405cb3c2cf8e3f4de90c" exitCode=0
Mar 12 16:54:24 crc kubenswrapper[5184]: I0312 16:54:24.172865 5184 generic.go:358] "Generic (PLEG): container finished" podID="3a14caf222afb62aaabdc47808b6f944" containerID="965856e85472800697a7882409776407f3dcefaafd9ffc6d31ca6d51466d15f5" exitCode=0
Mar 12 16:54:24 crc kubenswrapper[5184]: I0312 16:54:24.172876 5184 generic.go:358] "Generic (PLEG): container finished" podID="3a14caf222afb62aaabdc47808b6f944" containerID="ba6beb26fb249f80a1c0e7a6faa3577c82429ab6acd0c17cd141a795b06adba4" exitCode=0
Mar 12 16:54:24 crc kubenswrapper[5184]: I0312 16:54:24.172885 5184 generic.go:358] "Generic (PLEG): container finished" podID="3a14caf222afb62aaabdc47808b6f944" containerID="c3a3261bd304f727996d3de0ec8e9372c0f24ee323171fc078f86a529dc3ae51" exitCode=2
Mar 12 16:54:24 crc kubenswrapper[5184]: I0312 16:54:24.173124 5184 scope.go:117] "RemoveContainer" containerID="c5c91e816d332f9146ddc05817c56c9d67c53b64a57f352bf2e9af1b2fdb1ba4"
Mar 12 16:54:24 crc kubenswrapper[5184]: I0312 16:54:24.175594 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f7dbc7e1ee9c187a863ef9b473fad27b","Type":"ContainerStarted","Data":"f39418c66ff49f4f10718eab746afa4889f8ba40b32e6645da7c24e3c75c536e"}
Mar 12 16:54:24 crc kubenswrapper[5184]: I0312 16:54:24.175624 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f7dbc7e1ee9c187a863ef9b473fad27b","Type":"ContainerStarted","Data":"dcf7c3280a50a31b36a14d6bf5b1d6a6cc19fafdf19f95fcc9498818868fb13f"}
Mar 12 16:54:24 crc kubenswrapper[5184]: I0312 16:54:24.176416 5184 status_manager.go:895] "Failed to get status for pod" podUID="05d9779c-971f-40a0-83e9-b21a6e9e9d2a" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 12 16:54:24 crc kubenswrapper[5184]: I0312 16:54:24.176797 5184 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 12 16:54:25 crc kubenswrapper[5184]: I0312 16:54:25.208445 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-cert-syncer/0.log"
Mar 12 16:54:25 crc kubenswrapper[5184]: I0312 16:54:25.499569 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-cert-syncer/0.log"
Mar 12 16:54:25 crc kubenswrapper[5184]: I0312 16:54:25.500484 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 12 16:54:25 crc kubenswrapper[5184]: I0312 16:54:25.501124 5184 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-apiserver/installer-12-crc"
Mar 12 16:54:25 crc kubenswrapper[5184]: I0312 16:54:25.501324 5184 status_manager.go:895] "Failed to get status for pod" podUID="3a14caf222afb62aaabdc47808b6f944" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 12 16:54:25 crc kubenswrapper[5184]: I0312 16:54:25.501778 5184 status_manager.go:895] "Failed to get status for pod" podUID="05d9779c-971f-40a0-83e9-b21a6e9e9d2a" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 12 16:54:25 crc kubenswrapper[5184]: I0312 16:54:25.502169 5184 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 12 16:54:25 crc kubenswrapper[5184]: I0312 16:54:25.502766 5184 status_manager.go:895] "Failed to get status for pod" podUID="3a14caf222afb62aaabdc47808b6f944" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 12 16:54:25 crc kubenswrapper[5184]: I0312 16:54:25.503044 5184 status_manager.go:895] "Failed to get status for pod" podUID="05d9779c-971f-40a0-83e9-b21a6e9e9d2a" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 12 16:54:25 crc kubenswrapper[5184]: I0312 16:54:25.503517 5184 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 12 16:54:25 crc kubenswrapper[5184]: I0312 16:54:25.604725 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-audit-dir\") pod \"3a14caf222afb62aaabdc47808b6f944\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") "
Mar 12 16:54:25 crc kubenswrapper[5184]: I0312 16:54:25.604779 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-bundle-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-ca-bundle-dir\") pod \"3a14caf222afb62aaabdc47808b6f944\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") "
Mar 12 16:54:25 crc kubenswrapper[5184]: I0312 16:54:25.604821 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/05d9779c-971f-40a0-83e9-b21a6e9e9d2a-kube-api-access\") pod \"05d9779c-971f-40a0-83e9-b21a6e9e9d2a\" (UID: \"05d9779c-971f-40a0-83e9-b21a6e9e9d2a\") "
Mar 12 16:54:25 crc kubenswrapper[5184]: I0312 16:54:25.604871 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-cert-dir\") pod \"3a14caf222afb62aaabdc47808b6f944\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") "
Mar 12 16:54:25 crc kubenswrapper[5184]: I0312 16:54:25.604894 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/05d9779c-971f-40a0-83e9-b21a6e9e9d2a-var-lock\") pod \"05d9779c-971f-40a0-83e9-b21a6e9e9d2a\" (UID: \"05d9779c-971f-40a0-83e9-b21a6e9e9d2a\") "
Mar 12 16:54:25 crc kubenswrapper[5184]: I0312 16:54:25.604892 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "3a14caf222afb62aaabdc47808b6f944" (UID: "3a14caf222afb62aaabdc47808b6f944"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 12 16:54:25 crc kubenswrapper[5184]: I0312 16:54:25.604945 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-tmp-dir\") pod \"3a14caf222afb62aaabdc47808b6f944\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") "
Mar 12 16:54:25 crc kubenswrapper[5184]: I0312 16:54:25.604981 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/05d9779c-971f-40a0-83e9-b21a6e9e9d2a-var-lock" (OuterVolumeSpecName: "var-lock") pod "05d9779c-971f-40a0-83e9-b21a6e9e9d2a" (UID: "05d9779c-971f-40a0-83e9-b21a6e9e9d2a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 12 16:54:25 crc kubenswrapper[5184]: I0312 16:54:25.604985 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "3a14caf222afb62aaabdc47808b6f944" (UID: "3a14caf222afb62aaabdc47808b6f944"). InnerVolumeSpecName "cert-dir".
PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 12 16:54:25 crc kubenswrapper[5184]: I0312 16:54:25.605441 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-resource-dir\") pod \"3a14caf222afb62aaabdc47808b6f944\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") "
Mar 12 16:54:25 crc kubenswrapper[5184]: I0312 16:54:25.605532 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "3a14caf222afb62aaabdc47808b6f944" (UID: "3a14caf222afb62aaabdc47808b6f944"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 12 16:54:25 crc kubenswrapper[5184]: I0312 16:54:25.605565 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-ca-bundle-dir" (OuterVolumeSpecName: "ca-bundle-dir") pod "3a14caf222afb62aaabdc47808b6f944" (UID: "3a14caf222afb62aaabdc47808b6f944"). InnerVolumeSpecName "ca-bundle-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 12 16:54:25 crc kubenswrapper[5184]: I0312 16:54:25.605686 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/05d9779c-971f-40a0-83e9-b21a6e9e9d2a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "05d9779c-971f-40a0-83e9-b21a6e9e9d2a" (UID: "05d9779c-971f-40a0-83e9-b21a6e9e9d2a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 12 16:54:25 crc kubenswrapper[5184]: I0312 16:54:25.605763 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/05d9779c-971f-40a0-83e9-b21a6e9e9d2a-kubelet-dir\") pod \"05d9779c-971f-40a0-83e9-b21a6e9e9d2a\" (UID: \"05d9779c-971f-40a0-83e9-b21a6e9e9d2a\") "
Mar 12 16:54:25 crc kubenswrapper[5184]: I0312 16:54:25.606670 5184 reconciler_common.go:299] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-cert-dir\") on node \"crc\" DevicePath \"\""
Mar 12 16:54:25 crc kubenswrapper[5184]: I0312 16:54:25.606705 5184 reconciler_common.go:299] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/05d9779c-971f-40a0-83e9-b21a6e9e9d2a-var-lock\") on node \"crc\" DevicePath \"\""
Mar 12 16:54:25 crc kubenswrapper[5184]: I0312 16:54:25.606722 5184 reconciler_common.go:299] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-resource-dir\") on node \"crc\" DevicePath \"\""
Mar 12 16:54:25 crc kubenswrapper[5184]: I0312 16:54:25.606737 5184 reconciler_common.go:299] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/05d9779c-971f-40a0-83e9-b21a6e9e9d2a-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 12 16:54:25 crc kubenswrapper[5184]: I0312 16:54:25.606751 5184 reconciler_common.go:299] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-audit-dir\") on node \"crc\" DevicePath \"\""
Mar 12 16:54:25 crc kubenswrapper[5184]: I0312 16:54:25.606766 5184 reconciler_common.go:299] "Volume detached for volume \"ca-bundle-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-ca-bundle-dir\") on node \"crc\" DevicePath \"\""
Mar 12 16:54:25 crc kubenswrapper[5184]: I0312 16:54:25.608747 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "3a14caf222afb62aaabdc47808b6f944" (UID: "3a14caf222afb62aaabdc47808b6f944"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 12 16:54:25 crc kubenswrapper[5184]: I0312 16:54:25.611366 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05d9779c-971f-40a0-83e9-b21a6e9e9d2a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "05d9779c-971f-40a0-83e9-b21a6e9e9d2a" (UID: "05d9779c-971f-40a0-83e9-b21a6e9e9d2a"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 16:54:25 crc kubenswrapper[5184]: I0312 16:54:25.708428 5184 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-tmp-dir\") on node \"crc\" DevicePath \"\""
Mar 12 16:54:25 crc kubenswrapper[5184]: I0312 16:54:25.708482 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/05d9779c-971f-40a0-83e9-b21a6e9e9d2a-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 12 16:54:25 crc kubenswrapper[5184]: E0312 16:54:25.852945 5184 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 12 16:54:25 crc kubenswrapper[5184]: E0312 16:54:25.853600 5184 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 12 16:54:25 crc kubenswrapper[5184]: E0312 16:54:25.854256 5184
controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 12 16:54:25 crc kubenswrapper[5184]: E0312 16:54:25.854575 5184 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 12 16:54:25 crc kubenswrapper[5184]: E0312 16:54:25.854840 5184 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 12 16:54:25 crc kubenswrapper[5184]: I0312 16:54:25.854880 5184 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Mar 12 16:54:25 crc kubenswrapper[5184]: E0312 16:54:25.855257 5184 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="200ms"
Mar 12 16:54:26 crc kubenswrapper[5184]: E0312 16:54:26.055850 5184 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="400ms"
Mar 12 16:54:26 crc kubenswrapper[5184]: I0312 16:54:26.218766 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-cert-syncer/0.log"
Mar 12 16:54:26 crc kubenswrapper[5184]: I0312 16:54:26.220645 5184 generic.go:358] "Generic (PLEG): container finished" podID="3a14caf222afb62aaabdc47808b6f944" containerID="408cf1afe10e6c8bd0bdbe1cc632606b92ab152449ba7113c76692e36ac3f8e5" exitCode=0
Mar 12 16:54:26 crc kubenswrapper[5184]: I0312 16:54:26.220696 5184 scope.go:117] "RemoveContainer" containerID="7b990689319f7a0b7d657ca5213acc0b3d51bd26e692405cb3c2cf8e3f4de90c"
Mar 12 16:54:26 crc kubenswrapper[5184]: I0312 16:54:26.220850 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 12 16:54:26 crc kubenswrapper[5184]: I0312 16:54:26.225402 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-12-crc"
Mar 12 16:54:26 crc kubenswrapper[5184]: I0312 16:54:26.225359 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-12-crc" event={"ID":"05d9779c-971f-40a0-83e9-b21a6e9e9d2a","Type":"ContainerDied","Data":"4e37ea05dcad2152b233f30eec956e9d2f6aa07bbc38b196bc705c942d85fbd0"}
Mar 12 16:54:26 crc kubenswrapper[5184]: I0312 16:54:26.226536 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e37ea05dcad2152b233f30eec956e9d2f6aa07bbc38b196bc705c942d85fbd0"
Mar 12 16:54:26 crc kubenswrapper[5184]: I0312 16:54:26.238916 5184 scope.go:117] "RemoveContainer" containerID="965856e85472800697a7882409776407f3dcefaafd9ffc6d31ca6d51466d15f5"
Mar 12 16:54:26 crc kubenswrapper[5184]: I0312 16:54:26.241144 5184 status_manager.go:895] "Failed to get status for pod" podUID="05d9779c-971f-40a0-83e9-b21a6e9e9d2a" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 12 16:54:26 crc kubenswrapper[5184]: I0312 16:54:26.241877 5184 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 12 16:54:26 crc kubenswrapper[5184]: I0312 16:54:26.242495 5184 status_manager.go:895] "Failed to get status for pod" podUID="3a14caf222afb62aaabdc47808b6f944" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 12 16:54:26 crc kubenswrapper[5184]: I0312 16:54:26.253547 5184 status_manager.go:895] "Failed to get status for pod" podUID="05d9779c-971f-40a0-83e9-b21a6e9e9d2a" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 12 16:54:26 crc kubenswrapper[5184]: I0312 16:54:26.253968 5184 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 12 16:54:26 crc kubenswrapper[5184]: I0312 16:54:26.254274 5184 status_manager.go:895] "Failed to get status for pod" podUID="3a14caf222afb62aaabdc47808b6f944" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 12 16:54:26 crc kubenswrapper[5184]: I0312 16:54:26.256818 5184 scope.go:117] "RemoveContainer" containerID="ba6beb26fb249f80a1c0e7a6faa3577c82429ab6acd0c17cd141a795b06adba4"
Mar 12 16:54:26 crc kubenswrapper[5184]: I0312 16:54:26.273935 5184 scope.go:117] "RemoveContainer" containerID="c3a3261bd304f727996d3de0ec8e9372c0f24ee323171fc078f86a529dc3ae51"
Mar 12 16:54:26 crc kubenswrapper[5184]: I0312 16:54:26.291781 5184 scope.go:117] "RemoveContainer" containerID="408cf1afe10e6c8bd0bdbe1cc632606b92ab152449ba7113c76692e36ac3f8e5"
Mar 12 16:54:26 crc kubenswrapper[5184]: I0312 16:54:26.309573 5184 scope.go:117] "RemoveContainer" containerID="85ac5f92560a2e60d997c4973bd2fd54060e553b853a3288c29cc31c11cad328"
Mar 12 16:54:26 crc kubenswrapper[5184]: I0312 16:54:26.366578 5184 scope.go:117] "RemoveContainer" containerID="7b990689319f7a0b7d657ca5213acc0b3d51bd26e692405cb3c2cf8e3f4de90c"
Mar 12 16:54:26 crc kubenswrapper[5184]: E0312 16:54:26.367439 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b990689319f7a0b7d657ca5213acc0b3d51bd26e692405cb3c2cf8e3f4de90c\": container with ID starting with 7b990689319f7a0b7d657ca5213acc0b3d51bd26e692405cb3c2cf8e3f4de90c not found: ID does not exist" containerID="7b990689319f7a0b7d657ca5213acc0b3d51bd26e692405cb3c2cf8e3f4de90c"
Mar 12 16:54:26 crc kubenswrapper[5184]: I0312 16:54:26.367488 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b990689319f7a0b7d657ca5213acc0b3d51bd26e692405cb3c2cf8e3f4de90c"} err="failed to get container status \"7b990689319f7a0b7d657ca5213acc0b3d51bd26e692405cb3c2cf8e3f4de90c\": rpc error: code = NotFound desc = could not find container \"7b990689319f7a0b7d657ca5213acc0b3d51bd26e692405cb3c2cf8e3f4de90c\": container with ID starting with 7b990689319f7a0b7d657ca5213acc0b3d51bd26e692405cb3c2cf8e3f4de90c not found: ID does not exist"
Mar 12 16:54:26 crc kubenswrapper[5184]: I0312 16:54:26.367518 5184 scope.go:117] "RemoveContainer"
containerID="965856e85472800697a7882409776407f3dcefaafd9ffc6d31ca6d51466d15f5"
Mar 12 16:54:26 crc kubenswrapper[5184]: E0312 16:54:26.368061 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"965856e85472800697a7882409776407f3dcefaafd9ffc6d31ca6d51466d15f5\": container with ID starting with 965856e85472800697a7882409776407f3dcefaafd9ffc6d31ca6d51466d15f5 not found: ID does not exist" containerID="965856e85472800697a7882409776407f3dcefaafd9ffc6d31ca6d51466d15f5"
Mar 12 16:54:26 crc kubenswrapper[5184]: I0312 16:54:26.368117 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"965856e85472800697a7882409776407f3dcefaafd9ffc6d31ca6d51466d15f5"} err="failed to get container status \"965856e85472800697a7882409776407f3dcefaafd9ffc6d31ca6d51466d15f5\": rpc error: code = NotFound desc = could not find container \"965856e85472800697a7882409776407f3dcefaafd9ffc6d31ca6d51466d15f5\": container with ID starting with 965856e85472800697a7882409776407f3dcefaafd9ffc6d31ca6d51466d15f5 not found: ID does not exist"
Mar 12 16:54:26 crc kubenswrapper[5184]: I0312 16:54:26.368154 5184 scope.go:117] "RemoveContainer" containerID="ba6beb26fb249f80a1c0e7a6faa3577c82429ab6acd0c17cd141a795b06adba4"
Mar 12 16:54:26 crc kubenswrapper[5184]: E0312 16:54:26.368696 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba6beb26fb249f80a1c0e7a6faa3577c82429ab6acd0c17cd141a795b06adba4\": container with ID starting with ba6beb26fb249f80a1c0e7a6faa3577c82429ab6acd0c17cd141a795b06adba4 not found: ID does not exist" containerID="ba6beb26fb249f80a1c0e7a6faa3577c82429ab6acd0c17cd141a795b06adba4"
Mar 12 16:54:26 crc kubenswrapper[5184]: I0312 16:54:26.368769 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba6beb26fb249f80a1c0e7a6faa3577c82429ab6acd0c17cd141a795b06adba4"} err="failed to get container status \"ba6beb26fb249f80a1c0e7a6faa3577c82429ab6acd0c17cd141a795b06adba4\": rpc error: code = NotFound desc = could not find container \"ba6beb26fb249f80a1c0e7a6faa3577c82429ab6acd0c17cd141a795b06adba4\": container with ID starting with ba6beb26fb249f80a1c0e7a6faa3577c82429ab6acd0c17cd141a795b06adba4 not found: ID does not exist"
Mar 12 16:54:26 crc kubenswrapper[5184]: I0312 16:54:26.368806 5184 scope.go:117] "RemoveContainer" containerID="c3a3261bd304f727996d3de0ec8e9372c0f24ee323171fc078f86a529dc3ae51"
Mar 12 16:54:26 crc kubenswrapper[5184]: E0312 16:54:26.369616 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3a3261bd304f727996d3de0ec8e9372c0f24ee323171fc078f86a529dc3ae51\": container with ID starting with c3a3261bd304f727996d3de0ec8e9372c0f24ee323171fc078f86a529dc3ae51 not found: ID does not exist" containerID="c3a3261bd304f727996d3de0ec8e9372c0f24ee323171fc078f86a529dc3ae51"
Mar 12 16:54:26 crc kubenswrapper[5184]: I0312 16:54:26.369662 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3a3261bd304f727996d3de0ec8e9372c0f24ee323171fc078f86a529dc3ae51"} err="failed to get container status \"c3a3261bd304f727996d3de0ec8e9372c0f24ee323171fc078f86a529dc3ae51\": rpc error: code = NotFound desc = could not find container \"c3a3261bd304f727996d3de0ec8e9372c0f24ee323171fc078f86a529dc3ae51\": container with ID starting with c3a3261bd304f727996d3de0ec8e9372c0f24ee323171fc078f86a529dc3ae51 not found: ID does not exist"
Mar 12 16:54:26 crc kubenswrapper[5184]: I0312 16:54:26.369689 5184 scope.go:117] "RemoveContainer" containerID="408cf1afe10e6c8bd0bdbe1cc632606b92ab152449ba7113c76692e36ac3f8e5"
Mar 12 16:54:26 crc kubenswrapper[5184]: E0312 16:54:26.370007 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"408cf1afe10e6c8bd0bdbe1cc632606b92ab152449ba7113c76692e36ac3f8e5\": container with ID starting with 408cf1afe10e6c8bd0bdbe1cc632606b92ab152449ba7113c76692e36ac3f8e5 not found: ID does not exist" containerID="408cf1afe10e6c8bd0bdbe1cc632606b92ab152449ba7113c76692e36ac3f8e5"
Mar 12 16:54:26 crc kubenswrapper[5184]: I0312 16:54:26.370063 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"408cf1afe10e6c8bd0bdbe1cc632606b92ab152449ba7113c76692e36ac3f8e5"} err="failed to get container status \"408cf1afe10e6c8bd0bdbe1cc632606b92ab152449ba7113c76692e36ac3f8e5\": rpc error: code = NotFound desc = could not find container \"408cf1afe10e6c8bd0bdbe1cc632606b92ab152449ba7113c76692e36ac3f8e5\": container with ID starting with 408cf1afe10e6c8bd0bdbe1cc632606b92ab152449ba7113c76692e36ac3f8e5 not found: ID does not exist"
Mar 12 16:54:26 crc kubenswrapper[5184]: I0312 16:54:26.370095 5184 scope.go:117] "RemoveContainer" containerID="85ac5f92560a2e60d997c4973bd2fd54060e553b853a3288c29cc31c11cad328"
Mar 12 16:54:26 crc kubenswrapper[5184]: E0312 16:54:26.370413 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85ac5f92560a2e60d997c4973bd2fd54060e553b853a3288c29cc31c11cad328\": container with ID starting with 85ac5f92560a2e60d997c4973bd2fd54060e553b853a3288c29cc31c11cad328 not found: ID does not exist" containerID="85ac5f92560a2e60d997c4973bd2fd54060e553b853a3288c29cc31c11cad328"
Mar 12 16:54:26 crc kubenswrapper[5184]: I0312 16:54:26.370462 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85ac5f92560a2e60d997c4973bd2fd54060e553b853a3288c29cc31c11cad328"} err="failed to get container status \"85ac5f92560a2e60d997c4973bd2fd54060e553b853a3288c29cc31c11cad328\": rpc error: code = NotFound desc = could not find container \"85ac5f92560a2e60d997c4973bd2fd54060e553b853a3288c29cc31c11cad328\": container with ID starting with 85ac5f92560a2e60d997c4973bd2fd54060e553b853a3288c29cc31c11cad328 not found: ID does not exist"
Mar 12 16:54:26 crc kubenswrapper[5184]: I0312 16:54:26.410404 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a14caf222afb62aaabdc47808b6f944" path="/var/lib/kubelet/pods/3a14caf222afb62aaabdc47808b6f944/volumes"
Mar 12 16:54:26 crc kubenswrapper[5184]: E0312 16:54:26.456351 5184 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="800ms"
Mar 12 16:54:27 crc kubenswrapper[5184]: E0312 16:54:27.257591 5184 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="1.6s"
Mar 12 16:54:28 crc kubenswrapper[5184]: I0312 16:54:28.409973 5184 status_manager.go:895] "Failed to get status for pod" podUID="05d9779c-971f-40a0-83e9-b21a6e9e9d2a" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 12 16:54:28 crc kubenswrapper[5184]: I0312 16:54:28.410276 5184 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 12 16:54:28 crc kubenswrapper[5184]: E0312 16:54:28.859461 5184 controller.go:145] "Failed
to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="3.2s"
Mar 12 16:54:31 crc kubenswrapper[5184]: E0312 16:54:31.193119 5184 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:54:31Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:54:31Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:54:31Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T16:54:31Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:41d99f0ef9ee39b33679e17ab323d7e7f1b14a465452ff2badb6ef7814cc3492\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:d291455d87733a45c51289de77da5c47b15b76843f86b75a3bce0f0ce900562b\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1727138677},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:552b27c25cf7f044e7bce229df4f8408dab691c645704995dfc37753549870b7\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:ef19db7edce99ca19734eccfda7a2d04e6c33629113e89dc92063378df1d81da\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1277294194},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:200b94f6d1e4ee68e5160bd2c8618d5a81cda6fa7d3755df0326e60becf27b0b\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:9e34025adeb9cbf517918257430af85a4cb00b24540bdcf0dc1895bf917900b4\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1265618025},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:2b25b4ab3e224e729bcb897a9d8b4500cb8cdf41dc4e39241fca36503dd7a6e6\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:7010a1d34012ae242b0950c830b00b3a9907b1dc17951db92c5e0d4a06d6d3a1\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1183656546},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4efb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"registry.redhat.io/ope
nshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes\\\":584351326},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc4
73d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e83fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"
names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 12 16:54:31 crc kubenswrapper[5184]: E0312 16:54:31.194265 5184 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 12 16:54:31 crc kubenswrapper[5184]: E0312 16:54:31.194927 5184 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 12 16:54:31 crc kubenswrapper[5184]: E0312 16:54:31.195329 5184 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 12 16:54:31 crc kubenswrapper[5184]: E0312 16:54:31.195773 5184 
kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 12 16:54:31 crc kubenswrapper[5184]: E0312 16:54:31.195814 5184 kubelet_node_status.go:584] "Unable to update node status" err="update node status exceeds retry count" Mar 12 16:54:31 crc kubenswrapper[5184]: E0312 16:54:31.300645 5184 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.223:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189c264c88bbebcf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f7dbc7e1ee9c187a863ef9b473fad27b,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 16:54:23.430519759 +0000 UTC m=+205.971831098,LastTimestamp:2026-03-12 16:54:23.430519759 +0000 UTC m=+205.971831098,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 16:54:32 crc kubenswrapper[5184]: E0312 16:54:32.061453 5184 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="6.4s" Mar 12 16:54:34 crc kubenswrapper[5184]: I0312 
16:54:34.348931 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:54:34 crc kubenswrapper[5184]: I0312 16:54:34.360206 5184 status_manager.go:895] "Failed to get status for pod" podUID="05d9779c-971f-40a0-83e9-b21a6e9e9d2a" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 12 16:54:34 crc kubenswrapper[5184]: I0312 16:54:34.360938 5184 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 12 16:54:34 crc kubenswrapper[5184]: I0312 16:54:34.381351 5184 kubelet.go:3323] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="766f2ece-d155-473b-bc1e-ceca5d270675" Mar 12 16:54:34 crc kubenswrapper[5184]: I0312 16:54:34.381415 5184 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="766f2ece-d155-473b-bc1e-ceca5d270675" Mar 12 16:54:34 crc kubenswrapper[5184]: E0312 16:54:34.382032 5184 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:54:34 crc kubenswrapper[5184]: I0312 16:54:34.382354 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:54:35 crc kubenswrapper[5184]: I0312 16:54:35.373757 5184 generic.go:358] "Generic (PLEG): container finished" podID="57755cc5f99000cc11e193051474d4e2" containerID="938480d21b2cadfdad786a49acb0f4eb31594a4a6e4af41601d71dc45af154da" exitCode=0 Mar 12 16:54:35 crc kubenswrapper[5184]: I0312 16:54:35.374623 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"57755cc5f99000cc11e193051474d4e2","Type":"ContainerDied","Data":"938480d21b2cadfdad786a49acb0f4eb31594a4a6e4af41601d71dc45af154da"} Mar 12 16:54:35 crc kubenswrapper[5184]: I0312 16:54:35.374681 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"57755cc5f99000cc11e193051474d4e2","Type":"ContainerStarted","Data":"17fac5ce841d8d56a8a2480474ab2063b3894ebbff1fe54000cfd8668e5a00d4"} Mar 12 16:54:35 crc kubenswrapper[5184]: I0312 16:54:35.375454 5184 kubelet.go:3323] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="766f2ece-d155-473b-bc1e-ceca5d270675" Mar 12 16:54:35 crc kubenswrapper[5184]: I0312 16:54:35.375488 5184 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="766f2ece-d155-473b-bc1e-ceca5d270675" Mar 12 16:54:35 crc kubenswrapper[5184]: E0312 16:54:35.376132 5184 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:54:35 crc kubenswrapper[5184]: I0312 16:54:35.376162 5184 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 12 16:54:35 crc kubenswrapper[5184]: I0312 16:54:35.376807 5184 status_manager.go:895] "Failed to get status for pod" podUID="05d9779c-971f-40a0-83e9-b21a6e9e9d2a" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 12 16:54:36 crc kubenswrapper[5184]: I0312 16:54:36.387410 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"57755cc5f99000cc11e193051474d4e2","Type":"ContainerStarted","Data":"afc529384c19f4f20060b22e75bfee445d8f1a069327a4b7b8e2f0ed9f55bd92"} Mar 12 16:54:36 crc kubenswrapper[5184]: I0312 16:54:36.387458 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"57755cc5f99000cc11e193051474d4e2","Type":"ContainerStarted","Data":"2c0d5d2d062bd9a3a555d30b00df529bc29e3b3dc832b4173dd328377627dbf9"} Mar 12 16:54:36 crc kubenswrapper[5184]: I0312 16:54:36.387469 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"57755cc5f99000cc11e193051474d4e2","Type":"ContainerStarted","Data":"80e9c96215db83ce324d527d3f4083487d5ee31c09f4bc059e3d4d3751e88a3d"} Mar 12 16:54:37 crc kubenswrapper[5184]: I0312 16:54:37.396236 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"57755cc5f99000cc11e193051474d4e2","Type":"ContainerStarted","Data":"2f46ba18eb45f15a20decc0f1ff50ceb41c006a6c1cd364355d0373e9eb4e12f"} Mar 12 16:54:37 crc kubenswrapper[5184]: I0312 16:54:37.396584 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:54:37 crc kubenswrapper[5184]: I0312 16:54:37.396595 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"57755cc5f99000cc11e193051474d4e2","Type":"ContainerStarted","Data":"f7405346f187c1f6f766a8c38f3ca8c3c8331b60a73ed58380c1fcdcf93d2dde"} Mar 12 16:54:37 crc kubenswrapper[5184]: I0312 16:54:37.396773 5184 kubelet.go:3323] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="766f2ece-d155-473b-bc1e-ceca5d270675" Mar 12 16:54:37 crc kubenswrapper[5184]: I0312 16:54:37.396808 5184 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="766f2ece-d155-473b-bc1e-ceca5d270675" Mar 12 16:54:38 crc kubenswrapper[5184]: I0312 16:54:38.403364 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log" Mar 12 16:54:38 crc kubenswrapper[5184]: I0312 16:54:38.403419 5184 generic.go:358] "Generic (PLEG): container finished" podID="9f0bc7fcb0822a2c13eb2d22cd8c0641" containerID="d14cb7a12881751803c43b69b2ec33ce99548de0cb9d754e7de2f8fe301dabb1" exitCode=1 Mar 12 16:54:38 crc kubenswrapper[5184]: I0312 16:54:38.405911 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"9f0bc7fcb0822a2c13eb2d22cd8c0641","Type":"ContainerDied","Data":"d14cb7a12881751803c43b69b2ec33ce99548de0cb9d754e7de2f8fe301dabb1"} Mar 12 16:54:38 crc kubenswrapper[5184]: I0312 16:54:38.406480 5184 scope.go:117] "RemoveContainer" containerID="d14cb7a12881751803c43b69b2ec33ce99548de0cb9d754e7de2f8fe301dabb1" Mar 12 16:54:39 crc kubenswrapper[5184]: I0312 16:54:39.383157 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:54:39 
crc kubenswrapper[5184]: I0312 16:54:39.383515 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:54:39 crc kubenswrapper[5184]: I0312 16:54:39.388252 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:54:39 crc kubenswrapper[5184]: I0312 16:54:39.410629 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log" Mar 12 16:54:39 crc kubenswrapper[5184]: I0312 16:54:39.410768 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"9f0bc7fcb0822a2c13eb2d22cd8c0641","Type":"ContainerStarted","Data":"df7fec2b74eec2769305473412adaf8cea02f74765a76d316ff0424f34385153"} Mar 12 16:54:42 crc kubenswrapper[5184]: I0312 16:54:42.409530 5184 kubelet.go:3329] "Deleted mirror pod as it didn't match the static Pod" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:54:42 crc kubenswrapper[5184]: I0312 16:54:42.409879 5184 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:54:42 crc kubenswrapper[5184]: I0312 16:54:42.525934 5184 status_manager.go:905] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="57755cc5f99000cc11e193051474d4e2" podUID="e6748bc4-a0f2-4900-a072-d0fe242e8c87" Mar 12 16:54:43 crc kubenswrapper[5184]: I0312 16:54:43.432356 5184 kubelet.go:3323] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="766f2ece-d155-473b-bc1e-ceca5d270675" Mar 12 16:54:43 crc kubenswrapper[5184]: I0312 16:54:43.432420 5184 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="766f2ece-d155-473b-bc1e-ceca5d270675" Mar 12 16:54:43 crc kubenswrapper[5184]: I0312 16:54:43.435803 5184 status_manager.go:905] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="57755cc5f99000cc11e193051474d4e2" podUID="e6748bc4-a0f2-4900-a072-d0fe242e8c87" Mar 12 16:54:43 crc kubenswrapper[5184]: I0312 16:54:43.439799 5184 status_manager.go:346] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://80e9c96215db83ce324d527d3f4083487d5ee31c09f4bc059e3d4d3751e88a3d" Mar 12 16:54:43 crc kubenswrapper[5184]: I0312 16:54:43.439894 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 16:54:44 crc kubenswrapper[5184]: I0312 16:54:44.437363 5184 kubelet.go:3323] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="766f2ece-d155-473b-bc1e-ceca5d270675" Mar 12 16:54:44 crc kubenswrapper[5184]: I0312 16:54:44.437575 5184 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="766f2ece-d155-473b-bc1e-ceca5d270675" Mar 12 16:54:44 crc kubenswrapper[5184]: I0312 16:54:44.443049 5184 status_manager.go:905] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="57755cc5f99000cc11e193051474d4e2" podUID="e6748bc4-a0f2-4900-a072-d0fe242e8c87" Mar 12 16:54:45 crc kubenswrapper[5184]: I0312 16:54:45.537022 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 16:54:45 crc kubenswrapper[5184]: I0312 16:54:45.543150 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 16:54:46 crc kubenswrapper[5184]: I0312 16:54:46.452985 5184 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 16:54:50 crc kubenswrapper[5184]: I0312 16:54:50.743070 5184 patch_prober.go:28] interesting pod/machine-config-daemon-cp7pt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 16:54:50 crc kubenswrapper[5184]: I0312 16:54:50.744549 5184 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 16:54:52 crc kubenswrapper[5184]: I0312 16:54:52.587114 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-api\"/\"kube-rbac-proxy\"" Mar 12 16:54:53 crc kubenswrapper[5184]: I0312 16:54:53.176172 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"kube-root-ca.crt\"" Mar 12 16:54:53 crc kubenswrapper[5184]: I0312 16:54:53.878738 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-api\"/\"openshift-service-ca.crt\"" Mar 12 16:54:53 crc kubenswrapper[5184]: I0312 16:54:53.951041 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-etcd-operator\"/\"kube-root-ca.crt\"" Mar 12 16:54:54 crc kubenswrapper[5184]: I0312 16:54:54.247779 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns-operator\"/\"dns-operator-dockercfg-wbbsn\"" Mar 12 16:54:54 crc kubenswrapper[5184]: I0312 16:54:54.292746 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-authentication\"/\"openshift-service-ca.crt\"" Mar 12 16:54:54 crc kubenswrapper[5184]: I0312 16:54:54.494905 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-controller-manager-operator\"/\"kube-controller-manager-operator-config\"" Mar 12 16:54:54 crc kubenswrapper[5184]: I0312 16:54:54.494989 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-etcd-operator\"/\"etcd-operator-dockercfg-4vdnc\"" Mar 12 16:54:54 crc kubenswrapper[5184]: I0312 16:54:54.496213 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-machine-approver\"/\"kube-root-ca.crt\"" Mar 12 16:54:54 crc kubenswrapper[5184]: I0312 16:54:54.534955 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Mar 12 16:54:54 crc kubenswrapper[5184]: I0312 16:54:54.931541 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"trusted-ca-bundle\"" Mar 12 16:54:54 crc kubenswrapper[5184]: I0312 16:54:54.959345 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Mar 12 16:54:54 crc kubenswrapper[5184]: I0312 16:54:54.979299 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-config-operator\"/\"openshift-config-operator-dockercfg-sjn6s\"" Mar 12 16:54:55 crc kubenswrapper[5184]: I0312 16:54:55.139802 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-node-identity\"/\"network-node-identity-cert\"" Mar 12 16:54:55 crc kubenswrapper[5184]: I0312 16:54:55.267231 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Mar 12 16:54:55 crc 
kubenswrapper[5184]: I0312 16:54:55.280738 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Mar 12 16:54:55 crc kubenswrapper[5184]: I0312 16:54:55.359920 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-config-operator\"/\"openshift-service-ca.crt\"" Mar 12 16:54:55 crc kubenswrapper[5184]: I0312 16:54:55.400549 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-etcd-operator\"/\"openshift-service-ca.crt\"" Mar 12 16:54:55 crc kubenswrapper[5184]: I0312 16:54:55.667618 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Mar 12 16:54:55 crc kubenswrapper[5184]: I0312 16:54:55.692176 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Mar 12 16:54:55 crc kubenswrapper[5184]: I0312 16:54:55.697557 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-tk7bt\"" Mar 12 16:54:55 crc kubenswrapper[5184]: I0312 16:54:55.715030 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Mar 12 16:54:55 crc kubenswrapper[5184]: I0312 16:54:55.783714 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-g6kgg\"" Mar 12 16:54:55 crc kubenswrapper[5184]: I0312 16:54:55.804924 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Mar 12 16:54:55 crc kubenswrapper[5184]: I0312 16:54:55.958854 5184 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-kube-controller-manager-operator\"/\"kube-controller-manager-operator-serving-cert\"" Mar 12 16:54:56 crc kubenswrapper[5184]: I0312 16:54:56.317419 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Mar 12 16:54:56 crc kubenswrapper[5184]: I0312 16:54:56.396105 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"image-import-ca\"" Mar 12 16:54:56 crc kubenswrapper[5184]: I0312 16:54:56.495938 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-config-operator\"/\"kube-rbac-proxy\"" Mar 12 16:54:56 crc kubenswrapper[5184]: I0312 16:54:56.517278 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication-operator\"/\"service-ca-bundle\"" Mar 12 16:54:56 crc kubenswrapper[5184]: I0312 16:54:56.599277 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Mar 12 16:54:56 crc kubenswrapper[5184]: I0312 16:54:56.662427 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Mar 12 16:54:56 crc kubenswrapper[5184]: I0312 16:54:56.666449 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-operator\"/\"metrics-tls\"" Mar 12 16:54:56 crc kubenswrapper[5184]: I0312 16:54:56.810465 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Mar 12 16:54:56 crc kubenswrapper[5184]: I0312 16:54:56.863660 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-oauth-apiserver\"/\"trusted-ca-bundle\"" Mar 12 16:54:56 crc kubenswrapper[5184]: I0312 16:54:56.947563 5184 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-etcd-operator\"/\"etcd-client\"" Mar 12 16:54:57 crc kubenswrapper[5184]: I0312 16:54:57.086192 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-apiserver-operator\"/\"kube-root-ca.crt\"" Mar 12 16:54:57 crc kubenswrapper[5184]: I0312 16:54:57.122799 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-daemon-dockercfg-w9nzh\"" Mar 12 16:54:57 crc kubenswrapper[5184]: I0312 16:54:57.145078 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operator-lifecycle-manager\"/\"kube-root-ca.crt\"" Mar 12 16:54:57 crc kubenswrapper[5184]: I0312 16:54:57.245539 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Mar 12 16:54:57 crc kubenswrapper[5184]: I0312 16:54:57.323763 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Mar 12 16:54:57 crc kubenswrapper[5184]: I0312 16:54:57.375066 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns-operator\"/\"metrics-tls\"" Mar 12 16:54:57 crc kubenswrapper[5184]: I0312 16:54:57.428544 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-oauth-apiserver\"/\"openshift-service-ca.crt\"" Mar 12 16:54:57 crc kubenswrapper[5184]: I0312 16:54:57.464050 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 16:54:57 crc kubenswrapper[5184]: I0312 16:54:57.635832 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"hostpath-provisioner\"/\"csi-hostpath-provisioner-sa-dockercfg-7dcws\"" Mar 12 16:54:57 crc kubenswrapper[5184]: I0312 16:54:57.648807 5184 reflector.go:430] "Caches 
populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-operator\"/\"openshift-service-ca.crt\"" Mar 12 16:54:57 crc kubenswrapper[5184]: I0312 16:54:57.664108 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager\"/\"kube-root-ca.crt\"" Mar 12 16:54:57 crc kubenswrapper[5184]: I0312 16:54:57.720371 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-route-controller-manager\"/\"route-controller-manager-sa-dockercfg-mmcpt\"" Mar 12 16:54:57 crc kubenswrapper[5184]: I0312 16:54:57.798342 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Mar 12 16:54:57 crc kubenswrapper[5184]: I0312 16:54:57.937081 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Mar 12 16:54:57 crc kubenswrapper[5184]: I0312 16:54:57.997702 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Mar 12 16:54:58 crc kubenswrapper[5184]: I0312 16:54:58.020048 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ac-dockercfg-gj7jx\"" Mar 12 16:54:58 crc kubenswrapper[5184]: I0312 16:54:58.062048 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Mar 12 16:54:58 crc kubenswrapper[5184]: I0312 16:54:58.138103 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Mar 12 16:54:58 crc kubenswrapper[5184]: I0312 16:54:58.140705 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-node-identity\"/\"openshift-service-ca.crt\"" Mar 12 16:54:58 crc kubenswrapper[5184]: I0312 16:54:58.233915 5184 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-2h6bs\"" Mar 12 16:54:58 crc kubenswrapper[5184]: I0312 16:54:58.284466 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Mar 12 16:54:58 crc kubenswrapper[5184]: I0312 16:54:58.389901 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"node-bootstrapper-token\"" Mar 12 16:54:58 crc kubenswrapper[5184]: I0312 16:54:58.543751 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-tjs74\"" Mar 12 16:54:58 crc kubenswrapper[5184]: I0312 16:54:58.609039 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-serving-cert\"" Mar 12 16:54:58 crc kubenswrapper[5184]: I0312 16:54:58.647531 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"etcd-serving-ca\"" Mar 12 16:54:58 crc kubenswrapper[5184]: I0312 16:54:58.721765 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-server-tls\"" Mar 12 16:54:58 crc kubenswrapper[5184]: I0312 16:54:58.779636 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Mar 12 16:54:58 crc kubenswrapper[5184]: I0312 16:54:58.819352 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Mar 12 16:54:58 crc kubenswrapper[5184]: I0312 16:54:58.846954 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Mar 12 16:54:59 crc 
kubenswrapper[5184]: I0312 16:54:59.000575 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-node-identity\"/\"env-overrides\"" Mar 12 16:54:59 crc kubenswrapper[5184]: I0312 16:54:59.090963 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-controller-manager\"/\"openshift-controller-manager-sa-dockercfg-djmfg\"" Mar 12 16:54:59 crc kubenswrapper[5184]: I0312 16:54:59.149758 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Mar 12 16:54:59 crc kubenswrapper[5184]: I0312 16:54:59.181164 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Mar 12 16:54:59 crc kubenswrapper[5184]: I0312 16:54:59.182913 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-ocp-branding-template\"" Mar 12 16:54:59 crc kubenswrapper[5184]: I0312 16:54:59.268578 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-machine-approver\"/\"kube-rbac-proxy\"" Mar 12 16:54:59 crc kubenswrapper[5184]: I0312 16:54:59.271504 5184 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Mar 12 16:54:59 crc kubenswrapper[5184]: I0312 16:54:59.365732 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-etcd-operator\"/\"etcd-ca-bundle\"" Mar 12 16:54:59 crc kubenswrapper[5184]: I0312 16:54:59.377148 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-api\"/\"machine-api-operator-dockercfg-6n5ln\"" Mar 12 16:54:59 crc kubenswrapper[5184]: I0312 16:54:59.392015 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-authentication-operator\"/\"openshift-service-ca.crt\"" Mar 12 16:54:59 crc kubenswrapper[5184]: I0312 16:54:59.461074 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-api\"/\"control-plane-machine-set-operator-dockercfg-gnx66\"" Mar 12 16:54:59 crc kubenswrapper[5184]: I0312 16:54:59.500992 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Mar 12 16:54:59 crc kubenswrapper[5184]: I0312 16:54:59.546681 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"audit-1\"" Mar 12 16:54:59 crc kubenswrapper[5184]: I0312 16:54:59.636355 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-route-controller-manager\"/\"client-ca\"" Mar 12 16:54:59 crc kubenswrapper[5184]: I0312 16:54:59.683555 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-oauth-apiserver\"/\"etcd-serving-ca\"" Mar 12 16:54:59 crc kubenswrapper[5184]: I0312 16:54:59.703996 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Mar 12 16:54:59 crc kubenswrapper[5184]: I0312 16:54:59.719239 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"marketplace-trusted-ca\"" Mar 12 16:54:59 crc kubenswrapper[5184]: I0312 16:54:59.719864 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-version\"/\"cluster-version-operator-serving-cert\"" Mar 12 16:54:59 crc kubenswrapper[5184]: I0312 16:54:59.740318 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-operator\"/\"kube-root-ca.crt\"" Mar 12 16:54:59 crc kubenswrapper[5184]: I0312 16:54:59.751282 5184 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-bgxvm\"" Mar 12 16:54:59 crc kubenswrapper[5184]: I0312 16:54:59.793361 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-route-controller-manager\"/\"serving-cert\"" Mar 12 16:54:59 crc kubenswrapper[5184]: I0312 16:54:59.903826 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-controller-manager-operator\"/\"kube-controller-manager-operator-dockercfg-tnfx9\"" Mar 12 16:54:59 crc kubenswrapper[5184]: I0312 16:54:59.955074 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-server-dockercfg-dzw6b\"" Mar 12 16:54:59 crc kubenswrapper[5184]: I0312 16:54:59.983365 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-machine-approver\"/\"machine-approver-sa-dockercfg-wzhvk\"" Mar 12 16:55:00 crc kubenswrapper[5184]: I0312 16:55:00.057484 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-api\"/\"machine-api-operator-tls\"" Mar 12 16:55:00 crc kubenswrapper[5184]: I0312 16:55:00.072709 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Mar 12 16:55:00 crc kubenswrapper[5184]: I0312 16:55:00.078782 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication-operator\"/\"authentication-operator-dockercfg-6tbpn\"" Mar 12 16:55:00 crc kubenswrapper[5184]: I0312 16:55:00.097730 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication-operator\"/\"authentication-operator-config\"" Mar 12 16:55:00 crc kubenswrapper[5184]: I0312 16:55:00.165318 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Mar 12 16:55:00 crc kubenswrapper[5184]: I0312 16:55:00.173560 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9pgs7\"" Mar 12 16:55:00 crc kubenswrapper[5184]: I0312 16:55:00.181635 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver\"/\"etcd-client\"" Mar 12 16:55:00 crc kubenswrapper[5184]: I0312 16:55:00.205284 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"audit\"" Mar 12 16:55:00 crc kubenswrapper[5184]: I0312 16:55:00.227430 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-oauth-apiserver\"/\"encryption-config-1\"" Mar 12 16:55:00 crc kubenswrapper[5184]: I0312 16:55:00.250434 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-apiserver-operator\"/\"kube-apiserver-operator-config\"" Mar 12 16:55:00 crc kubenswrapper[5184]: I0312 16:55:00.279437 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"hostpath-provisioner\"/\"openshift-service-ca.crt\"" Mar 12 16:55:00 crc kubenswrapper[5184]: I0312 16:55:00.285764 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-operator\"/\"trusted-ca\"" Mar 12 16:55:00 crc kubenswrapper[5184]: I0312 16:55:00.285815 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-machine-approver\"/\"machine-approver-config\"" Mar 12 16:55:00 crc kubenswrapper[5184]: I0312 16:55:00.332806 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager\"/\"openshift-service-ca.crt\"" Mar 12 16:55:00 crc kubenswrapper[5184]: I0312 16:55:00.421612 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-kube-controller-manager-operator\"/\"kube-root-ca.crt\"" Mar 12 16:55:00 crc kubenswrapper[5184]: I0312 16:55:00.434840 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Mar 12 16:55:00 crc kubenswrapper[5184]: I0312 16:55:00.523710 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver-operator\"/\"openshift-service-ca.crt\"" Mar 12 16:55:00 crc kubenswrapper[5184]: I0312 16:55:00.532462 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-cliconfig\"" Mar 12 16:55:00 crc kubenswrapper[5184]: I0312 16:55:00.555758 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-node-identity\"/\"ovnkube-identity-cm\"" Mar 12 16:55:00 crc kubenswrapper[5184]: I0312 16:55:00.581801 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-8dkm8\"" Mar 12 16:55:00 crc kubenswrapper[5184]: I0312 16:55:00.628230 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver\"/\"encryption-config-1\"" Mar 12 16:55:00 crc kubenswrapper[5184]: I0312 16:55:00.742609 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-kl6m8\"" Mar 12 16:55:00 crc kubenswrapper[5184]: I0312 16:55:00.804760 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"pprof-cert\"" Mar 12 16:55:00 crc kubenswrapper[5184]: I0312 16:55:00.823005 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Mar 12 16:55:00 crc kubenswrapper[5184]: I0312 16:55:00.873236 5184 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"package-server-manager-serving-cert\"" Mar 12 16:55:00 crc kubenswrapper[5184]: I0312 16:55:00.949974 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-node-identity\"/\"kube-root-ca.crt\"" Mar 12 16:55:01 crc kubenswrapper[5184]: I0312 16:55:01.102463 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns-operator\"/\"kube-root-ca.crt\"" Mar 12 16:55:01 crc kubenswrapper[5184]: I0312 16:55:01.233498 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Mar 12 16:55:01 crc kubenswrapper[5184]: I0312 16:55:01.265827 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager-operator\"/\"openshift-service-ca.crt\"" Mar 12 16:55:01 crc kubenswrapper[5184]: I0312 16:55:01.270818 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-bjqfd\"" Mar 12 16:55:01 crc kubenswrapper[5184]: I0312 16:55:01.280898 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Mar 12 16:55:01 crc kubenswrapper[5184]: I0312 16:55:01.319885 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"kube-root-ca.crt\"" Mar 12 16:55:01 crc kubenswrapper[5184]: I0312 16:55:01.374352 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-api\"/\"machine-api-operator-images\"" Mar 12 16:55:01 crc kubenswrapper[5184]: I0312 16:55:01.376110 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-etcd-operator\"/\"etcd-operator-serving-cert\"" Mar 12 16:55:01 crc kubenswrapper[5184]: I0312 
16:55:01.389216 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-kw8fx\"" Mar 12 16:55:01 crc kubenswrapper[5184]: I0312 16:55:01.402681 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-mdwwj\"" Mar 12 16:55:01 crc kubenswrapper[5184]: I0312 16:55:01.422144 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Mar 12 16:55:01 crc kubenswrapper[5184]: I0312 16:55:01.526842 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-user-template-provider-selection\"" Mar 12 16:55:01 crc kubenswrapper[5184]: I0312 16:55:01.656691 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-apiserver-operator\"/\"kube-apiserver-operator-dockercfg-bf7fj\"" Mar 12 16:55:01 crc kubenswrapper[5184]: I0312 16:55:01.763954 5184 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Mar 12 16:55:01 crc kubenswrapper[5184]: I0312 16:55:01.866407 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-nwglk\"" Mar 12 16:55:01 crc kubenswrapper[5184]: I0312 16:55:01.971138 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-l2v2m\"" Mar 12 16:55:02 crc kubenswrapper[5184]: I0312 16:55:02.144855 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-etcd-operator\"/\"etcd-operator-config\"" Mar 12 16:55:02 crc kubenswrapper[5184]: I0312 16:55:02.204077 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-scheduler-operator\"/\"openshift-kube-scheduler-operator-config\"" 
Mar 12 16:55:02 crc kubenswrapper[5184]: I0312 16:55:02.204189 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication-operator\"/\"serving-cert\"" Mar 12 16:55:02 crc kubenswrapper[5184]: I0312 16:55:02.259465 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-api\"/\"control-plane-machine-set-operator-tls\"" Mar 12 16:55:02 crc kubenswrapper[5184]: I0312 16:55:02.299119 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Mar 12 16:55:02 crc kubenswrapper[5184]: I0312 16:55:02.319760 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Mar 12 16:55:02 crc kubenswrapper[5184]: I0312 16:55:02.443501 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-config-operator\"/\"config-operator-serving-cert\"" Mar 12 16:55:02 crc kubenswrapper[5184]: I0312 16:55:02.618930 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"proxy-tls\"" Mar 12 16:55:02 crc kubenswrapper[5184]: I0312 16:55:02.672564 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"olm-operator-serviceaccount-dockercfg-4gqzj\"" Mar 12 16:55:02 crc kubenswrapper[5184]: I0312 16:55:02.719561 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-etcd-operator\"/\"etcd-service-ca-bundle\"" Mar 12 16:55:03 crc kubenswrapper[5184]: I0312 16:55:03.064303 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Mar 12 16:55:03 crc kubenswrapper[5184]: I0312 16:55:03.068155 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Mar 
12 16:55:03 crc kubenswrapper[5184]: I0312 16:55:03.099295 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-control-plane-dockercfg-nl8tp\"" Mar 12 16:55:03 crc kubenswrapper[5184]: I0312 16:55:03.111656 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"metrics-tls\"" Mar 12 16:55:03 crc kubenswrapper[5184]: I0312 16:55:03.119009 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager-operator\"/\"openshift-controller-manager-operator-config\"" Mar 12 16:55:03 crc kubenswrapper[5184]: I0312 16:55:03.232689 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager\"/\"client-ca\"" Mar 12 16:55:03 crc kubenswrapper[5184]: I0312 16:55:03.428048 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Mar 12 16:55:03 crc kubenswrapper[5184]: I0312 16:55:03.466461 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Mar 12 16:55:03 crc kubenswrapper[5184]: I0312 16:55:03.472649 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"hostpath-provisioner\"/\"kube-root-ca.crt\"" Mar 12 16:55:03 crc kubenswrapper[5184]: I0312 16:55:03.493016 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"redhat-operators-dockercfg-9gxlh\"" Mar 12 16:55:03 crc kubenswrapper[5184]: I0312 16:55:03.548901 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-controller-dockercfg-xnj77\"" Mar 12 16:55:03 crc kubenswrapper[5184]: I0312 16:55:03.609481 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Mar 12 16:55:03 crc kubenswrapper[5184]: I0312 16:55:03.663981 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-scheduler-operator\"/\"kube-scheduler-operator-serving-cert\"" Mar 12 16:55:03 crc kubenswrapper[5184]: I0312 16:55:03.730030 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-scheduler-operator\"/\"kube-root-ca.crt\"" Mar 12 16:55:03 crc kubenswrapper[5184]: I0312 16:55:03.733099 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Mar 12 16:55:03 crc kubenswrapper[5184]: I0312 16:55:03.872634 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-user-template-error\"" Mar 12 16:55:03 crc kubenswrapper[5184]: I0312 16:55:03.898724 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-operator\"/\"ingress-operator-dockercfg-74nwh\"" Mar 12 16:55:03 crc kubenswrapper[5184]: I0312 16:55:03.932769 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-oauth-apiserver\"/\"oauth-apiserver-sa-dockercfg-qqw4z\"" Mar 12 16:55:03 crc kubenswrapper[5184]: I0312 16:55:03.968780 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-jmhxf\"" Mar 12 16:55:03 crc kubenswrapper[5184]: I0312 16:55:03.974298 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-oauth-apiserver\"/\"serving-cert\"" Mar 12 16:55:04 crc kubenswrapper[5184]: I0312 16:55:04.017739 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"oauth-openshift-dockercfg-d2bf2\"" Mar 12 16:55:04 crc kubenswrapper[5184]: I0312 16:55:04.052472 5184 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication-operator\"/\"trusted-ca-bundle\"" Mar 12 16:55:04 crc kubenswrapper[5184]: I0312 16:55:04.073794 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-operator-images\"" Mar 12 16:55:04 crc kubenswrapper[5184]: I0312 16:55:04.078810 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-router-certs\"" Mar 12 16:55:04 crc kubenswrapper[5184]: I0312 16:55:04.102549 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"catalog-operator-serving-cert\"" Mar 12 16:55:04 crc kubenswrapper[5184]: I0312 16:55:04.131780 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-route-controller-manager\"/\"openshift-service-ca.crt\"" Mar 12 16:55:04 crc kubenswrapper[5184]: I0312 16:55:04.210692 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-version\"/\"kube-root-ca.crt\"" Mar 12 16:55:04 crc kubenswrapper[5184]: I0312 16:55:04.241901 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns-operator\"/\"openshift-service-ca.crt\"" Mar 12 16:55:04 crc kubenswrapper[5184]: I0312 16:55:04.244244 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Mar 12 16:55:04 crc kubenswrapper[5184]: I0312 16:55:04.333472 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-route-controller-manager\"/\"kube-root-ca.crt\"" Mar 12 16:55:04 crc kubenswrapper[5184]: I0312 16:55:04.375045 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"community-operators-dockercfg-vrd5f\"" 
Mar 12 16:55:04 crc kubenswrapper[5184]: I0312 16:55:04.471967 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager\"/\"config\"" Mar 12 16:55:04 crc kubenswrapper[5184]: I0312 16:55:04.484919 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-scheduler-operator\"/\"openshift-kube-scheduler-operator-dockercfg-2wbn2\"" Mar 12 16:55:04 crc kubenswrapper[5184]: I0312 16:55:04.486065 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Mar 12 16:55:04 crc kubenswrapper[5184]: I0312 16:55:04.547193 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Mar 12 16:55:04 crc kubenswrapper[5184]: I0312 16:55:04.603014 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-trusted-ca-bundle\"" Mar 12 16:55:04 crc kubenswrapper[5184]: I0312 16:55:04.685649 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-machine-approver\"/\"openshift-service-ca.crt\"" Mar 12 16:55:04 crc kubenswrapper[5184]: I0312 16:55:04.735009 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Mar 12 16:55:04 crc kubenswrapper[5184]: I0312 16:55:04.769436 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"olm-operator-serving-cert\"" Mar 12 16:55:04 crc kubenswrapper[5184]: I0312 16:55:04.835352 5184 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Mar 12 16:55:04 crc kubenswrapper[5184]: I0312 16:55:04.898722 5184 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-machine-config-operator\"/\"machine-config-operator-dockercfg-sw6nc\"" Mar 12 16:55:04 crc kubenswrapper[5184]: I0312 16:55:04.914603 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Mar 12 16:55:04 crc kubenswrapper[5184]: I0312 16:55:04.920554 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Mar 12 16:55:04 crc kubenswrapper[5184]: I0312 16:55:04.932514 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-apiserver-operator\"/\"kube-apiserver-operator-serving-cert\"" Mar 12 16:55:04 crc kubenswrapper[5184]: I0312 16:55:04.974197 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver-operator\"/\"openshift-apiserver-operator-serving-cert\"" Mar 12 16:55:05 crc kubenswrapper[5184]: I0312 16:55:05.005802 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver-operator\"/\"openshift-apiserver-operator-config\"" Mar 12 16:55:05 crc kubenswrapper[5184]: I0312 16:55:05.054674 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-certs-default\"" Mar 12 16:55:05 crc kubenswrapper[5184]: I0312 16:55:05.057410 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Mar 12 16:55:05 crc kubenswrapper[5184]: I0312 16:55:05.070157 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication-operator\"/\"kube-root-ca.crt\"" Mar 12 16:55:05 crc kubenswrapper[5184]: I0312 16:55:05.114176 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Mar 12 16:55:05 crc kubenswrapper[5184]: I0312 
16:55:05.199912 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Mar 12 16:55:05 crc kubenswrapper[5184]: I0312 16:55:05.223303 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Mar 12 16:55:05 crc kubenswrapper[5184]: I0312 16:55:05.252982 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-user-template-login\"" Mar 12 16:55:05 crc kubenswrapper[5184]: I0312 16:55:05.255551 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Mar 12 16:55:05 crc kubenswrapper[5184]: I0312 16:55:05.311929 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-kpvmz\"" Mar 12 16:55:05 crc kubenswrapper[5184]: I0312 16:55:05.349715 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"mcc-proxy-tls\"" Mar 12 16:55:05 crc kubenswrapper[5184]: I0312 16:55:05.363159 5184 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Mar 12 16:55:05 crc kubenswrapper[5184]: I0312 16:55:05.378892 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver\"/\"openshift-apiserver-sa-dockercfg-4zqgh\"" Mar 12 16:55:05 crc kubenswrapper[5184]: I0312 16:55:05.462616 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"config\"" Mar 12 16:55:05 crc kubenswrapper[5184]: I0312 16:55:05.473888 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-oauth-apiserver\"/\"kube-root-ca.crt\"" Mar 12 16:55:05 crc kubenswrapper[5184]: I0312 16:55:05.569440 5184 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Mar 12 16:55:05 crc kubenswrapper[5184]: I0312 16:55:05.571946 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-version\"/\"default-dockercfg-hqpm5\""
Mar 12 16:55:05 crc kubenswrapper[5184]: I0312 16:55:05.580685 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-controller-manager\"/\"serving-cert\""
Mar 12 16:55:05 crc kubenswrapper[5184]: I0312 16:55:05.597956 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Mar 12 16:55:05 crc kubenswrapper[5184]: I0312 16:55:05.636289 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operator-lifecycle-manager\"/\"openshift-service-ca.crt\""
Mar 12 16:55:05 crc kubenswrapper[5184]: I0312 16:55:05.699661 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-config-operator\"/\"kube-root-ca.crt\""
Mar 12 16:55:05 crc kubenswrapper[5184]: I0312 16:55:05.723774 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Mar 12 16:55:05 crc kubenswrapper[5184]: I0312 16:55:05.808458 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-kknhg\""
Mar 12 16:55:05 crc kubenswrapper[5184]: I0312 16:55:05.972153 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Mar 12 16:55:06 crc kubenswrapper[5184]: I0312 16:55:06.003813 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-machine-approver\"/\"machine-approver-tls\""
Mar 12 16:55:06 crc kubenswrapper[5184]: I0312 16:55:06.010279 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Mar 12 16:55:06 crc kubenswrapper[5184]: I0312 16:55:06.095781 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-service-ca\""
Mar 12 16:55:06 crc kubenswrapper[5184]: I0312 16:55:06.101109 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"certified-operators-dockercfg-7cl8d\""
Mar 12 16:55:06 crc kubenswrapper[5184]: I0312 16:55:06.105092 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Mar 12 16:55:06 crc kubenswrapper[5184]: I0312 16:55:06.141834 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Mar 12 16:55:06 crc kubenswrapper[5184]: I0312 16:55:06.158987 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"redhat-marketplace-dockercfg-gg4w7\""
Mar 12 16:55:06 crc kubenswrapper[5184]: I0312 16:55:06.203067 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Mar 12 16:55:06 crc kubenswrapper[5184]: I0312 16:55:06.445362 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"cluster-image-registry-operator-dockercfg-ntnd7\""
Mar 12 16:55:06 crc kubenswrapper[5184]: I0312 16:55:06.499771 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"marketplace-operator-dockercfg-2cfkp\""
Mar 12 16:55:06 crc kubenswrapper[5184]: I0312 16:55:06.522949 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-session\""
Mar 12 16:55:06 crc kubenswrapper[5184]: I0312 16:55:06.589489 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Mar 12 16:55:06 crc kubenswrapper[5184]: I0312 16:55:06.636073 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"mco-proxy-tls\""
Mar 12 16:55:06 crc kubenswrapper[5184]: I0312 16:55:06.718593 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Mar 12 16:55:06 crc kubenswrapper[5184]: I0312 16:55:06.748869 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"openshift-service-ca.crt\""
Mar 12 16:55:06 crc kubenswrapper[5184]: I0312 16:55:06.821939 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-api\"/\"kube-root-ca.crt\""
Mar 12 16:55:06 crc kubenswrapper[5184]: I0312 16:55:06.847119 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Mar 12 16:55:06 crc kubenswrapper[5184]: I0312 16:55:06.909009 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver-operator\"/\"kube-root-ca.crt\""
Mar 12 16:55:06 crc kubenswrapper[5184]: I0312 16:55:06.971879 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-config-operator\"/\"kube-root-ca.crt\""
Mar 12 16:55:07 crc kubenswrapper[5184]: I0312 16:55:07.165086 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Mar 12 16:55:07 crc kubenswrapper[5184]: I0312 16:55:07.196637 5184 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Mar 12 16:55:07 crc kubenswrapper[5184]: I0312 16:55:07.197951 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=44.197917188 podStartE2EDuration="44.197917188s" podCreationTimestamp="2026-03-12 16:54:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:54:42.478767729 +0000 UTC m=+225.020079068" watchObservedRunningTime="2026-03-12 16:55:07.197917188 +0000 UTC m=+249.739228517"
Mar 12 16:55:07 crc kubenswrapper[5184]: I0312 16:55:07.202577 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 12 16:55:07 crc kubenswrapper[5184]: I0312 16:55:07.202634 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 12 16:55:07 crc kubenswrapper[5184]: I0312 16:55:07.209046 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 12 16:55:07 crc kubenswrapper[5184]: I0312 16:55:07.210475 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-controller-manager-operator\"/\"openshift-controller-manager-operator-dockercfg-jcmfj\""
Mar 12 16:55:07 crc kubenswrapper[5184]: I0312 16:55:07.233237 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=25.233218303 podStartE2EDuration="25.233218303s" podCreationTimestamp="2026-03-12 16:54:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:55:07.228602564 +0000 UTC m=+249.769913903" watchObservedRunningTime="2026-03-12 16:55:07.233218303 +0000 UTC m=+249.774529632"
Mar 12 16:55:07 crc kubenswrapper[5184]: I0312 16:55:07.239506 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager-operator\"/\"kube-root-ca.crt\""
Mar 12 16:55:07 crc kubenswrapper[5184]: I0312 16:55:07.381222 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver-operator\"/\"openshift-apiserver-operator-dockercfg-6c46w\""
Mar 12 16:55:07 crc kubenswrapper[5184]: I0312 16:55:07.555504 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"packageserver-service-cert\""
Mar 12 16:55:07 crc kubenswrapper[5184]: I0312 16:55:07.642179 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-oauth-apiserver\"/\"audit-1\""
Mar 12 16:55:07 crc kubenswrapper[5184]: I0312 16:55:07.784826 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver\"/\"serving-cert\""
Mar 12 16:55:07 crc kubenswrapper[5184]: I0312 16:55:07.816774 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-config-operator\"/\"openshift-service-ca.crt\""
Mar 12 16:55:07 crc kubenswrapper[5184]: I0312 16:55:07.925996 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Mar 12 16:55:08 crc kubenswrapper[5184]: I0312 16:55:08.119515 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-user-idp-0-file-data\""
Mar 12 16:55:08 crc kubenswrapper[5184]: I0312 16:55:08.176732 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-operator-tls\""
Mar 12 16:55:08 crc kubenswrapper[5184]: I0312 16:55:08.466673 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager\"/\"openshift-global-ca\""
Mar 12 16:55:08 crc kubenswrapper[5184]: I0312 16:55:08.836747 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-route-controller-manager\"/\"config\""
Mar 12 16:55:08 crc kubenswrapper[5184]: I0312 16:55:08.974605 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-version\"/\"openshift-service-ca.crt\""
Mar 12 16:55:09 crc kubenswrapper[5184]: I0312 16:55:09.132141 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Mar 12 16:55:09 crc kubenswrapper[5184]: I0312 16:55:09.172451 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Mar 12 16:55:09 crc kubenswrapper[5184]: I0312 16:55:09.369785 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Mar 12 16:55:10 crc kubenswrapper[5184]: I0312 16:55:10.024136 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-t8n29\""
Mar 12 16:55:10 crc kubenswrapper[5184]: I0312 16:55:10.069097 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-controller-manager-operator\"/\"openshift-controller-manager-operator-serving-cert\""
Mar 12 16:55:10 crc kubenswrapper[5184]: I0312 16:55:10.275259 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-control-plane-metrics-cert\""
Mar 12 16:55:10 crc kubenswrapper[5184]: I0312 16:55:10.320885 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-admission-controller-secret\""
Mar 12 16:55:10 crc kubenswrapper[5184]: I0312 16:55:10.610662 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-6w67b\""
Mar 12 16:55:10 crc kubenswrapper[5184]: I0312 16:55:10.638122 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Mar 12 16:55:11 crc kubenswrapper[5184]: I0312 16:55:11.026255 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-oauth-apiserver\"/\"etcd-client\""
Mar 12 16:55:12 crc kubenswrapper[5184]: I0312 16:55:12.244817 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"marketplace-operator-metrics\""
Mar 12 16:55:16 crc kubenswrapper[5184]: I0312 16:55:16.052552 5184 kubelet.go:2547] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 12 16:55:16 crc kubenswrapper[5184]: I0312 16:55:16.053233 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" containerName="startup-monitor" containerID="cri-o://f39418c66ff49f4f10718eab746afa4889f8ba40b32e6645da7c24e3c75c536e" gracePeriod=5
Mar 12 16:55:20 crc kubenswrapper[5184]: I0312 16:55:20.742702 5184 patch_prober.go:28] interesting pod/machine-config-daemon-cp7pt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 12 16:55:20 crc kubenswrapper[5184]: I0312 16:55:20.743047 5184 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 12 16:55:21 crc kubenswrapper[5184]: I0312 16:55:21.630061 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f7dbc7e1ee9c187a863ef9b473fad27b/startup-monitor/0.log"
Mar 12 16:55:21 crc kubenswrapper[5184]: I0312 16:55:21.630564 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 12 16:55:21 crc kubenswrapper[5184]: I0312 16:55:21.679785 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-resource-dir\") pod \"f7dbc7e1ee9c187a863ef9b473fad27b\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") "
Mar 12 16:55:21 crc kubenswrapper[5184]: I0312 16:55:21.679828 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-lock\") pod \"f7dbc7e1ee9c187a863ef9b473fad27b\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") "
Mar 12 16:55:21 crc kubenswrapper[5184]: I0312 16:55:21.679876 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-manifests\") pod \"f7dbc7e1ee9c187a863ef9b473fad27b\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") "
Mar 12 16:55:21 crc kubenswrapper[5184]: I0312 16:55:21.679899 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f7dbc7e1ee9c187a863ef9b473fad27b" (UID: "f7dbc7e1ee9c187a863ef9b473fad27b"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 12 16:55:21 crc kubenswrapper[5184]: I0312 16:55:21.679943 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-log\") pod \"f7dbc7e1ee9c187a863ef9b473fad27b\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") "
Mar 12 16:55:21 crc kubenswrapper[5184]: I0312 16:55:21.679994 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-manifests" (OuterVolumeSpecName: "manifests") pod "f7dbc7e1ee9c187a863ef9b473fad27b" (UID: "f7dbc7e1ee9c187a863ef9b473fad27b"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 12 16:55:21 crc kubenswrapper[5184]: I0312 16:55:21.680032 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-lock" (OuterVolumeSpecName: "var-lock") pod "f7dbc7e1ee9c187a863ef9b473fad27b" (UID: "f7dbc7e1ee9c187a863ef9b473fad27b"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 12 16:55:21 crc kubenswrapper[5184]: I0312 16:55:21.680061 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-pod-resource-dir\") pod \"f7dbc7e1ee9c187a863ef9b473fad27b\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") "
Mar 12 16:55:21 crc kubenswrapper[5184]: I0312 16:55:21.680140 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-log" (OuterVolumeSpecName: "var-log") pod "f7dbc7e1ee9c187a863ef9b473fad27b" (UID: "f7dbc7e1ee9c187a863ef9b473fad27b"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 12 16:55:21 crc kubenswrapper[5184]: I0312 16:55:21.680622 5184 reconciler_common.go:299] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-manifests\") on node \"crc\" DevicePath \"\""
Mar 12 16:55:21 crc kubenswrapper[5184]: I0312 16:55:21.680639 5184 reconciler_common.go:299] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-log\") on node \"crc\" DevicePath \"\""
Mar 12 16:55:21 crc kubenswrapper[5184]: I0312 16:55:21.680651 5184 reconciler_common.go:299] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-resource-dir\") on node \"crc\" DevicePath \"\""
Mar 12 16:55:21 crc kubenswrapper[5184]: I0312 16:55:21.680664 5184 reconciler_common.go:299] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-lock\") on node \"crc\" DevicePath \"\""
Mar 12 16:55:21 crc kubenswrapper[5184]: I0312 16:55:21.692574 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f7dbc7e1ee9c187a863ef9b473fad27b" (UID: "f7dbc7e1ee9c187a863ef9b473fad27b"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 12 16:55:21 crc kubenswrapper[5184]: I0312 16:55:21.781872 5184 reconciler_common.go:299] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Mar 12 16:55:21 crc kubenswrapper[5184]: I0312 16:55:21.816680 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f7dbc7e1ee9c187a863ef9b473fad27b/startup-monitor/0.log"
Mar 12 16:55:21 crc kubenswrapper[5184]: I0312 16:55:21.816746 5184 generic.go:358] "Generic (PLEG): container finished" podID="f7dbc7e1ee9c187a863ef9b473fad27b" containerID="f39418c66ff49f4f10718eab746afa4889f8ba40b32e6645da7c24e3c75c536e" exitCode=137
Mar 12 16:55:21 crc kubenswrapper[5184]: I0312 16:55:21.816799 5184 scope.go:117] "RemoveContainer" containerID="f39418c66ff49f4f10718eab746afa4889f8ba40b32e6645da7c24e3c75c536e"
Mar 12 16:55:21 crc kubenswrapper[5184]: I0312 16:55:21.816931 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 12 16:55:21 crc kubenswrapper[5184]: I0312 16:55:21.844043 5184 scope.go:117] "RemoveContainer" containerID="f39418c66ff49f4f10718eab746afa4889f8ba40b32e6645da7c24e3c75c536e"
Mar 12 16:55:21 crc kubenswrapper[5184]: E0312 16:55:21.844749 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f39418c66ff49f4f10718eab746afa4889f8ba40b32e6645da7c24e3c75c536e\": container with ID starting with f39418c66ff49f4f10718eab746afa4889f8ba40b32e6645da7c24e3c75c536e not found: ID does not exist" containerID="f39418c66ff49f4f10718eab746afa4889f8ba40b32e6645da7c24e3c75c536e"
Mar 12 16:55:21 crc kubenswrapper[5184]: I0312 16:55:21.844810 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f39418c66ff49f4f10718eab746afa4889f8ba40b32e6645da7c24e3c75c536e"} err="failed to get container status \"f39418c66ff49f4f10718eab746afa4889f8ba40b32e6645da7c24e3c75c536e\": rpc error: code = NotFound desc = could not find container \"f39418c66ff49f4f10718eab746afa4889f8ba40b32e6645da7c24e3c75c536e\": container with ID starting with f39418c66ff49f4f10718eab746afa4889f8ba40b32e6645da7c24e3c75c536e not found: ID does not exist"
Mar 12 16:55:22 crc kubenswrapper[5184]: I0312 16:55:22.411048 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" path="/var/lib/kubelet/pods/f7dbc7e1ee9c187a863ef9b473fad27b/volumes"
Mar 12 16:55:22 crc kubenswrapper[5184]: I0312 16:55:22.411827 5184 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID=""
Mar 12 16:55:22 crc kubenswrapper[5184]: I0312 16:55:22.429625 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 12 16:55:22 crc kubenswrapper[5184]: I0312 16:55:22.429686 5184 kubelet.go:2759] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="ba87400f-a817-4daf-b4af-15cdd74218ad"
Mar 12 16:55:22 crc kubenswrapper[5184]: I0312 16:55:22.436755 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 12 16:55:22 crc kubenswrapper[5184]: I0312 16:55:22.436819 5184 kubelet.go:2784] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="ba87400f-a817-4daf-b4af-15cdd74218ad"
Mar 12 16:55:42 crc kubenswrapper[5184]: I0312 16:55:42.368939 5184 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 12 16:55:50 crc kubenswrapper[5184]: I0312 16:55:50.742731 5184 patch_prober.go:28] interesting pod/machine-config-daemon-cp7pt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 12 16:55:50 crc kubenswrapper[5184]: I0312 16:55:50.744590 5184 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 12 16:55:50 crc kubenswrapper[5184]: I0312 16:55:50.744673 5184 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt"
Mar 12 16:55:50 crc kubenswrapper[5184]: I0312 16:55:50.745644 5184 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a794500127db524b745f6dfb40cb4c4c83a065628e7edf1a8c68e379958a7834"} pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 12 16:55:50 crc kubenswrapper[5184]: I0312 16:55:50.745725 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerName="machine-config-daemon" containerID="cri-o://a794500127db524b745f6dfb40cb4c4c83a065628e7edf1a8c68e379958a7834" gracePeriod=600
Mar 12 16:55:51 crc kubenswrapper[5184]: I0312 16:55:51.338326 5184 generic.go:358] "Generic (PLEG): container finished" podID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerID="a794500127db524b745f6dfb40cb4c4c83a065628e7edf1a8c68e379958a7834" exitCode=0
Mar 12 16:55:51 crc kubenswrapper[5184]: I0312 16:55:51.338429 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" event={"ID":"7b45c859-3d05-4214-9bd3-2952546f5dea","Type":"ContainerDied","Data":"a794500127db524b745f6dfb40cb4c4c83a065628e7edf1a8c68e379958a7834"}
Mar 12 16:55:51 crc kubenswrapper[5184]: I0312 16:55:51.339152 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" event={"ID":"7b45c859-3d05-4214-9bd3-2952546f5dea","Type":"ContainerStarted","Data":"7ce931eac957036c6c965318bd6ebe196835262d045725f0735bb9f9799bfd42"}
Mar 12 16:55:58 crc kubenswrapper[5184]: I0312 16:55:58.599074 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log"
Mar 12 16:55:58 crc kubenswrapper[5184]: I0312 16:55:58.599964 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log"
Mar 12 16:56:00 crc kubenswrapper[5184]: I0312 16:56:00.160324 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555576-8pcks"]
Mar 12 16:56:00 crc kubenswrapper[5184]: I0312 16:56:00.161828 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="05d9779c-971f-40a0-83e9-b21a6e9e9d2a" containerName="installer"
Mar 12 16:56:00 crc kubenswrapper[5184]: I0312 16:56:00.161849 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="05d9779c-971f-40a0-83e9-b21a6e9e9d2a" containerName="installer"
Mar 12 16:56:00 crc kubenswrapper[5184]: I0312 16:56:00.161866 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" containerName="startup-monitor"
Mar 12 16:56:00 crc kubenswrapper[5184]: I0312 16:56:00.161873 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" containerName="startup-monitor"
Mar 12 16:56:00 crc kubenswrapper[5184]: I0312 16:56:00.161999 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" containerName="startup-monitor"
Mar 12 16:56:00 crc kubenswrapper[5184]: I0312 16:56:00.162013 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="05d9779c-971f-40a0-83e9-b21a6e9e9d2a" containerName="installer"
Mar 12 16:56:00 crc kubenswrapper[5184]: I0312 16:56:00.167812 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555576-8pcks"
Mar 12 16:56:00 crc kubenswrapper[5184]: I0312 16:56:00.170690 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\""
Mar 12 16:56:00 crc kubenswrapper[5184]: I0312 16:56:00.170954 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\""
Mar 12 16:56:00 crc kubenswrapper[5184]: I0312 16:56:00.171048 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-f4gpz\""
Mar 12 16:56:00 crc kubenswrapper[5184]: I0312 16:56:00.172839 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555576-8pcks"]
Mar 12 16:56:00 crc kubenswrapper[5184]: I0312 16:56:00.298481 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmk49\" (UniqueName: \"kubernetes.io/projected/b4b9baef-c7b2-4789-a606-0b76f4a575c4-kube-api-access-qmk49\") pod \"auto-csr-approver-29555576-8pcks\" (UID: \"b4b9baef-c7b2-4789-a606-0b76f4a575c4\") " pod="openshift-infra/auto-csr-approver-29555576-8pcks"
Mar 12 16:56:00 crc kubenswrapper[5184]: I0312 16:56:00.401118 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qmk49\" (UniqueName: \"kubernetes.io/projected/b4b9baef-c7b2-4789-a606-0b76f4a575c4-kube-api-access-qmk49\") pod \"auto-csr-approver-29555576-8pcks\" (UID: \"b4b9baef-c7b2-4789-a606-0b76f4a575c4\") " pod="openshift-infra/auto-csr-approver-29555576-8pcks"
Mar 12 16:56:00 crc kubenswrapper[5184]: I0312 16:56:00.429058 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmk49\" (UniqueName: \"kubernetes.io/projected/b4b9baef-c7b2-4789-a606-0b76f4a575c4-kube-api-access-qmk49\") pod \"auto-csr-approver-29555576-8pcks\" (UID: \"b4b9baef-c7b2-4789-a606-0b76f4a575c4\") " pod="openshift-infra/auto-csr-approver-29555576-8pcks"
Mar 12 16:56:00 crc kubenswrapper[5184]: I0312 16:56:00.486017 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555576-8pcks"
Mar 12 16:56:00 crc kubenswrapper[5184]: I0312 16:56:00.909443 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555576-8pcks"]
Mar 12 16:56:00 crc kubenswrapper[5184]: W0312 16:56:00.913646 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4b9baef_c7b2_4789_a606_0b76f4a575c4.slice/crio-ffb5a1aa0a336b9d13583667fcca3ecb54ed4ad1575d1128143025f203f2e3b8 WatchSource:0}: Error finding container ffb5a1aa0a336b9d13583667fcca3ecb54ed4ad1575d1128143025f203f2e3b8: Status 404 returned error can't find the container with id ffb5a1aa0a336b9d13583667fcca3ecb54ed4ad1575d1128143025f203f2e3b8
Mar 12 16:56:00 crc kubenswrapper[5184]: I0312 16:56:00.915755 5184 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 12 16:56:01 crc kubenswrapper[5184]: I0312 16:56:01.418787 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555576-8pcks" event={"ID":"b4b9baef-c7b2-4789-a606-0b76f4a575c4","Type":"ContainerStarted","Data":"ffb5a1aa0a336b9d13583667fcca3ecb54ed4ad1575d1128143025f203f2e3b8"}
Mar 12 16:56:01 crc kubenswrapper[5184]: I0312 16:56:01.540944 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hwnbt"]
Mar 12 16:56:01 crc kubenswrapper[5184]: I0312 16:56:01.541692 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hwnbt" podUID="27cbd345-0044-49d8-9192-d193df4c579e" containerName="registry-server" containerID="cri-o://ddfe97c00cb5848f902cf4e7d2fb994e57751b216cd123458d1422462332ba5f" gracePeriod=30
Mar 12 16:56:01 crc kubenswrapper[5184]: I0312 16:56:01.544934 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nk4rr"]
Mar 12 16:56:01 crc kubenswrapper[5184]: I0312 16:56:01.545227 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nk4rr" podUID="ce605906-7727-4682-83a5-e18f9faeb789" containerName="registry-server" containerID="cri-o://38e56a6b1c7063f2e3c6bb54d938ed57f9434b17f77a082c02bef910d5358da1" gracePeriod=30
Mar 12 16:56:01 crc kubenswrapper[5184]: I0312 16:56:01.564765 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-547dbd544d-dpld6"]
Mar 12 16:56:01 crc kubenswrapper[5184]: I0312 16:56:01.565464 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-547dbd544d-dpld6" podUID="5ad036a8-381e-4761-a20f-8d8b9a3e9408" containerName="marketplace-operator" containerID="cri-o://babc3dcdcf33f05dd5b318ec86df41d4bb1dcc045f74fbf083f61c9eb7d044e2" gracePeriod=30
Mar 12 16:56:01 crc kubenswrapper[5184]: I0312 16:56:01.573840 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6mx8"]
Mar 12 16:56:01 crc kubenswrapper[5184]: I0312 16:56:01.574179 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-h6mx8" podUID="8df70bc7-e513-41bf-94d8-5f79a9d10b64" containerName="registry-server" containerID="cri-o://7337229473e6aab1dfd2030dcf3b2053be4c0546e8ed408adadf15c56f5ed260" gracePeriod=30
Mar 12 16:56:01 crc kubenswrapper[5184]: I0312 16:56:01.586640 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cx6mk"]
Mar 12 16:56:01 crc kubenswrapper[5184]: I0312 16:56:01.587008 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cx6mk" podUID="45fdd519-1630-4afa-9780-7325691d8206" containerName="registry-server" containerID="cri-o://45b01fa1687f48aec19b380db4e47ef576619e14bfd9357453759de15ad3e3ba" gracePeriod=30
Mar 12 16:56:01 crc kubenswrapper[5184]: I0312 16:56:01.606609 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-547dbd544d-hpf6l"]
Mar 12 16:56:01 crc kubenswrapper[5184]: I0312 16:56:01.609933 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-547dbd544d-hpf6l"
Mar 12 16:56:01 crc kubenswrapper[5184]: I0312 16:56:01.620941 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-547dbd544d-hpf6l"]
Mar 12 16:56:01 crc kubenswrapper[5184]: E0312 16:56:01.701511 5184 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 45b01fa1687f48aec19b380db4e47ef576619e14bfd9357453759de15ad3e3ba is running failed: container process not found" containerID="45b01fa1687f48aec19b380db4e47ef576619e14bfd9357453759de15ad3e3ba" cmd=["grpc_health_probe","-addr=:50051"]
Mar 12 16:56:01 crc kubenswrapper[5184]: E0312 16:56:01.702028 5184 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 45b01fa1687f48aec19b380db4e47ef576619e14bfd9357453759de15ad3e3ba is running failed: container process not found" containerID="45b01fa1687f48aec19b380db4e47ef576619e14bfd9357453759de15ad3e3ba" cmd=["grpc_health_probe","-addr=:50051"]
Mar 12 16:56:01 crc kubenswrapper[5184]: E0312 16:56:01.702315 5184 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 45b01fa1687f48aec19b380db4e47ef576619e14bfd9357453759de15ad3e3ba is running failed: container process not found" containerID="45b01fa1687f48aec19b380db4e47ef576619e14bfd9357453759de15ad3e3ba" cmd=["grpc_health_probe","-addr=:50051"]
Mar 12 16:56:01 crc kubenswrapper[5184]: E0312 16:56:01.702355 5184 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 45b01fa1687f48aec19b380db4e47ef576619e14bfd9357453759de15ad3e3ba is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-cx6mk" podUID="45fdd519-1630-4afa-9780-7325691d8206" containerName="registry-server" probeResult="unknown"
Mar 12 16:56:01 crc kubenswrapper[5184]: I0312 16:56:01.722335 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5ecb4f29-01ec-4c15-8455-30a8b8623f6d-marketplace-trusted-ca\") pod \"marketplace-operator-547dbd544d-hpf6l\" (UID: \"5ecb4f29-01ec-4c15-8455-30a8b8623f6d\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-hpf6l"
Mar 12 16:56:01 crc kubenswrapper[5184]: I0312 16:56:01.722433 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5ecb4f29-01ec-4c15-8455-30a8b8623f6d-marketplace-operator-metrics\") pod \"marketplace-operator-547dbd544d-hpf6l\" (UID: \"5ecb4f29-01ec-4c15-8455-30a8b8623f6d\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-hpf6l"
Mar 12 16:56:01 crc kubenswrapper[5184]: I0312 16:56:01.722459 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvqrr\" (UniqueName: \"kubernetes.io/projected/5ecb4f29-01ec-4c15-8455-30a8b8623f6d-kube-api-access-rvqrr\") pod \"marketplace-operator-547dbd544d-hpf6l\" (UID: \"5ecb4f29-01ec-4c15-8455-30a8b8623f6d\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-hpf6l"
Mar 12 16:56:01 crc kubenswrapper[5184]: I0312 16:56:01.722481 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5ecb4f29-01ec-4c15-8455-30a8b8623f6d-tmp\") pod \"marketplace-operator-547dbd544d-hpf6l\" (UID: \"5ecb4f29-01ec-4c15-8455-30a8b8623f6d\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-hpf6l"
Mar 12 16:56:01 crc kubenswrapper[5184]: I0312 16:56:01.825009 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5ecb4f29-01ec-4c15-8455-30a8b8623f6d-marketplace-operator-metrics\") pod \"marketplace-operator-547dbd544d-hpf6l\" (UID: \"5ecb4f29-01ec-4c15-8455-30a8b8623f6d\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-hpf6l"
Mar 12 16:56:01 crc kubenswrapper[5184]: I0312 16:56:01.825051 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvqrr\" (UniqueName: \"kubernetes.io/projected/5ecb4f29-01ec-4c15-8455-30a8b8623f6d-kube-api-access-rvqrr\") pod \"marketplace-operator-547dbd544d-hpf6l\" (UID: \"5ecb4f29-01ec-4c15-8455-30a8b8623f6d\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-hpf6l"
Mar 12 16:56:01 crc kubenswrapper[5184]: I0312 16:56:01.825073 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5ecb4f29-01ec-4c15-8455-30a8b8623f6d-tmp\") pod \"marketplace-operator-547dbd544d-hpf6l\" (UID: \"5ecb4f29-01ec-4c15-8455-30a8b8623f6d\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-hpf6l"
Mar 12 16:56:01 crc kubenswrapper[5184]: I0312 16:56:01.825109 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5ecb4f29-01ec-4c15-8455-30a8b8623f6d-marketplace-trusted-ca\") pod \"marketplace-operator-547dbd544d-hpf6l\" (UID: \"5ecb4f29-01ec-4c15-8455-30a8b8623f6d\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-hpf6l"
Mar 12 16:56:01 crc kubenswrapper[5184]: I0312 16:56:01.826139 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5ecb4f29-01ec-4c15-8455-30a8b8623f6d-tmp\") pod \"marketplace-operator-547dbd544d-hpf6l\" (UID: \"5ecb4f29-01ec-4c15-8455-30a8b8623f6d\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-hpf6l"
Mar 12 16:56:01 crc kubenswrapper[5184]: I0312 16:56:01.826796 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5ecb4f29-01ec-4c15-8455-30a8b8623f6d-marketplace-trusted-ca\") pod \"marketplace-operator-547dbd544d-hpf6l\" (UID: \"5ecb4f29-01ec-4c15-8455-30a8b8623f6d\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-hpf6l"
Mar 12 16:56:01 crc kubenswrapper[5184]: I0312 16:56:01.847945 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5ecb4f29-01ec-4c15-8455-30a8b8623f6d-marketplace-operator-metrics\") pod \"marketplace-operator-547dbd544d-hpf6l\" (UID: \"5ecb4f29-01ec-4c15-8455-30a8b8623f6d\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-hpf6l"
Mar 12 16:56:01 crc kubenswrapper[5184]: I0312 16:56:01.858623 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvqrr\" (UniqueName: \"kubernetes.io/projected/5ecb4f29-01ec-4c15-8455-30a8b8623f6d-kube-api-access-rvqrr\") pod \"marketplace-operator-547dbd544d-hpf6l\" (UID: \"5ecb4f29-01ec-4c15-8455-30a8b8623f6d\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-hpf6l"
Mar 12 16:56:01 crc kubenswrapper[5184]: I0312 16:56:01.949399 5184
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-547dbd544d-hpf6l" Mar 12 16:56:01 crc kubenswrapper[5184]: I0312 16:56:01.952313 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nk4rr" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.026902 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce605906-7727-4682-83a5-e18f9faeb789-utilities\") pod \"ce605906-7727-4682-83a5-e18f9faeb789\" (UID: \"ce605906-7727-4682-83a5-e18f9faeb789\") " Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.027083 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzv8k\" (UniqueName: \"kubernetes.io/projected/ce605906-7727-4682-83a5-e18f9faeb789-kube-api-access-gzv8k\") pod \"ce605906-7727-4682-83a5-e18f9faeb789\" (UID: \"ce605906-7727-4682-83a5-e18f9faeb789\") " Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.027145 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce605906-7727-4682-83a5-e18f9faeb789-catalog-content\") pod \"ce605906-7727-4682-83a5-e18f9faeb789\" (UID: \"ce605906-7727-4682-83a5-e18f9faeb789\") " Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.030282 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce605906-7727-4682-83a5-e18f9faeb789-utilities" (OuterVolumeSpecName: "utilities") pod "ce605906-7727-4682-83a5-e18f9faeb789" (UID: "ce605906-7727-4682-83a5-e18f9faeb789"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.033782 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce605906-7727-4682-83a5-e18f9faeb789-kube-api-access-gzv8k" (OuterVolumeSpecName: "kube-api-access-gzv8k") pod "ce605906-7727-4682-83a5-e18f9faeb789" (UID: "ce605906-7727-4682-83a5-e18f9faeb789"). InnerVolumeSpecName "kube-api-access-gzv8k". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.042348 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hwnbt" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.095577 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce605906-7727-4682-83a5-e18f9faeb789-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ce605906-7727-4682-83a5-e18f9faeb789" (UID: "ce605906-7727-4682-83a5-e18f9faeb789"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.132782 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27cbd345-0044-49d8-9192-d193df4c579e-catalog-content\") pod \"27cbd345-0044-49d8-9192-d193df4c579e\" (UID: \"27cbd345-0044-49d8-9192-d193df4c579e\") " Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.132892 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27cbd345-0044-49d8-9192-d193df4c579e-utilities\") pod \"27cbd345-0044-49d8-9192-d193df4c579e\" (UID: \"27cbd345-0044-49d8-9192-d193df4c579e\") " Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.132922 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58dtg\" (UniqueName: \"kubernetes.io/projected/27cbd345-0044-49d8-9192-d193df4c579e-kube-api-access-58dtg\") pod \"27cbd345-0044-49d8-9192-d193df4c579e\" (UID: \"27cbd345-0044-49d8-9192-d193df4c579e\") " Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.133141 5184 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce605906-7727-4682-83a5-e18f9faeb789-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.133157 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gzv8k\" (UniqueName: \"kubernetes.io/projected/ce605906-7727-4682-83a5-e18f9faeb789-kube-api-access-gzv8k\") on node \"crc\" DevicePath \"\"" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.133167 5184 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce605906-7727-4682-83a5-e18f9faeb789-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.135233 
5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27cbd345-0044-49d8-9192-d193df4c579e-utilities" (OuterVolumeSpecName: "utilities") pod "27cbd345-0044-49d8-9192-d193df4c579e" (UID: "27cbd345-0044-49d8-9192-d193df4c579e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.147514 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27cbd345-0044-49d8-9192-d193df4c579e-kube-api-access-58dtg" (OuterVolumeSpecName: "kube-api-access-58dtg") pod "27cbd345-0044-49d8-9192-d193df4c579e" (UID: "27cbd345-0044-49d8-9192-d193df4c579e"). InnerVolumeSpecName "kube-api-access-58dtg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.176179 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h6mx8" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.212701 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27cbd345-0044-49d8-9192-d193df4c579e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "27cbd345-0044-49d8-9192-d193df4c579e" (UID: "27cbd345-0044-49d8-9192-d193df4c579e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.216611 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cx6mk" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.217821 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-547dbd544d-dpld6" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.226668 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-547dbd544d-hpf6l"] Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.233588 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8df70bc7-e513-41bf-94d8-5f79a9d10b64-utilities\") pod \"8df70bc7-e513-41bf-94d8-5f79a9d10b64\" (UID: \"8df70bc7-e513-41bf-94d8-5f79a9d10b64\") " Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.233758 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sslkr\" (UniqueName: \"kubernetes.io/projected/8df70bc7-e513-41bf-94d8-5f79a9d10b64-kube-api-access-sslkr\") pod \"8df70bc7-e513-41bf-94d8-5f79a9d10b64\" (UID: \"8df70bc7-e513-41bf-94d8-5f79a9d10b64\") " Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.234410 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8df70bc7-e513-41bf-94d8-5f79a9d10b64-catalog-content\") pod \"8df70bc7-e513-41bf-94d8-5f79a9d10b64\" (UID: \"8df70bc7-e513-41bf-94d8-5f79a9d10b64\") " Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.234922 5184 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27cbd345-0044-49d8-9192-d193df4c579e-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.234945 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-58dtg\" (UniqueName: \"kubernetes.io/projected/27cbd345-0044-49d8-9192-d193df4c579e-kube-api-access-58dtg\") on node \"crc\" DevicePath \"\"" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.234959 5184 reconciler_common.go:299] "Volume detached for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27cbd345-0044-49d8-9192-d193df4c579e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.237615 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8df70bc7-e513-41bf-94d8-5f79a9d10b64-utilities" (OuterVolumeSpecName: "utilities") pod "8df70bc7-e513-41bf-94d8-5f79a9d10b64" (UID: "8df70bc7-e513-41bf-94d8-5f79a9d10b64"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.247301 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8df70bc7-e513-41bf-94d8-5f79a9d10b64-kube-api-access-sslkr" (OuterVolumeSpecName: "kube-api-access-sslkr") pod "8df70bc7-e513-41bf-94d8-5f79a9d10b64" (UID: "8df70bc7-e513-41bf-94d8-5f79a9d10b64"). InnerVolumeSpecName "kube-api-access-sslkr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.269064 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8df70bc7-e513-41bf-94d8-5f79a9d10b64-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8df70bc7-e513-41bf-94d8-5f79a9d10b64" (UID: "8df70bc7-e513-41bf-94d8-5f79a9d10b64"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.335726 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45fdd519-1630-4afa-9780-7325691d8206-utilities\") pod \"45fdd519-1630-4afa-9780-7325691d8206\" (UID: \"45fdd519-1630-4afa-9780-7325691d8206\") " Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.336034 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfz9t\" (UniqueName: \"kubernetes.io/projected/5ad036a8-381e-4761-a20f-8d8b9a3e9408-kube-api-access-kfz9t\") pod \"5ad036a8-381e-4761-a20f-8d8b9a3e9408\" (UID: \"5ad036a8-381e-4761-a20f-8d8b9a3e9408\") " Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.336069 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qv8bf\" (UniqueName: \"kubernetes.io/projected/45fdd519-1630-4afa-9780-7325691d8206-kube-api-access-qv8bf\") pod \"45fdd519-1630-4afa-9780-7325691d8206\" (UID: \"45fdd519-1630-4afa-9780-7325691d8206\") " Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.336242 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5ad036a8-381e-4761-a20f-8d8b9a3e9408-marketplace-trusted-ca\") pod \"5ad036a8-381e-4761-a20f-8d8b9a3e9408\" (UID: \"5ad036a8-381e-4761-a20f-8d8b9a3e9408\") " Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.336464 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5ad036a8-381e-4761-a20f-8d8b9a3e9408-tmp\") pod \"5ad036a8-381e-4761-a20f-8d8b9a3e9408\" (UID: \"5ad036a8-381e-4761-a20f-8d8b9a3e9408\") " Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.336497 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5ad036a8-381e-4761-a20f-8d8b9a3e9408-marketplace-operator-metrics\") pod \"5ad036a8-381e-4761-a20f-8d8b9a3e9408\" (UID: \"5ad036a8-381e-4761-a20f-8d8b9a3e9408\") " Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.336513 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45fdd519-1630-4afa-9780-7325691d8206-catalog-content\") pod \"45fdd519-1630-4afa-9780-7325691d8206\" (UID: \"45fdd519-1630-4afa-9780-7325691d8206\") " Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.336667 5184 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8df70bc7-e513-41bf-94d8-5f79a9d10b64-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.336677 5184 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8df70bc7-e513-41bf-94d8-5f79a9d10b64-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.336685 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sslkr\" (UniqueName: \"kubernetes.io/projected/8df70bc7-e513-41bf-94d8-5f79a9d10b64-kube-api-access-sslkr\") on node \"crc\" DevicePath \"\"" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.336683 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45fdd519-1630-4afa-9780-7325691d8206-utilities" (OuterVolumeSpecName: "utilities") pod "45fdd519-1630-4afa-9780-7325691d8206" (UID: "45fdd519-1630-4afa-9780-7325691d8206"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.336983 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ad036a8-381e-4761-a20f-8d8b9a3e9408-tmp" (OuterVolumeSpecName: "tmp") pod "5ad036a8-381e-4761-a20f-8d8b9a3e9408" (UID: "5ad036a8-381e-4761-a20f-8d8b9a3e9408"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.337127 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ad036a8-381e-4761-a20f-8d8b9a3e9408-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "5ad036a8-381e-4761-a20f-8d8b9a3e9408" (UID: "5ad036a8-381e-4761-a20f-8d8b9a3e9408"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.339567 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45fdd519-1630-4afa-9780-7325691d8206-kube-api-access-qv8bf" (OuterVolumeSpecName: "kube-api-access-qv8bf") pod "45fdd519-1630-4afa-9780-7325691d8206" (UID: "45fdd519-1630-4afa-9780-7325691d8206"). InnerVolumeSpecName "kube-api-access-qv8bf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.339955 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ad036a8-381e-4761-a20f-8d8b9a3e9408-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "5ad036a8-381e-4761-a20f-8d8b9a3e9408" (UID: "5ad036a8-381e-4761-a20f-8d8b9a3e9408"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.340498 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ad036a8-381e-4761-a20f-8d8b9a3e9408-kube-api-access-kfz9t" (OuterVolumeSpecName: "kube-api-access-kfz9t") pod "5ad036a8-381e-4761-a20f-8d8b9a3e9408" (UID: "5ad036a8-381e-4761-a20f-8d8b9a3e9408"). InnerVolumeSpecName "kube-api-access-kfz9t". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.437955 5184 reconciler_common.go:299] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5ad036a8-381e-4761-a20f-8d8b9a3e9408-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.438289 5184 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5ad036a8-381e-4761-a20f-8d8b9a3e9408-tmp\") on node \"crc\" DevicePath \"\"" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.438304 5184 reconciler_common.go:299] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5ad036a8-381e-4761-a20f-8d8b9a3e9408-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.438316 5184 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45fdd519-1630-4afa-9780-7325691d8206-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.438329 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kfz9t\" (UniqueName: \"kubernetes.io/projected/5ad036a8-381e-4761-a20f-8d8b9a3e9408-kube-api-access-kfz9t\") on node \"crc\" DevicePath \"\"" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.438342 5184 reconciler_common.go:299] "Volume detached for 
volume \"kube-api-access-qv8bf\" (UniqueName: \"kubernetes.io/projected/45fdd519-1630-4afa-9780-7325691d8206-kube-api-access-qv8bf\") on node \"crc\" DevicePath \"\"" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.445111 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45fdd519-1630-4afa-9780-7325691d8206-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "45fdd519-1630-4afa-9780-7325691d8206" (UID: "45fdd519-1630-4afa-9780-7325691d8206"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.447445 5184 generic.go:358] "Generic (PLEG): container finished" podID="45fdd519-1630-4afa-9780-7325691d8206" containerID="45b01fa1687f48aec19b380db4e47ef576619e14bfd9357453759de15ad3e3ba" exitCode=0 Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.447512 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cx6mk" event={"ID":"45fdd519-1630-4afa-9780-7325691d8206","Type":"ContainerDied","Data":"45b01fa1687f48aec19b380db4e47ef576619e14bfd9357453759de15ad3e3ba"} Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.447566 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cx6mk" event={"ID":"45fdd519-1630-4afa-9780-7325691d8206","Type":"ContainerDied","Data":"4ea01e3d73dec474a2b62419f36d08c7bdd9b848d9d785f3349dbe9490ad3a91"} Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.447584 5184 scope.go:117] "RemoveContainer" containerID="45b01fa1687f48aec19b380db4e47ef576619e14bfd9357453759de15ad3e3ba" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.447638 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cx6mk" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.452980 5184 generic.go:358] "Generic (PLEG): container finished" podID="ce605906-7727-4682-83a5-e18f9faeb789" containerID="38e56a6b1c7063f2e3c6bb54d938ed57f9434b17f77a082c02bef910d5358da1" exitCode=0 Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.453059 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nk4rr" event={"ID":"ce605906-7727-4682-83a5-e18f9faeb789","Type":"ContainerDied","Data":"38e56a6b1c7063f2e3c6bb54d938ed57f9434b17f77a082c02bef910d5358da1"} Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.453084 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nk4rr" event={"ID":"ce605906-7727-4682-83a5-e18f9faeb789","Type":"ContainerDied","Data":"e1058d2442fe7b9790b303f5f5adcf5cbc9e0b1f1c0606ef603ca1c53eb51671"} Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.453167 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nk4rr" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.459210 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h6mx8" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.459220 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6mx8" event={"ID":"8df70bc7-e513-41bf-94d8-5f79a9d10b64","Type":"ContainerDied","Data":"7337229473e6aab1dfd2030dcf3b2053be4c0546e8ed408adadf15c56f5ed260"} Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.459305 5184 generic.go:358] "Generic (PLEG): container finished" podID="8df70bc7-e513-41bf-94d8-5f79a9d10b64" containerID="7337229473e6aab1dfd2030dcf3b2053be4c0546e8ed408adadf15c56f5ed260" exitCode=0 Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.459564 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6mx8" event={"ID":"8df70bc7-e513-41bf-94d8-5f79a9d10b64","Type":"ContainerDied","Data":"05937773a3d9af59d71300ee09b09ae8c3378a952416db356f2fd084a0384e70"} Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.463344 5184 generic.go:358] "Generic (PLEG): container finished" podID="5ad036a8-381e-4761-a20f-8d8b9a3e9408" containerID="babc3dcdcf33f05dd5b318ec86df41d4bb1dcc045f74fbf083f61c9eb7d044e2" exitCode=0 Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.463472 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-547dbd544d-dpld6" event={"ID":"5ad036a8-381e-4761-a20f-8d8b9a3e9408","Type":"ContainerDied","Data":"babc3dcdcf33f05dd5b318ec86df41d4bb1dcc045f74fbf083f61c9eb7d044e2"} Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.463497 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-547dbd544d-dpld6" event={"ID":"5ad036a8-381e-4761-a20f-8d8b9a3e9408","Type":"ContainerDied","Data":"73a2dcd0415c36c1f0542204ea1babe908c9e7a2189f4c874f45c0760ae543c4"} Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.463564 5184 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-547dbd544d-dpld6" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.465893 5184 generic.go:358] "Generic (PLEG): container finished" podID="27cbd345-0044-49d8-9192-d193df4c579e" containerID="ddfe97c00cb5848f902cf4e7d2fb994e57751b216cd123458d1422462332ba5f" exitCode=0 Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.465957 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hwnbt" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.466160 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hwnbt" event={"ID":"27cbd345-0044-49d8-9192-d193df4c579e","Type":"ContainerDied","Data":"ddfe97c00cb5848f902cf4e7d2fb994e57751b216cd123458d1422462332ba5f"} Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.466756 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hwnbt" event={"ID":"27cbd345-0044-49d8-9192-d193df4c579e","Type":"ContainerDied","Data":"c372d508f0c1a43116cb86b006c3f96f27219d02cc05f5feca0d4f0deb7326fe"} Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.469725 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-547dbd544d-hpf6l" event={"ID":"5ecb4f29-01ec-4c15-8455-30a8b8623f6d","Type":"ContainerStarted","Data":"16cf4cd32d7197c1511b656829f27b9a0bdd8ea3cf473b0ab2dba51b44c1bc4d"} Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.469765 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-547dbd544d-hpf6l" event={"ID":"5ecb4f29-01ec-4c15-8455-30a8b8623f6d","Type":"ContainerStarted","Data":"f7fbc798739f89702aa822c201eb0c1fa3a6067e22a851076a1a3488afc724a2"} Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.470600 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="openshift-marketplace/marketplace-operator-547dbd544d-hpf6l" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.472233 5184 patch_prober.go:28] interesting pod/marketplace-operator-547dbd544d-hpf6l container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.69:8080/healthz\": dial tcp 10.217.0.69:8080: connect: connection refused" start-of-body= Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.472283 5184 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-547dbd544d-hpf6l" podUID="5ecb4f29-01ec-4c15-8455-30a8b8623f6d" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.69:8080/healthz\": dial tcp 10.217.0.69:8080: connect: connection refused" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.476808 5184 scope.go:117] "RemoveContainer" containerID="71f593847d718ebf0236efe4e0c4daa8926b524ae4b680eb6a6c7f1f90f2dba5" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.505249 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-547dbd544d-hpf6l" podStartSLOduration=1.505231862 podStartE2EDuration="1.505231862s" podCreationTimestamp="2026-03-12 16:56:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 16:56:02.502576456 +0000 UTC m=+305.043887795" watchObservedRunningTime="2026-03-12 16:56:02.505231862 +0000 UTC m=+305.046543221" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.517291 5184 scope.go:117] "RemoveContainer" containerID="4e652cc7bd8111fa6ae207645ca7e190e077446464c7c723c08598922e17027d" Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.518591 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nk4rr"] Mar 12 16:56:02 crc kubenswrapper[5184]: 
I0312 16:56:02.522576 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nk4rr"]
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.530881 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cx6mk"]
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.535202 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cx6mk"]
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.539887 5184 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45fdd519-1630-4afa-9780-7325691d8206-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.541663 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6mx8"]
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.546535 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6mx8"]
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.548529 5184 scope.go:117] "RemoveContainer" containerID="45b01fa1687f48aec19b380db4e47ef576619e14bfd9357453759de15ad3e3ba"
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.551818 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hwnbt"]
Mar 12 16:56:02 crc kubenswrapper[5184]: E0312 16:56:02.552010 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45b01fa1687f48aec19b380db4e47ef576619e14bfd9357453759de15ad3e3ba\": container with ID starting with 45b01fa1687f48aec19b380db4e47ef576619e14bfd9357453759de15ad3e3ba not found: ID does not exist" containerID="45b01fa1687f48aec19b380db4e47ef576619e14bfd9357453759de15ad3e3ba"
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.552054 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45b01fa1687f48aec19b380db4e47ef576619e14bfd9357453759de15ad3e3ba"} err="failed to get container status \"45b01fa1687f48aec19b380db4e47ef576619e14bfd9357453759de15ad3e3ba\": rpc error: code = NotFound desc = could not find container \"45b01fa1687f48aec19b380db4e47ef576619e14bfd9357453759de15ad3e3ba\": container with ID starting with 45b01fa1687f48aec19b380db4e47ef576619e14bfd9357453759de15ad3e3ba not found: ID does not exist"
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.552082 5184 scope.go:117] "RemoveContainer" containerID="71f593847d718ebf0236efe4e0c4daa8926b524ae4b680eb6a6c7f1f90f2dba5"
Mar 12 16:56:02 crc kubenswrapper[5184]: E0312 16:56:02.553363 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71f593847d718ebf0236efe4e0c4daa8926b524ae4b680eb6a6c7f1f90f2dba5\": container with ID starting with 71f593847d718ebf0236efe4e0c4daa8926b524ae4b680eb6a6c7f1f90f2dba5 not found: ID does not exist" containerID="71f593847d718ebf0236efe4e0c4daa8926b524ae4b680eb6a6c7f1f90f2dba5"
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.553427 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71f593847d718ebf0236efe4e0c4daa8926b524ae4b680eb6a6c7f1f90f2dba5"} err="failed to get container status \"71f593847d718ebf0236efe4e0c4daa8926b524ae4b680eb6a6c7f1f90f2dba5\": rpc error: code = NotFound desc = could not find container \"71f593847d718ebf0236efe4e0c4daa8926b524ae4b680eb6a6c7f1f90f2dba5\": container with ID starting with 71f593847d718ebf0236efe4e0c4daa8926b524ae4b680eb6a6c7f1f90f2dba5 not found: ID does not exist"
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.553454 5184 scope.go:117] "RemoveContainer" containerID="4e652cc7bd8111fa6ae207645ca7e190e077446464c7c723c08598922e17027d"
Mar 12 16:56:02 crc kubenswrapper[5184]: E0312 16:56:02.553825 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e652cc7bd8111fa6ae207645ca7e190e077446464c7c723c08598922e17027d\": container with ID starting with 4e652cc7bd8111fa6ae207645ca7e190e077446464c7c723c08598922e17027d not found: ID does not exist" containerID="4e652cc7bd8111fa6ae207645ca7e190e077446464c7c723c08598922e17027d"
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.553852 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e652cc7bd8111fa6ae207645ca7e190e077446464c7c723c08598922e17027d"} err="failed to get container status \"4e652cc7bd8111fa6ae207645ca7e190e077446464c7c723c08598922e17027d\": rpc error: code = NotFound desc = could not find container \"4e652cc7bd8111fa6ae207645ca7e190e077446464c7c723c08598922e17027d\": container with ID starting with 4e652cc7bd8111fa6ae207645ca7e190e077446464c7c723c08598922e17027d not found: ID does not exist"
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.553883 5184 scope.go:117] "RemoveContainer" containerID="38e56a6b1c7063f2e3c6bb54d938ed57f9434b17f77a082c02bef910d5358da1"
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.556278 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hwnbt"]
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.569986 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-547dbd544d-dpld6"]
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.576017 5184 scope.go:117] "RemoveContainer" containerID="dd1913dbe819be5f4c8b9c603a67d678f8b7d37f07e9aea2a7fd4a66ad2b5b1b"
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.578671 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-547dbd544d-dpld6"]
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.592499 5184 scope.go:117] "RemoveContainer" containerID="9aaee5914a38ad6225853fbcb0312ab339cd1fa35396e860bd749582ef4494cf"
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.610362 5184 scope.go:117] "RemoveContainer" containerID="38e56a6b1c7063f2e3c6bb54d938ed57f9434b17f77a082c02bef910d5358da1"
Mar 12 16:56:02 crc kubenswrapper[5184]: E0312 16:56:02.611334 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38e56a6b1c7063f2e3c6bb54d938ed57f9434b17f77a082c02bef910d5358da1\": container with ID starting with 38e56a6b1c7063f2e3c6bb54d938ed57f9434b17f77a082c02bef910d5358da1 not found: ID does not exist" containerID="38e56a6b1c7063f2e3c6bb54d938ed57f9434b17f77a082c02bef910d5358da1"
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.611395 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38e56a6b1c7063f2e3c6bb54d938ed57f9434b17f77a082c02bef910d5358da1"} err="failed to get container status \"38e56a6b1c7063f2e3c6bb54d938ed57f9434b17f77a082c02bef910d5358da1\": rpc error: code = NotFound desc = could not find container \"38e56a6b1c7063f2e3c6bb54d938ed57f9434b17f77a082c02bef910d5358da1\": container with ID starting with 38e56a6b1c7063f2e3c6bb54d938ed57f9434b17f77a082c02bef910d5358da1 not found: ID does not exist"
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.611425 5184 scope.go:117] "RemoveContainer" containerID="dd1913dbe819be5f4c8b9c603a67d678f8b7d37f07e9aea2a7fd4a66ad2b5b1b"
Mar 12 16:56:02 crc kubenswrapper[5184]: E0312 16:56:02.611719 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd1913dbe819be5f4c8b9c603a67d678f8b7d37f07e9aea2a7fd4a66ad2b5b1b\": container with ID starting with dd1913dbe819be5f4c8b9c603a67d678f8b7d37f07e9aea2a7fd4a66ad2b5b1b not found: ID does not exist" containerID="dd1913dbe819be5f4c8b9c603a67d678f8b7d37f07e9aea2a7fd4a66ad2b5b1b"
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.611759 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd1913dbe819be5f4c8b9c603a67d678f8b7d37f07e9aea2a7fd4a66ad2b5b1b"} err="failed to get container status \"dd1913dbe819be5f4c8b9c603a67d678f8b7d37f07e9aea2a7fd4a66ad2b5b1b\": rpc error: code = NotFound desc = could not find container \"dd1913dbe819be5f4c8b9c603a67d678f8b7d37f07e9aea2a7fd4a66ad2b5b1b\": container with ID starting with dd1913dbe819be5f4c8b9c603a67d678f8b7d37f07e9aea2a7fd4a66ad2b5b1b not found: ID does not exist"
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.611784 5184 scope.go:117] "RemoveContainer" containerID="9aaee5914a38ad6225853fbcb0312ab339cd1fa35396e860bd749582ef4494cf"
Mar 12 16:56:02 crc kubenswrapper[5184]: E0312 16:56:02.612132 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9aaee5914a38ad6225853fbcb0312ab339cd1fa35396e860bd749582ef4494cf\": container with ID starting with 9aaee5914a38ad6225853fbcb0312ab339cd1fa35396e860bd749582ef4494cf not found: ID does not exist" containerID="9aaee5914a38ad6225853fbcb0312ab339cd1fa35396e860bd749582ef4494cf"
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.612161 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9aaee5914a38ad6225853fbcb0312ab339cd1fa35396e860bd749582ef4494cf"} err="failed to get container status \"9aaee5914a38ad6225853fbcb0312ab339cd1fa35396e860bd749582ef4494cf\": rpc error: code = NotFound desc = could not find container \"9aaee5914a38ad6225853fbcb0312ab339cd1fa35396e860bd749582ef4494cf\": container with ID starting with 9aaee5914a38ad6225853fbcb0312ab339cd1fa35396e860bd749582ef4494cf not found: ID does not exist"
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.612180 5184 scope.go:117] "RemoveContainer" containerID="7337229473e6aab1dfd2030dcf3b2053be4c0546e8ed408adadf15c56f5ed260"
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.627307 5184 scope.go:117] "RemoveContainer" containerID="50bd2b86b160ace1601a202bec83216e38668fe85a7257a3db3b369e3d812cff"
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.650105 5184 scope.go:117] "RemoveContainer" containerID="a73baf1fd6a73cb75ad60effffe140705cce6a9aa946be52962b53484bef9403"
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.665918 5184 scope.go:117] "RemoveContainer" containerID="7337229473e6aab1dfd2030dcf3b2053be4c0546e8ed408adadf15c56f5ed260"
Mar 12 16:56:02 crc kubenswrapper[5184]: E0312 16:56:02.666272 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7337229473e6aab1dfd2030dcf3b2053be4c0546e8ed408adadf15c56f5ed260\": container with ID starting with 7337229473e6aab1dfd2030dcf3b2053be4c0546e8ed408adadf15c56f5ed260 not found: ID does not exist" containerID="7337229473e6aab1dfd2030dcf3b2053be4c0546e8ed408adadf15c56f5ed260"
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.666303 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7337229473e6aab1dfd2030dcf3b2053be4c0546e8ed408adadf15c56f5ed260"} err="failed to get container status \"7337229473e6aab1dfd2030dcf3b2053be4c0546e8ed408adadf15c56f5ed260\": rpc error: code = NotFound desc = could not find container \"7337229473e6aab1dfd2030dcf3b2053be4c0546e8ed408adadf15c56f5ed260\": container with ID starting with 7337229473e6aab1dfd2030dcf3b2053be4c0546e8ed408adadf15c56f5ed260 not found: ID does not exist"
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.666321 5184 scope.go:117] "RemoveContainer" containerID="50bd2b86b160ace1601a202bec83216e38668fe85a7257a3db3b369e3d812cff"
Mar 12 16:56:02 crc kubenswrapper[5184]: E0312 16:56:02.666725 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50bd2b86b160ace1601a202bec83216e38668fe85a7257a3db3b369e3d812cff\": container with ID starting with 50bd2b86b160ace1601a202bec83216e38668fe85a7257a3db3b369e3d812cff not found: ID does not exist" containerID="50bd2b86b160ace1601a202bec83216e38668fe85a7257a3db3b369e3d812cff"
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.666756 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50bd2b86b160ace1601a202bec83216e38668fe85a7257a3db3b369e3d812cff"} err="failed to get container status \"50bd2b86b160ace1601a202bec83216e38668fe85a7257a3db3b369e3d812cff\": rpc error: code = NotFound desc = could not find container \"50bd2b86b160ace1601a202bec83216e38668fe85a7257a3db3b369e3d812cff\": container with ID starting with 50bd2b86b160ace1601a202bec83216e38668fe85a7257a3db3b369e3d812cff not found: ID does not exist"
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.666774 5184 scope.go:117] "RemoveContainer" containerID="a73baf1fd6a73cb75ad60effffe140705cce6a9aa946be52962b53484bef9403"
Mar 12 16:56:02 crc kubenswrapper[5184]: E0312 16:56:02.667215 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a73baf1fd6a73cb75ad60effffe140705cce6a9aa946be52962b53484bef9403\": container with ID starting with a73baf1fd6a73cb75ad60effffe140705cce6a9aa946be52962b53484bef9403 not found: ID does not exist" containerID="a73baf1fd6a73cb75ad60effffe140705cce6a9aa946be52962b53484bef9403"
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.667267 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a73baf1fd6a73cb75ad60effffe140705cce6a9aa946be52962b53484bef9403"} err="failed to get container status \"a73baf1fd6a73cb75ad60effffe140705cce6a9aa946be52962b53484bef9403\": rpc error: code = NotFound desc = could not find container \"a73baf1fd6a73cb75ad60effffe140705cce6a9aa946be52962b53484bef9403\": container with ID starting with a73baf1fd6a73cb75ad60effffe140705cce6a9aa946be52962b53484bef9403 not found: ID does not exist"
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.667299 5184 scope.go:117] "RemoveContainer" containerID="babc3dcdcf33f05dd5b318ec86df41d4bb1dcc045f74fbf083f61c9eb7d044e2"
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.691318 5184 scope.go:117] "RemoveContainer" containerID="babc3dcdcf33f05dd5b318ec86df41d4bb1dcc045f74fbf083f61c9eb7d044e2"
Mar 12 16:56:02 crc kubenswrapper[5184]: E0312 16:56:02.693613 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"babc3dcdcf33f05dd5b318ec86df41d4bb1dcc045f74fbf083f61c9eb7d044e2\": container with ID starting with babc3dcdcf33f05dd5b318ec86df41d4bb1dcc045f74fbf083f61c9eb7d044e2 not found: ID does not exist" containerID="babc3dcdcf33f05dd5b318ec86df41d4bb1dcc045f74fbf083f61c9eb7d044e2"
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.693649 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"babc3dcdcf33f05dd5b318ec86df41d4bb1dcc045f74fbf083f61c9eb7d044e2"} err="failed to get container status \"babc3dcdcf33f05dd5b318ec86df41d4bb1dcc045f74fbf083f61c9eb7d044e2\": rpc error: code = NotFound desc = could not find container \"babc3dcdcf33f05dd5b318ec86df41d4bb1dcc045f74fbf083f61c9eb7d044e2\": container with ID starting with babc3dcdcf33f05dd5b318ec86df41d4bb1dcc045f74fbf083f61c9eb7d044e2 not found: ID does not exist"
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.693674 5184 scope.go:117] "RemoveContainer" containerID="ddfe97c00cb5848f902cf4e7d2fb994e57751b216cd123458d1422462332ba5f"
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.756560 5184 scope.go:117] "RemoveContainer" containerID="3e373ec6b46dbb4cb1b00313b2f0b81f362d6a1c1233f90bd26bf1afa48bfe62"
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.788717 5184 scope.go:117] "RemoveContainer" containerID="168307e5a7e928c9b05485bc9357e2cce758ea5e6db1d08377851c7d3b6fafd5"
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.801153 5184 scope.go:117] "RemoveContainer" containerID="ddfe97c00cb5848f902cf4e7d2fb994e57751b216cd123458d1422462332ba5f"
Mar 12 16:56:02 crc kubenswrapper[5184]: E0312 16:56:02.804015 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddfe97c00cb5848f902cf4e7d2fb994e57751b216cd123458d1422462332ba5f\": container with ID starting with ddfe97c00cb5848f902cf4e7d2fb994e57751b216cd123458d1422462332ba5f not found: ID does not exist" containerID="ddfe97c00cb5848f902cf4e7d2fb994e57751b216cd123458d1422462332ba5f"
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.804052 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddfe97c00cb5848f902cf4e7d2fb994e57751b216cd123458d1422462332ba5f"} err="failed to get container status \"ddfe97c00cb5848f902cf4e7d2fb994e57751b216cd123458d1422462332ba5f\": rpc error: code = NotFound desc = could not find container \"ddfe97c00cb5848f902cf4e7d2fb994e57751b216cd123458d1422462332ba5f\": container with ID starting with ddfe97c00cb5848f902cf4e7d2fb994e57751b216cd123458d1422462332ba5f not found: ID does not exist"
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.804071 5184 scope.go:117] "RemoveContainer" containerID="3e373ec6b46dbb4cb1b00313b2f0b81f362d6a1c1233f90bd26bf1afa48bfe62"
Mar 12 16:56:02 crc kubenswrapper[5184]: E0312 16:56:02.804327 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e373ec6b46dbb4cb1b00313b2f0b81f362d6a1c1233f90bd26bf1afa48bfe62\": container with ID starting with 3e373ec6b46dbb4cb1b00313b2f0b81f362d6a1c1233f90bd26bf1afa48bfe62 not found: ID does not exist" containerID="3e373ec6b46dbb4cb1b00313b2f0b81f362d6a1c1233f90bd26bf1afa48bfe62"
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.804346 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e373ec6b46dbb4cb1b00313b2f0b81f362d6a1c1233f90bd26bf1afa48bfe62"} err="failed to get container status \"3e373ec6b46dbb4cb1b00313b2f0b81f362d6a1c1233f90bd26bf1afa48bfe62\": rpc error: code = NotFound desc = could not find container \"3e373ec6b46dbb4cb1b00313b2f0b81f362d6a1c1233f90bd26bf1afa48bfe62\": container with ID starting with 3e373ec6b46dbb4cb1b00313b2f0b81f362d6a1c1233f90bd26bf1afa48bfe62 not found: ID does not exist"
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.804358 5184 scope.go:117] "RemoveContainer" containerID="168307e5a7e928c9b05485bc9357e2cce758ea5e6db1d08377851c7d3b6fafd5"
Mar 12 16:56:02 crc kubenswrapper[5184]: E0312 16:56:02.804622 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"168307e5a7e928c9b05485bc9357e2cce758ea5e6db1d08377851c7d3b6fafd5\": container with ID starting with 168307e5a7e928c9b05485bc9357e2cce758ea5e6db1d08377851c7d3b6fafd5 not found: ID does not exist" containerID="168307e5a7e928c9b05485bc9357e2cce758ea5e6db1d08377851c7d3b6fafd5"
Mar 12 16:56:02 crc kubenswrapper[5184]: I0312 16:56:02.804640 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"168307e5a7e928c9b05485bc9357e2cce758ea5e6db1d08377851c7d3b6fafd5"} err="failed to get container status \"168307e5a7e928c9b05485bc9357e2cce758ea5e6db1d08377851c7d3b6fafd5\": rpc error: code = NotFound desc = could not find container \"168307e5a7e928c9b05485bc9357e2cce758ea5e6db1d08377851c7d3b6fafd5\": container with ID starting with 168307e5a7e928c9b05485bc9357e2cce758ea5e6db1d08377851c7d3b6fafd5 not found: ID does not exist"
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.479897 5184 generic.go:358] "Generic (PLEG): container finished" podID="b4b9baef-c7b2-4789-a606-0b76f4a575c4" containerID="092c63754a0d67a0781c9e74e1b6fbe0c12d4a215facb90bd7a50cd1b778b0e4" exitCode=0
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.480011 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555576-8pcks" event={"ID":"b4b9baef-c7b2-4789-a606-0b76f4a575c4","Type":"ContainerDied","Data":"092c63754a0d67a0781c9e74e1b6fbe0c12d4a215facb90bd7a50cd1b778b0e4"}
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.493470 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-547dbd544d-hpf6l"
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.753649 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2mh4c"]
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.754461 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8df70bc7-e513-41bf-94d8-5f79a9d10b64" containerName="registry-server"
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.754483 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="8df70bc7-e513-41bf-94d8-5f79a9d10b64" containerName="registry-server"
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.754501 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="45fdd519-1630-4afa-9780-7325691d8206" containerName="extract-utilities"
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.754509 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="45fdd519-1630-4afa-9780-7325691d8206" containerName="extract-utilities"
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.754520 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="45fdd519-1630-4afa-9780-7325691d8206" containerName="extract-content"
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.754528 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="45fdd519-1630-4afa-9780-7325691d8206" containerName="extract-content"
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.754543 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce605906-7727-4682-83a5-e18f9faeb789" containerName="registry-server"
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.754551 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce605906-7727-4682-83a5-e18f9faeb789" containerName="registry-server"
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.754559 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="45fdd519-1630-4afa-9780-7325691d8206" containerName="registry-server"
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.754566 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="45fdd519-1630-4afa-9780-7325691d8206" containerName="registry-server"
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.754576 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="27cbd345-0044-49d8-9192-d193df4c579e" containerName="extract-content"
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.754584 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="27cbd345-0044-49d8-9192-d193df4c579e" containerName="extract-content"
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.754596 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8df70bc7-e513-41bf-94d8-5f79a9d10b64" containerName="extract-utilities"
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.754603 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="8df70bc7-e513-41bf-94d8-5f79a9d10b64" containerName="extract-utilities"
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.754616 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce605906-7727-4682-83a5-e18f9faeb789" containerName="extract-utilities"
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.754624 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce605906-7727-4682-83a5-e18f9faeb789" containerName="extract-utilities"
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.754637 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="27cbd345-0044-49d8-9192-d193df4c579e" containerName="extract-utilities"
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.754645 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="27cbd345-0044-49d8-9192-d193df4c579e" containerName="extract-utilities"
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.754655 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8df70bc7-e513-41bf-94d8-5f79a9d10b64" containerName="extract-content"
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.754662 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="8df70bc7-e513-41bf-94d8-5f79a9d10b64" containerName="extract-content"
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.754680 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce605906-7727-4682-83a5-e18f9faeb789" containerName="extract-content"
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.754688 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce605906-7727-4682-83a5-e18f9faeb789" containerName="extract-content"
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.754702 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ad036a8-381e-4761-a20f-8d8b9a3e9408" containerName="marketplace-operator"
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.754709 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ad036a8-381e-4761-a20f-8d8b9a3e9408" containerName="marketplace-operator"
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.754729 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="27cbd345-0044-49d8-9192-d193df4c579e" containerName="registry-server"
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.754737 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="27cbd345-0044-49d8-9192-d193df4c579e" containerName="registry-server"
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.754840 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="8df70bc7-e513-41bf-94d8-5f79a9d10b64" containerName="registry-server"
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.754853 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="ce605906-7727-4682-83a5-e18f9faeb789" containerName="registry-server"
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.754866 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="5ad036a8-381e-4761-a20f-8d8b9a3e9408" containerName="marketplace-operator"
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.754877 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="27cbd345-0044-49d8-9192-d193df4c579e" containerName="registry-server"
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.754889 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="45fdd519-1630-4afa-9780-7325691d8206" containerName="registry-server"
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.760150 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2mh4c"
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.763722 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"certified-operators-dockercfg-7cl8d\""
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.776319 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2mh4c"]
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.858011 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2dc3fc5-5eb0-4462-a6d4-8dca698af7fb-catalog-content\") pod \"certified-operators-2mh4c\" (UID: \"a2dc3fc5-5eb0-4462-a6d4-8dca698af7fb\") " pod="openshift-marketplace/certified-operators-2mh4c"
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.858352 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2dc3fc5-5eb0-4462-a6d4-8dca698af7fb-utilities\") pod \"certified-operators-2mh4c\" (UID: \"a2dc3fc5-5eb0-4462-a6d4-8dca698af7fb\") " pod="openshift-marketplace/certified-operators-2mh4c"
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.858547 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sqtv\" (UniqueName: \"kubernetes.io/projected/a2dc3fc5-5eb0-4462-a6d4-8dca698af7fb-kube-api-access-9sqtv\") pod \"certified-operators-2mh4c\" (UID: \"a2dc3fc5-5eb0-4462-a6d4-8dca698af7fb\") " pod="openshift-marketplace/certified-operators-2mh4c"
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.949697 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gfb2h"]
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.960202 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2dc3fc5-5eb0-4462-a6d4-8dca698af7fb-utilities\") pod \"certified-operators-2mh4c\" (UID: \"a2dc3fc5-5eb0-4462-a6d4-8dca698af7fb\") " pod="openshift-marketplace/certified-operators-2mh4c"
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.960290 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9sqtv\" (UniqueName: \"kubernetes.io/projected/a2dc3fc5-5eb0-4462-a6d4-8dca698af7fb-kube-api-access-9sqtv\") pod \"certified-operators-2mh4c\" (UID: \"a2dc3fc5-5eb0-4462-a6d4-8dca698af7fb\") " pod="openshift-marketplace/certified-operators-2mh4c"
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.960394 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2dc3fc5-5eb0-4462-a6d4-8dca698af7fb-catalog-content\") pod \"certified-operators-2mh4c\" (UID: \"a2dc3fc5-5eb0-4462-a6d4-8dca698af7fb\") " pod="openshift-marketplace/certified-operators-2mh4c"
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.960885 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2dc3fc5-5eb0-4462-a6d4-8dca698af7fb-catalog-content\") pod \"certified-operators-2mh4c\" (UID: \"a2dc3fc5-5eb0-4462-a6d4-8dca698af7fb\") " pod="openshift-marketplace/certified-operators-2mh4c"
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.961133 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2dc3fc5-5eb0-4462-a6d4-8dca698af7fb-utilities\") pod \"certified-operators-2mh4c\" (UID: \"a2dc3fc5-5eb0-4462-a6d4-8dca698af7fb\") " pod="openshift-marketplace/certified-operators-2mh4c"
Mar 12 16:56:03 crc kubenswrapper[5184]: I0312 16:56:03.982119 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sqtv\" (UniqueName: \"kubernetes.io/projected/a2dc3fc5-5eb0-4462-a6d4-8dca698af7fb-kube-api-access-9sqtv\") pod \"certified-operators-2mh4c\" (UID: \"a2dc3fc5-5eb0-4462-a6d4-8dca698af7fb\") " pod="openshift-marketplace/certified-operators-2mh4c"
Mar 12 16:56:04 crc kubenswrapper[5184]: I0312 16:56:04.022892 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gfb2h"]
Mar 12 16:56:04 crc kubenswrapper[5184]: I0312 16:56:04.023089 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gfb2h"
Mar 12 16:56:04 crc kubenswrapper[5184]: I0312 16:56:04.025887 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"redhat-marketplace-dockercfg-gg4w7\""
Mar 12 16:56:04 crc kubenswrapper[5184]: I0312 16:56:04.076295 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2mh4c"
Mar 12 16:56:04 crc kubenswrapper[5184]: I0312 16:56:04.163017 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9207421a-3f61-4e7a-be60-40549f1f6c99-catalog-content\") pod \"redhat-marketplace-gfb2h\" (UID: \"9207421a-3f61-4e7a-be60-40549f1f6c99\") " pod="openshift-marketplace/redhat-marketplace-gfb2h"
Mar 12 16:56:04 crc kubenswrapper[5184]: I0312 16:56:04.163395 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2c94\" (UniqueName: \"kubernetes.io/projected/9207421a-3f61-4e7a-be60-40549f1f6c99-kube-api-access-r2c94\") pod \"redhat-marketplace-gfb2h\" (UID: \"9207421a-3f61-4e7a-be60-40549f1f6c99\") " pod="openshift-marketplace/redhat-marketplace-gfb2h"
Mar 12 16:56:04 crc kubenswrapper[5184]: I0312 16:56:04.163478 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9207421a-3f61-4e7a-be60-40549f1f6c99-utilities\") pod \"redhat-marketplace-gfb2h\" (UID: \"9207421a-3f61-4e7a-be60-40549f1f6c99\") " pod="openshift-marketplace/redhat-marketplace-gfb2h"
Mar 12 16:56:04 crc kubenswrapper[5184]: I0312 16:56:04.265040 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9207421a-3f61-4e7a-be60-40549f1f6c99-utilities\") pod \"redhat-marketplace-gfb2h\" (UID: \"9207421a-3f61-4e7a-be60-40549f1f6c99\") " pod="openshift-marketplace/redhat-marketplace-gfb2h"
Mar 12 16:56:04 crc kubenswrapper[5184]: I0312 16:56:04.265137 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9207421a-3f61-4e7a-be60-40549f1f6c99-catalog-content\") pod \"redhat-marketplace-gfb2h\" (UID: \"9207421a-3f61-4e7a-be60-40549f1f6c99\") " pod="openshift-marketplace/redhat-marketplace-gfb2h"
Mar 12 16:56:04 crc kubenswrapper[5184]: I0312 16:56:04.265178 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r2c94\" (UniqueName: \"kubernetes.io/projected/9207421a-3f61-4e7a-be60-40549f1f6c99-kube-api-access-r2c94\") pod \"redhat-marketplace-gfb2h\" (UID: \"9207421a-3f61-4e7a-be60-40549f1f6c99\") " pod="openshift-marketplace/redhat-marketplace-gfb2h"
Mar 12 16:56:04 crc kubenswrapper[5184]: I0312 16:56:04.266024 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9207421a-3f61-4e7a-be60-40549f1f6c99-catalog-content\") pod \"redhat-marketplace-gfb2h\" (UID: \"9207421a-3f61-4e7a-be60-40549f1f6c99\") " pod="openshift-marketplace/redhat-marketplace-gfb2h"
Mar 12 16:56:04 crc kubenswrapper[5184]: I0312 16:56:04.266486 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9207421a-3f61-4e7a-be60-40549f1f6c99-utilities\") pod \"redhat-marketplace-gfb2h\" (UID: \"9207421a-3f61-4e7a-be60-40549f1f6c99\") " pod="openshift-marketplace/redhat-marketplace-gfb2h"
Mar 12 16:56:04 crc kubenswrapper[5184]: I0312 16:56:04.287595 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2c94\" (UniqueName: \"kubernetes.io/projected/9207421a-3f61-4e7a-be60-40549f1f6c99-kube-api-access-r2c94\") pod \"redhat-marketplace-gfb2h\" (UID: \"9207421a-3f61-4e7a-be60-40549f1f6c99\") " pod="openshift-marketplace/redhat-marketplace-gfb2h"
Mar 12 16:56:04 crc kubenswrapper[5184]: I0312 16:56:04.349322 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gfb2h"
Mar 12 16:56:04 crc kubenswrapper[5184]: I0312 16:56:04.409764 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27cbd345-0044-49d8-9192-d193df4c579e" path="/var/lib/kubelet/pods/27cbd345-0044-49d8-9192-d193df4c579e/volumes"
Mar 12 16:56:04 crc kubenswrapper[5184]: I0312 16:56:04.411311 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45fdd519-1630-4afa-9780-7325691d8206" path="/var/lib/kubelet/pods/45fdd519-1630-4afa-9780-7325691d8206/volumes"
Mar 12 16:56:04 crc kubenswrapper[5184]: I0312 16:56:04.412491 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ad036a8-381e-4761-a20f-8d8b9a3e9408" path="/var/lib/kubelet/pods/5ad036a8-381e-4761-a20f-8d8b9a3e9408/volumes"
Mar 12 16:56:04 crc kubenswrapper[5184]: I0312 16:56:04.414052 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8df70bc7-e513-41bf-94d8-5f79a9d10b64" path="/var/lib/kubelet/pods/8df70bc7-e513-41bf-94d8-5f79a9d10b64/volumes"
Mar 12 16:56:04 crc kubenswrapper[5184]: I0312 16:56:04.415109 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce605906-7727-4682-83a5-e18f9faeb789" path="/var/lib/kubelet/pods/ce605906-7727-4682-83a5-e18f9faeb789/volumes"
Mar 12 16:56:04 crc kubenswrapper[5184]: I0312 16:56:04.493368 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2mh4c"]
Mar 12 16:56:04 crc kubenswrapper[5184]: I0312 16:56:04.584621 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gfb2h"]
Mar 12 16:56:04 crc kubenswrapper[5184]: W0312 16:56:04.601392 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9207421a_3f61_4e7a_be60_40549f1f6c99.slice/crio-265eed33149e292657f40082213bd9481efa12c1c9ded72fad9c4b6b0ccb6259 WatchSource:0}: Error finding container 265eed33149e292657f40082213bd9481efa12c1c9ded72fad9c4b6b0ccb6259: Status 404 returned error can't find the container with id 265eed33149e292657f40082213bd9481efa12c1c9ded72fad9c4b6b0ccb6259
Mar 12 16:56:04 crc kubenswrapper[5184]: I0312 16:56:04.813167 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555576-8pcks"
Mar 12 16:56:04 crc kubenswrapper[5184]: I0312 16:56:04.971773 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmk49\" (UniqueName: \"kubernetes.io/projected/b4b9baef-c7b2-4789-a606-0b76f4a575c4-kube-api-access-qmk49\") pod \"b4b9baef-c7b2-4789-a606-0b76f4a575c4\" (UID: \"b4b9baef-c7b2-4789-a606-0b76f4a575c4\") "
Mar 12 16:56:04 crc kubenswrapper[5184]: I0312 16:56:04.979810 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4b9baef-c7b2-4789-a606-0b76f4a575c4-kube-api-access-qmk49" (OuterVolumeSpecName: "kube-api-access-qmk49") pod "b4b9baef-c7b2-4789-a606-0b76f4a575c4" (UID: "b4b9baef-c7b2-4789-a606-0b76f4a575c4"). InnerVolumeSpecName "kube-api-access-qmk49".
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:56:05 crc kubenswrapper[5184]: I0312 16:56:05.073027 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qmk49\" (UniqueName: \"kubernetes.io/projected/b4b9baef-c7b2-4789-a606-0b76f4a575c4-kube-api-access-qmk49\") on node \"crc\" DevicePath \"\"" Mar 12 16:56:05 crc kubenswrapper[5184]: I0312 16:56:05.504254 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555576-8pcks" event={"ID":"b4b9baef-c7b2-4789-a606-0b76f4a575c4","Type":"ContainerDied","Data":"ffb5a1aa0a336b9d13583667fcca3ecb54ed4ad1575d1128143025f203f2e3b8"} Mar 12 16:56:05 crc kubenswrapper[5184]: I0312 16:56:05.504637 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffb5a1aa0a336b9d13583667fcca3ecb54ed4ad1575d1128143025f203f2e3b8" Mar 12 16:56:05 crc kubenswrapper[5184]: I0312 16:56:05.504718 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555576-8pcks" Mar 12 16:56:05 crc kubenswrapper[5184]: I0312 16:56:05.509654 5184 generic.go:358] "Generic (PLEG): container finished" podID="a2dc3fc5-5eb0-4462-a6d4-8dca698af7fb" containerID="1259560c33ffbb521620a1f5521624b88040853bfaafb7a500a97a1ccfedcc1e" exitCode=0 Mar 12 16:56:05 crc kubenswrapper[5184]: I0312 16:56:05.509857 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2mh4c" event={"ID":"a2dc3fc5-5eb0-4462-a6d4-8dca698af7fb","Type":"ContainerDied","Data":"1259560c33ffbb521620a1f5521624b88040853bfaafb7a500a97a1ccfedcc1e"} Mar 12 16:56:05 crc kubenswrapper[5184]: I0312 16:56:05.509894 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2mh4c" event={"ID":"a2dc3fc5-5eb0-4462-a6d4-8dca698af7fb","Type":"ContainerStarted","Data":"b85fdeffda87a7190b8397210c6cff226690ae9ebaf220f184d3e7e61c5b881f"} Mar 12 16:56:05 crc 
kubenswrapper[5184]: I0312 16:56:05.513335 5184 generic.go:358] "Generic (PLEG): container finished" podID="9207421a-3f61-4e7a-be60-40549f1f6c99" containerID="7784b314f6fa56d09f79883788b04ea7574cef9f52fb70a7312d6f05460237d3" exitCode=0 Mar 12 16:56:05 crc kubenswrapper[5184]: I0312 16:56:05.513455 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gfb2h" event={"ID":"9207421a-3f61-4e7a-be60-40549f1f6c99","Type":"ContainerDied","Data":"7784b314f6fa56d09f79883788b04ea7574cef9f52fb70a7312d6f05460237d3"} Mar 12 16:56:05 crc kubenswrapper[5184]: I0312 16:56:05.513507 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gfb2h" event={"ID":"9207421a-3f61-4e7a-be60-40549f1f6c99","Type":"ContainerStarted","Data":"265eed33149e292657f40082213bd9481efa12c1c9ded72fad9c4b6b0ccb6259"} Mar 12 16:56:06 crc kubenswrapper[5184]: I0312 16:56:06.159331 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m6h45"] Mar 12 16:56:06 crc kubenswrapper[5184]: I0312 16:56:06.161038 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b4b9baef-c7b2-4789-a606-0b76f4a575c4" containerName="oc" Mar 12 16:56:06 crc kubenswrapper[5184]: I0312 16:56:06.161074 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b9baef-c7b2-4789-a606-0b76f4a575c4" containerName="oc" Mar 12 16:56:06 crc kubenswrapper[5184]: I0312 16:56:06.161262 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="b4b9baef-c7b2-4789-a606-0b76f4a575c4" containerName="oc" Mar 12 16:56:06 crc kubenswrapper[5184]: I0312 16:56:06.175478 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m6h45" Mar 12 16:56:06 crc kubenswrapper[5184]: I0312 16:56:06.179428 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"redhat-operators-dockercfg-9gxlh\"" Mar 12 16:56:06 crc kubenswrapper[5184]: I0312 16:56:06.192197 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m6h45"] Mar 12 16:56:06 crc kubenswrapper[5184]: I0312 16:56:06.294516 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dll7b\" (UniqueName: \"kubernetes.io/projected/43fdd0fb-f097-41a9-9160-0be2f8defa9e-kube-api-access-dll7b\") pod \"redhat-operators-m6h45\" (UID: \"43fdd0fb-f097-41a9-9160-0be2f8defa9e\") " pod="openshift-marketplace/redhat-operators-m6h45" Mar 12 16:56:06 crc kubenswrapper[5184]: I0312 16:56:06.294620 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43fdd0fb-f097-41a9-9160-0be2f8defa9e-catalog-content\") pod \"redhat-operators-m6h45\" (UID: \"43fdd0fb-f097-41a9-9160-0be2f8defa9e\") " pod="openshift-marketplace/redhat-operators-m6h45" Mar 12 16:56:06 crc kubenswrapper[5184]: I0312 16:56:06.294702 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43fdd0fb-f097-41a9-9160-0be2f8defa9e-utilities\") pod \"redhat-operators-m6h45\" (UID: \"43fdd0fb-f097-41a9-9160-0be2f8defa9e\") " pod="openshift-marketplace/redhat-operators-m6h45" Mar 12 16:56:06 crc kubenswrapper[5184]: I0312 16:56:06.345507 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6zzzp"] Mar 12 16:56:06 crc kubenswrapper[5184]: I0312 16:56:06.357813 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6zzzp" Mar 12 16:56:06 crc kubenswrapper[5184]: I0312 16:56:06.361429 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6zzzp"] Mar 12 16:56:06 crc kubenswrapper[5184]: I0312 16:56:06.362734 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"community-operators-dockercfg-vrd5f\"" Mar 12 16:56:06 crc kubenswrapper[5184]: I0312 16:56:06.397937 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43fdd0fb-f097-41a9-9160-0be2f8defa9e-utilities\") pod \"redhat-operators-m6h45\" (UID: \"43fdd0fb-f097-41a9-9160-0be2f8defa9e\") " pod="openshift-marketplace/redhat-operators-m6h45" Mar 12 16:56:06 crc kubenswrapper[5184]: I0312 16:56:06.398801 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43fdd0fb-f097-41a9-9160-0be2f8defa9e-utilities\") pod \"redhat-operators-m6h45\" (UID: \"43fdd0fb-f097-41a9-9160-0be2f8defa9e\") " pod="openshift-marketplace/redhat-operators-m6h45" Mar 12 16:56:06 crc kubenswrapper[5184]: I0312 16:56:06.398974 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dll7b\" (UniqueName: \"kubernetes.io/projected/43fdd0fb-f097-41a9-9160-0be2f8defa9e-kube-api-access-dll7b\") pod \"redhat-operators-m6h45\" (UID: \"43fdd0fb-f097-41a9-9160-0be2f8defa9e\") " pod="openshift-marketplace/redhat-operators-m6h45" Mar 12 16:56:06 crc kubenswrapper[5184]: I0312 16:56:06.399530 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43fdd0fb-f097-41a9-9160-0be2f8defa9e-catalog-content\") pod \"redhat-operators-m6h45\" (UID: \"43fdd0fb-f097-41a9-9160-0be2f8defa9e\") " 
pod="openshift-marketplace/redhat-operators-m6h45" Mar 12 16:56:06 crc kubenswrapper[5184]: I0312 16:56:06.399967 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43fdd0fb-f097-41a9-9160-0be2f8defa9e-catalog-content\") pod \"redhat-operators-m6h45\" (UID: \"43fdd0fb-f097-41a9-9160-0be2f8defa9e\") " pod="openshift-marketplace/redhat-operators-m6h45" Mar 12 16:56:06 crc kubenswrapper[5184]: I0312 16:56:06.435261 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dll7b\" (UniqueName: \"kubernetes.io/projected/43fdd0fb-f097-41a9-9160-0be2f8defa9e-kube-api-access-dll7b\") pod \"redhat-operators-m6h45\" (UID: \"43fdd0fb-f097-41a9-9160-0be2f8defa9e\") " pod="openshift-marketplace/redhat-operators-m6h45" Mar 12 16:56:06 crc kubenswrapper[5184]: I0312 16:56:06.501640 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7d0b0f3-57cc-443a-85ee-ac686e4f3e52-catalog-content\") pod \"community-operators-6zzzp\" (UID: \"b7d0b0f3-57cc-443a-85ee-ac686e4f3e52\") " pod="openshift-marketplace/community-operators-6zzzp" Mar 12 16:56:06 crc kubenswrapper[5184]: I0312 16:56:06.501692 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnh8k\" (UniqueName: \"kubernetes.io/projected/b7d0b0f3-57cc-443a-85ee-ac686e4f3e52-kube-api-access-lnh8k\") pod \"community-operators-6zzzp\" (UID: \"b7d0b0f3-57cc-443a-85ee-ac686e4f3e52\") " pod="openshift-marketplace/community-operators-6zzzp" Mar 12 16:56:06 crc kubenswrapper[5184]: I0312 16:56:06.501717 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7d0b0f3-57cc-443a-85ee-ac686e4f3e52-utilities\") pod \"community-operators-6zzzp\" (UID: 
\"b7d0b0f3-57cc-443a-85ee-ac686e4f3e52\") " pod="openshift-marketplace/community-operators-6zzzp" Mar 12 16:56:06 crc kubenswrapper[5184]: I0312 16:56:06.511476 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m6h45" Mar 12 16:56:06 crc kubenswrapper[5184]: I0312 16:56:06.519758 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2mh4c" event={"ID":"a2dc3fc5-5eb0-4462-a6d4-8dca698af7fb","Type":"ContainerStarted","Data":"3ce4dde417a3d28a10545ab47e266e870af184ce23855b65e2aa4bfe0fc82803"} Mar 12 16:56:06 crc kubenswrapper[5184]: I0312 16:56:06.521846 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gfb2h" event={"ID":"9207421a-3f61-4e7a-be60-40549f1f6c99","Type":"ContainerStarted","Data":"fda5c2665e3c2723d8fc671d935939ea051a20ace561a494d88585c6a0df1ff0"} Mar 12 16:56:06 crc kubenswrapper[5184]: I0312 16:56:06.605762 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7d0b0f3-57cc-443a-85ee-ac686e4f3e52-catalog-content\") pod \"community-operators-6zzzp\" (UID: \"b7d0b0f3-57cc-443a-85ee-ac686e4f3e52\") " pod="openshift-marketplace/community-operators-6zzzp" Mar 12 16:56:06 crc kubenswrapper[5184]: I0312 16:56:06.606185 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lnh8k\" (UniqueName: \"kubernetes.io/projected/b7d0b0f3-57cc-443a-85ee-ac686e4f3e52-kube-api-access-lnh8k\") pod \"community-operators-6zzzp\" (UID: \"b7d0b0f3-57cc-443a-85ee-ac686e4f3e52\") " pod="openshift-marketplace/community-operators-6zzzp" Mar 12 16:56:06 crc kubenswrapper[5184]: I0312 16:56:06.606220 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7d0b0f3-57cc-443a-85ee-ac686e4f3e52-utilities\") pod 
\"community-operators-6zzzp\" (UID: \"b7d0b0f3-57cc-443a-85ee-ac686e4f3e52\") " pod="openshift-marketplace/community-operators-6zzzp" Mar 12 16:56:06 crc kubenswrapper[5184]: I0312 16:56:06.606908 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7d0b0f3-57cc-443a-85ee-ac686e4f3e52-utilities\") pod \"community-operators-6zzzp\" (UID: \"b7d0b0f3-57cc-443a-85ee-ac686e4f3e52\") " pod="openshift-marketplace/community-operators-6zzzp" Mar 12 16:56:06 crc kubenswrapper[5184]: I0312 16:56:06.607055 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7d0b0f3-57cc-443a-85ee-ac686e4f3e52-catalog-content\") pod \"community-operators-6zzzp\" (UID: \"b7d0b0f3-57cc-443a-85ee-ac686e4f3e52\") " pod="openshift-marketplace/community-operators-6zzzp" Mar 12 16:56:06 crc kubenswrapper[5184]: I0312 16:56:06.632876 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnh8k\" (UniqueName: \"kubernetes.io/projected/b7d0b0f3-57cc-443a-85ee-ac686e4f3e52-kube-api-access-lnh8k\") pod \"community-operators-6zzzp\" (UID: \"b7d0b0f3-57cc-443a-85ee-ac686e4f3e52\") " pod="openshift-marketplace/community-operators-6zzzp" Mar 12 16:56:06 crc kubenswrapper[5184]: I0312 16:56:06.680916 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6zzzp" Mar 12 16:56:06 crc kubenswrapper[5184]: I0312 16:56:06.868060 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6zzzp"] Mar 12 16:56:06 crc kubenswrapper[5184]: W0312 16:56:06.876965 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7d0b0f3_57cc_443a_85ee_ac686e4f3e52.slice/crio-73614dda1e3e8bbacbc1e80019eb5f1f2b0558ece39044d78ad81607f296f49f WatchSource:0}: Error finding container 73614dda1e3e8bbacbc1e80019eb5f1f2b0558ece39044d78ad81607f296f49f: Status 404 returned error can't find the container with id 73614dda1e3e8bbacbc1e80019eb5f1f2b0558ece39044d78ad81607f296f49f Mar 12 16:56:06 crc kubenswrapper[5184]: I0312 16:56:06.938983 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m6h45"] Mar 12 16:56:06 crc kubenswrapper[5184]: W0312 16:56:06.949771 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43fdd0fb_f097_41a9_9160_0be2f8defa9e.slice/crio-59c9964404a1d109e99a4f09c5fa187fcdf3cb308adaaac863a9cb8047888a47 WatchSource:0}: Error finding container 59c9964404a1d109e99a4f09c5fa187fcdf3cb308adaaac863a9cb8047888a47: Status 404 returned error can't find the container with id 59c9964404a1d109e99a4f09c5fa187fcdf3cb308adaaac863a9cb8047888a47 Mar 12 16:56:07 crc kubenswrapper[5184]: I0312 16:56:07.538587 5184 generic.go:358] "Generic (PLEG): container finished" podID="9207421a-3f61-4e7a-be60-40549f1f6c99" containerID="fda5c2665e3c2723d8fc671d935939ea051a20ace561a494d88585c6a0df1ff0" exitCode=0 Mar 12 16:56:07 crc kubenswrapper[5184]: I0312 16:56:07.538643 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gfb2h" 
event={"ID":"9207421a-3f61-4e7a-be60-40549f1f6c99","Type":"ContainerDied","Data":"fda5c2665e3c2723d8fc671d935939ea051a20ace561a494d88585c6a0df1ff0"} Mar 12 16:56:07 crc kubenswrapper[5184]: I0312 16:56:07.540459 5184 generic.go:358] "Generic (PLEG): container finished" podID="b7d0b0f3-57cc-443a-85ee-ac686e4f3e52" containerID="6a3d9c215089796517b5fc5f8e80bdc606ec6922bf5c9e338f0329e2297f01a7" exitCode=0 Mar 12 16:56:07 crc kubenswrapper[5184]: I0312 16:56:07.540573 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6zzzp" event={"ID":"b7d0b0f3-57cc-443a-85ee-ac686e4f3e52","Type":"ContainerDied","Data":"6a3d9c215089796517b5fc5f8e80bdc606ec6922bf5c9e338f0329e2297f01a7"} Mar 12 16:56:07 crc kubenswrapper[5184]: I0312 16:56:07.540590 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6zzzp" event={"ID":"b7d0b0f3-57cc-443a-85ee-ac686e4f3e52","Type":"ContainerStarted","Data":"73614dda1e3e8bbacbc1e80019eb5f1f2b0558ece39044d78ad81607f296f49f"} Mar 12 16:56:07 crc kubenswrapper[5184]: I0312 16:56:07.545837 5184 generic.go:358] "Generic (PLEG): container finished" podID="43fdd0fb-f097-41a9-9160-0be2f8defa9e" containerID="1900038ae5919f44cd4dd80c215802fc625d0c0870e83080237db740dfeffe2f" exitCode=0 Mar 12 16:56:07 crc kubenswrapper[5184]: I0312 16:56:07.545986 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m6h45" event={"ID":"43fdd0fb-f097-41a9-9160-0be2f8defa9e","Type":"ContainerDied","Data":"1900038ae5919f44cd4dd80c215802fc625d0c0870e83080237db740dfeffe2f"} Mar 12 16:56:07 crc kubenswrapper[5184]: I0312 16:56:07.546133 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m6h45" event={"ID":"43fdd0fb-f097-41a9-9160-0be2f8defa9e","Type":"ContainerStarted","Data":"59c9964404a1d109e99a4f09c5fa187fcdf3cb308adaaac863a9cb8047888a47"} Mar 12 16:56:07 crc kubenswrapper[5184]: I0312 
16:56:07.563567 5184 generic.go:358] "Generic (PLEG): container finished" podID="a2dc3fc5-5eb0-4462-a6d4-8dca698af7fb" containerID="3ce4dde417a3d28a10545ab47e266e870af184ce23855b65e2aa4bfe0fc82803" exitCode=0 Mar 12 16:56:07 crc kubenswrapper[5184]: I0312 16:56:07.563692 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2mh4c" event={"ID":"a2dc3fc5-5eb0-4462-a6d4-8dca698af7fb","Type":"ContainerDied","Data":"3ce4dde417a3d28a10545ab47e266e870af184ce23855b65e2aa4bfe0fc82803"} Mar 12 16:56:08 crc kubenswrapper[5184]: I0312 16:56:08.570074 5184 generic.go:358] "Generic (PLEG): container finished" podID="b7d0b0f3-57cc-443a-85ee-ac686e4f3e52" containerID="085c7411f7acf78408dc14995e817dac1744f37d886ebdcf717ff8368b35e4de" exitCode=0 Mar 12 16:56:08 crc kubenswrapper[5184]: I0312 16:56:08.570176 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6zzzp" event={"ID":"b7d0b0f3-57cc-443a-85ee-ac686e4f3e52","Type":"ContainerDied","Data":"085c7411f7acf78408dc14995e817dac1744f37d886ebdcf717ff8368b35e4de"} Mar 12 16:56:08 crc kubenswrapper[5184]: I0312 16:56:08.573561 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2mh4c" event={"ID":"a2dc3fc5-5eb0-4462-a6d4-8dca698af7fb","Type":"ContainerStarted","Data":"c4608ae77f84aa477b2a2db61423186ef77b4c0439e61bcbf3448c0faca56606"} Mar 12 16:56:08 crc kubenswrapper[5184]: I0312 16:56:08.576538 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gfb2h" event={"ID":"9207421a-3f61-4e7a-be60-40549f1f6c99","Type":"ContainerStarted","Data":"90654f9026c696cd40642995f000d41ce721dc7e18d815c8917af2a2fdcb2175"} Mar 12 16:56:08 crc kubenswrapper[5184]: I0312 16:56:08.644288 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2mh4c" podStartSLOduration=4.911576233 
podStartE2EDuration="5.644272865s" podCreationTimestamp="2026-03-12 16:56:03 +0000 UTC" firstStartedPulling="2026-03-12 16:56:05.510750461 +0000 UTC m=+308.052061800" lastFinishedPulling="2026-03-12 16:56:06.243447093 +0000 UTC m=+308.784758432" observedRunningTime="2026-03-12 16:56:08.644138781 +0000 UTC m=+311.185450150" watchObservedRunningTime="2026-03-12 16:56:08.644272865 +0000 UTC m=+311.185584204" Mar 12 16:56:08 crc kubenswrapper[5184]: I0312 16:56:08.665762 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gfb2h" podStartSLOduration=4.91490703 podStartE2EDuration="5.6657433s" podCreationTimestamp="2026-03-12 16:56:03 +0000 UTC" firstStartedPulling="2026-03-12 16:56:05.514635396 +0000 UTC m=+308.055946735" lastFinishedPulling="2026-03-12 16:56:06.265471666 +0000 UTC m=+308.806783005" observedRunningTime="2026-03-12 16:56:08.659342642 +0000 UTC m=+311.200653981" watchObservedRunningTime="2026-03-12 16:56:08.6657433 +0000 UTC m=+311.207054649" Mar 12 16:56:09 crc kubenswrapper[5184]: I0312 16:56:09.587958 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6zzzp" event={"ID":"b7d0b0f3-57cc-443a-85ee-ac686e4f3e52","Type":"ContainerStarted","Data":"a38baab5e2b515aeaf000861f299efd5708ad9169c0876c35b30f3a45fac3187"} Mar 12 16:56:09 crc kubenswrapper[5184]: I0312 16:56:09.612016 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6zzzp" podStartSLOduration=3.036373428 podStartE2EDuration="3.611994036s" podCreationTimestamp="2026-03-12 16:56:06 +0000 UTC" firstStartedPulling="2026-03-12 16:56:07.5415107 +0000 UTC m=+310.082822029" lastFinishedPulling="2026-03-12 16:56:08.117131298 +0000 UTC m=+310.658442637" observedRunningTime="2026-03-12 16:56:09.606745176 +0000 UTC m=+312.148056555" watchObservedRunningTime="2026-03-12 16:56:09.611994036 +0000 UTC m=+312.153305405" Mar 12 
16:56:11 crc kubenswrapper[5184]: I0312 16:56:11.607019 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m6h45" event={"ID":"43fdd0fb-f097-41a9-9160-0be2f8defa9e","Type":"ContainerStarted","Data":"45b8a5f3e7e845ad38327fe02c75f23c0ea61ecc18223e03ff4ea9736dce7e5b"} Mar 12 16:56:12 crc kubenswrapper[5184]: I0312 16:56:12.615467 5184 generic.go:358] "Generic (PLEG): container finished" podID="43fdd0fb-f097-41a9-9160-0be2f8defa9e" containerID="45b8a5f3e7e845ad38327fe02c75f23c0ea61ecc18223e03ff4ea9736dce7e5b" exitCode=0 Mar 12 16:56:12 crc kubenswrapper[5184]: I0312 16:56:12.615546 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m6h45" event={"ID":"43fdd0fb-f097-41a9-9160-0be2f8defa9e","Type":"ContainerDied","Data":"45b8a5f3e7e845ad38327fe02c75f23c0ea61ecc18223e03ff4ea9736dce7e5b"} Mar 12 16:56:13 crc kubenswrapper[5184]: I0312 16:56:13.957780 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m6h45" event={"ID":"43fdd0fb-f097-41a9-9160-0be2f8defa9e","Type":"ContainerStarted","Data":"e24fe56e89c1d907081fb23a225a5d4f6217141f9ab999367d9e4df10f875392"} Mar 12 16:56:14 crc kubenswrapper[5184]: I0312 16:56:14.076732 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/certified-operators-2mh4c" Mar 12 16:56:14 crc kubenswrapper[5184]: I0312 16:56:14.076778 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2mh4c" Mar 12 16:56:14 crc kubenswrapper[5184]: I0312 16:56:14.122572 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2mh4c" Mar 12 16:56:14 crc kubenswrapper[5184]: I0312 16:56:14.139646 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-m6h45" 
podStartSLOduration=4.936280515 podStartE2EDuration="8.139626349s" podCreationTimestamp="2026-03-12 16:56:06 +0000 UTC" firstStartedPulling="2026-03-12 16:56:07.555827434 +0000 UTC m=+310.097138783" lastFinishedPulling="2026-03-12 16:56:10.759173278 +0000 UTC m=+313.300484617" observedRunningTime="2026-03-12 16:56:13.977246054 +0000 UTC m=+316.518557393" watchObservedRunningTime="2026-03-12 16:56:14.139626349 +0000 UTC m=+316.680937688" Mar 12 16:56:14 crc kubenswrapper[5184]: I0312 16:56:14.349944 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-marketplace-gfb2h" Mar 12 16:56:14 crc kubenswrapper[5184]: I0312 16:56:14.350156 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gfb2h" Mar 12 16:56:14 crc kubenswrapper[5184]: I0312 16:56:14.390650 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gfb2h" Mar 12 16:56:15 crc kubenswrapper[5184]: I0312 16:56:15.015065 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gfb2h" Mar 12 16:56:15 crc kubenswrapper[5184]: I0312 16:56:15.018526 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2mh4c" Mar 12 16:56:16 crc kubenswrapper[5184]: I0312 16:56:16.511865 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-operators-m6h45" Mar 12 16:56:16 crc kubenswrapper[5184]: I0312 16:56:16.512419 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m6h45" Mar 12 16:56:16 crc kubenswrapper[5184]: I0312 16:56:16.682444 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/community-operators-6zzzp" Mar 12 16:56:16 crc 
kubenswrapper[5184]: I0312 16:56:16.682494 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6zzzp" Mar 12 16:56:16 crc kubenswrapper[5184]: I0312 16:56:16.726644 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6zzzp" Mar 12 16:56:17 crc kubenswrapper[5184]: I0312 16:56:17.011055 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6zzzp" Mar 12 16:56:17 crc kubenswrapper[5184]: I0312 16:56:17.555416 5184 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m6h45" podUID="43fdd0fb-f097-41a9-9160-0be2f8defa9e" containerName="registry-server" probeResult="failure" output=< Mar 12 16:56:17 crc kubenswrapper[5184]: timeout: failed to connect service ":50051" within 1s Mar 12 16:56:17 crc kubenswrapper[5184]: > Mar 12 16:56:26 crc kubenswrapper[5184]: I0312 16:56:26.573070 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m6h45" Mar 12 16:56:26 crc kubenswrapper[5184]: I0312 16:56:26.626720 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-m6h45" Mar 12 16:58:00 crc kubenswrapper[5184]: I0312 16:58:00.128812 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555578-7hxkg"] Mar 12 16:58:00 crc kubenswrapper[5184]: I0312 16:58:00.140640 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555578-7hxkg"] Mar 12 16:58:00 crc kubenswrapper[5184]: I0312 16:58:00.140740 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555578-7hxkg" Mar 12 16:58:00 crc kubenswrapper[5184]: I0312 16:58:00.143871 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\"" Mar 12 16:58:00 crc kubenswrapper[5184]: I0312 16:58:00.144285 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\"" Mar 12 16:58:00 crc kubenswrapper[5184]: I0312 16:58:00.144442 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-f4gpz\"" Mar 12 16:58:00 crc kubenswrapper[5184]: I0312 16:58:00.239612 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcwd7\" (UniqueName: \"kubernetes.io/projected/4eba40a8-87c3-43ee-88bf-502b11e90d37-kube-api-access-hcwd7\") pod \"auto-csr-approver-29555578-7hxkg\" (UID: \"4eba40a8-87c3-43ee-88bf-502b11e90d37\") " pod="openshift-infra/auto-csr-approver-29555578-7hxkg" Mar 12 16:58:00 crc kubenswrapper[5184]: I0312 16:58:00.341137 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hcwd7\" (UniqueName: \"kubernetes.io/projected/4eba40a8-87c3-43ee-88bf-502b11e90d37-kube-api-access-hcwd7\") pod \"auto-csr-approver-29555578-7hxkg\" (UID: \"4eba40a8-87c3-43ee-88bf-502b11e90d37\") " pod="openshift-infra/auto-csr-approver-29555578-7hxkg" Mar 12 16:58:00 crc kubenswrapper[5184]: I0312 16:58:00.366601 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcwd7\" (UniqueName: \"kubernetes.io/projected/4eba40a8-87c3-43ee-88bf-502b11e90d37-kube-api-access-hcwd7\") pod \"auto-csr-approver-29555578-7hxkg\" (UID: \"4eba40a8-87c3-43ee-88bf-502b11e90d37\") " pod="openshift-infra/auto-csr-approver-29555578-7hxkg" Mar 12 16:58:00 crc kubenswrapper[5184]: I0312 16:58:00.458431 5184 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555578-7hxkg" Mar 12 16:58:00 crc kubenswrapper[5184]: I0312 16:58:00.688663 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555578-7hxkg"] Mar 12 16:58:00 crc kubenswrapper[5184]: W0312 16:58:00.693851 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4eba40a8_87c3_43ee_88bf_502b11e90d37.slice/crio-f84aa0cb2a11426bcfb3b6712b627110cc0c63953b3ef8cf26fe7eafb7a2f90a WatchSource:0}: Error finding container f84aa0cb2a11426bcfb3b6712b627110cc0c63953b3ef8cf26fe7eafb7a2f90a: Status 404 returned error can't find the container with id f84aa0cb2a11426bcfb3b6712b627110cc0c63953b3ef8cf26fe7eafb7a2f90a Mar 12 16:58:00 crc kubenswrapper[5184]: I0312 16:58:00.728252 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555578-7hxkg" event={"ID":"4eba40a8-87c3-43ee-88bf-502b11e90d37","Type":"ContainerStarted","Data":"f84aa0cb2a11426bcfb3b6712b627110cc0c63953b3ef8cf26fe7eafb7a2f90a"} Mar 12 16:58:02 crc kubenswrapper[5184]: I0312 16:58:02.744569 5184 generic.go:358] "Generic (PLEG): container finished" podID="4eba40a8-87c3-43ee-88bf-502b11e90d37" containerID="47e8322adaabbd88b5655e48f068f7736ae022676f8e04bbd22876b7c5c1ce5b" exitCode=0 Mar 12 16:58:02 crc kubenswrapper[5184]: I0312 16:58:02.744612 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555578-7hxkg" event={"ID":"4eba40a8-87c3-43ee-88bf-502b11e90d37","Type":"ContainerDied","Data":"47e8322adaabbd88b5655e48f068f7736ae022676f8e04bbd22876b7c5c1ce5b"} Mar 12 16:58:04 crc kubenswrapper[5184]: I0312 16:58:04.034784 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555578-7hxkg" Mar 12 16:58:04 crc kubenswrapper[5184]: I0312 16:58:04.101035 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcwd7\" (UniqueName: \"kubernetes.io/projected/4eba40a8-87c3-43ee-88bf-502b11e90d37-kube-api-access-hcwd7\") pod \"4eba40a8-87c3-43ee-88bf-502b11e90d37\" (UID: \"4eba40a8-87c3-43ee-88bf-502b11e90d37\") " Mar 12 16:58:04 crc kubenswrapper[5184]: I0312 16:58:04.107211 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eba40a8-87c3-43ee-88bf-502b11e90d37-kube-api-access-hcwd7" (OuterVolumeSpecName: "kube-api-access-hcwd7") pod "4eba40a8-87c3-43ee-88bf-502b11e90d37" (UID: "4eba40a8-87c3-43ee-88bf-502b11e90d37"). InnerVolumeSpecName "kube-api-access-hcwd7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 16:58:04 crc kubenswrapper[5184]: I0312 16:58:04.202343 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hcwd7\" (UniqueName: \"kubernetes.io/projected/4eba40a8-87c3-43ee-88bf-502b11e90d37-kube-api-access-hcwd7\") on node \"crc\" DevicePath \"\"" Mar 12 16:58:04 crc kubenswrapper[5184]: I0312 16:58:04.761102 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555578-7hxkg" event={"ID":"4eba40a8-87c3-43ee-88bf-502b11e90d37","Type":"ContainerDied","Data":"f84aa0cb2a11426bcfb3b6712b627110cc0c63953b3ef8cf26fe7eafb7a2f90a"} Mar 12 16:58:04 crc kubenswrapper[5184]: I0312 16:58:04.761138 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555578-7hxkg" Mar 12 16:58:04 crc kubenswrapper[5184]: I0312 16:58:04.761156 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f84aa0cb2a11426bcfb3b6712b627110cc0c63953b3ef8cf26fe7eafb7a2f90a" Mar 12 16:58:20 crc kubenswrapper[5184]: I0312 16:58:20.743372 5184 patch_prober.go:28] interesting pod/machine-config-daemon-cp7pt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 16:58:20 crc kubenswrapper[5184]: I0312 16:58:20.744034 5184 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 16:58:50 crc kubenswrapper[5184]: I0312 16:58:50.742828 5184 patch_prober.go:28] interesting pod/machine-config-daemon-cp7pt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 16:58:50 crc kubenswrapper[5184]: I0312 16:58:50.743665 5184 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 16:59:20 crc kubenswrapper[5184]: I0312 16:59:20.742538 5184 patch_prober.go:28] interesting pod/machine-config-daemon-cp7pt container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 16:59:20 crc kubenswrapper[5184]: I0312 16:59:20.743250 5184 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 16:59:20 crc kubenswrapper[5184]: I0312 16:59:20.743352 5184 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" Mar 12 16:59:20 crc kubenswrapper[5184]: I0312 16:59:20.744347 5184 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7ce931eac957036c6c965318bd6ebe196835262d045725f0735bb9f9799bfd42"} pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 16:59:20 crc kubenswrapper[5184]: I0312 16:59:20.744508 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerName="machine-config-daemon" containerID="cri-o://7ce931eac957036c6c965318bd6ebe196835262d045725f0735bb9f9799bfd42" gracePeriod=600 Mar 12 16:59:21 crc kubenswrapper[5184]: I0312 16:59:21.290366 5184 generic.go:358] "Generic (PLEG): container finished" podID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerID="7ce931eac957036c6c965318bd6ebe196835262d045725f0735bb9f9799bfd42" exitCode=0 Mar 12 16:59:21 crc kubenswrapper[5184]: I0312 16:59:21.290433 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" event={"ID":"7b45c859-3d05-4214-9bd3-2952546f5dea","Type":"ContainerDied","Data":"7ce931eac957036c6c965318bd6ebe196835262d045725f0735bb9f9799bfd42"} Mar 12 16:59:21 crc kubenswrapper[5184]: I0312 16:59:21.291015 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" event={"ID":"7b45c859-3d05-4214-9bd3-2952546f5dea","Type":"ContainerStarted","Data":"6dcddf4c82a491a243d037b62a542200cd43f90af290d25abaab07cac5e2a61e"} Mar 12 16:59:21 crc kubenswrapper[5184]: I0312 16:59:21.291057 5184 scope.go:117] "RemoveContainer" containerID="a794500127db524b745f6dfb40cb4c4c83a065628e7edf1a8c68e379958a7834" Mar 12 17:00:00 crc kubenswrapper[5184]: I0312 17:00:00.144503 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555580-966mx"] Mar 12 17:00:00 crc kubenswrapper[5184]: I0312 17:00:00.146067 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4eba40a8-87c3-43ee-88bf-502b11e90d37" containerName="oc" Mar 12 17:00:00 crc kubenswrapper[5184]: I0312 17:00:00.146089 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eba40a8-87c3-43ee-88bf-502b11e90d37" containerName="oc" Mar 12 17:00:00 crc kubenswrapper[5184]: I0312 17:00:00.146229 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="4eba40a8-87c3-43ee-88bf-502b11e90d37" containerName="oc" Mar 12 17:00:00 crc kubenswrapper[5184]: I0312 17:00:00.158030 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555580-lmckm"] Mar 12 17:00:00 crc kubenswrapper[5184]: I0312 17:00:00.158593 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555580-966mx" Mar 12 17:00:00 crc kubenswrapper[5184]: I0312 17:00:00.161476 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-f4gpz\"" Mar 12 17:00:00 crc kubenswrapper[5184]: I0312 17:00:00.161636 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\"" Mar 12 17:00:00 crc kubenswrapper[5184]: I0312 17:00:00.161622 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\"" Mar 12 17:00:00 crc kubenswrapper[5184]: I0312 17:00:00.161961 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555580-966mx"] Mar 12 17:00:00 crc kubenswrapper[5184]: I0312 17:00:00.162021 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555580-lmckm" Mar 12 17:00:00 crc kubenswrapper[5184]: I0312 17:00:00.164102 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-config\"" Mar 12 17:00:00 crc kubenswrapper[5184]: I0312 17:00:00.164211 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-dockercfg-vfqp6\"" Mar 12 17:00:00 crc kubenswrapper[5184]: I0312 17:00:00.184848 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555580-lmckm"] Mar 12 17:00:00 crc kubenswrapper[5184]: I0312 17:00:00.213235 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/901d21c1-a7c3-4a8a-8c75-2724992dd87b-config-volume\") pod \"collect-profiles-29555580-lmckm\" (UID: 
\"901d21c1-a7c3-4a8a-8c75-2724992dd87b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555580-lmckm" Mar 12 17:00:00 crc kubenswrapper[5184]: I0312 17:00:00.213280 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt697\" (UniqueName: \"kubernetes.io/projected/4a087d4e-ec76-454a-8d5f-74144f387a03-kube-api-access-lt697\") pod \"auto-csr-approver-29555580-966mx\" (UID: \"4a087d4e-ec76-454a-8d5f-74144f387a03\") " pod="openshift-infra/auto-csr-approver-29555580-966mx" Mar 12 17:00:00 crc kubenswrapper[5184]: I0312 17:00:00.213297 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/901d21c1-a7c3-4a8a-8c75-2724992dd87b-secret-volume\") pod \"collect-profiles-29555580-lmckm\" (UID: \"901d21c1-a7c3-4a8a-8c75-2724992dd87b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555580-lmckm" Mar 12 17:00:00 crc kubenswrapper[5184]: I0312 17:00:00.213343 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhgtv\" (UniqueName: \"kubernetes.io/projected/901d21c1-a7c3-4a8a-8c75-2724992dd87b-kube-api-access-zhgtv\") pod \"collect-profiles-29555580-lmckm\" (UID: \"901d21c1-a7c3-4a8a-8c75-2724992dd87b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555580-lmckm" Mar 12 17:00:00 crc kubenswrapper[5184]: I0312 17:00:00.314759 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhgtv\" (UniqueName: \"kubernetes.io/projected/901d21c1-a7c3-4a8a-8c75-2724992dd87b-kube-api-access-zhgtv\") pod \"collect-profiles-29555580-lmckm\" (UID: \"901d21c1-a7c3-4a8a-8c75-2724992dd87b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555580-lmckm" Mar 12 17:00:00 crc kubenswrapper[5184]: I0312 17:00:00.314963 5184 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/901d21c1-a7c3-4a8a-8c75-2724992dd87b-config-volume\") pod \"collect-profiles-29555580-lmckm\" (UID: \"901d21c1-a7c3-4a8a-8c75-2724992dd87b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555580-lmckm" Mar 12 17:00:00 crc kubenswrapper[5184]: I0312 17:00:00.315027 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lt697\" (UniqueName: \"kubernetes.io/projected/4a087d4e-ec76-454a-8d5f-74144f387a03-kube-api-access-lt697\") pod \"auto-csr-approver-29555580-966mx\" (UID: \"4a087d4e-ec76-454a-8d5f-74144f387a03\") " pod="openshift-infra/auto-csr-approver-29555580-966mx" Mar 12 17:00:00 crc kubenswrapper[5184]: I0312 17:00:00.315198 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/901d21c1-a7c3-4a8a-8c75-2724992dd87b-secret-volume\") pod \"collect-profiles-29555580-lmckm\" (UID: \"901d21c1-a7c3-4a8a-8c75-2724992dd87b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555580-lmckm" Mar 12 17:00:00 crc kubenswrapper[5184]: I0312 17:00:00.316708 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/901d21c1-a7c3-4a8a-8c75-2724992dd87b-config-volume\") pod \"collect-profiles-29555580-lmckm\" (UID: \"901d21c1-a7c3-4a8a-8c75-2724992dd87b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555580-lmckm" Mar 12 17:00:00 crc kubenswrapper[5184]: I0312 17:00:00.323273 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/901d21c1-a7c3-4a8a-8c75-2724992dd87b-secret-volume\") pod \"collect-profiles-29555580-lmckm\" (UID: \"901d21c1-a7c3-4a8a-8c75-2724992dd87b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555580-lmckm" Mar 12 17:00:00 
crc kubenswrapper[5184]: I0312 17:00:00.334555 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt697\" (UniqueName: \"kubernetes.io/projected/4a087d4e-ec76-454a-8d5f-74144f387a03-kube-api-access-lt697\") pod \"auto-csr-approver-29555580-966mx\" (UID: \"4a087d4e-ec76-454a-8d5f-74144f387a03\") " pod="openshift-infra/auto-csr-approver-29555580-966mx" Mar 12 17:00:00 crc kubenswrapper[5184]: I0312 17:00:00.342352 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhgtv\" (UniqueName: \"kubernetes.io/projected/901d21c1-a7c3-4a8a-8c75-2724992dd87b-kube-api-access-zhgtv\") pod \"collect-profiles-29555580-lmckm\" (UID: \"901d21c1-a7c3-4a8a-8c75-2724992dd87b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555580-lmckm" Mar 12 17:00:00 crc kubenswrapper[5184]: I0312 17:00:00.478404 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555580-966mx" Mar 12 17:00:00 crc kubenswrapper[5184]: I0312 17:00:00.489621 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555580-lmckm" Mar 12 17:00:00 crc kubenswrapper[5184]: I0312 17:00:00.713486 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555580-lmckm"] Mar 12 17:00:00 crc kubenswrapper[5184]: I0312 17:00:00.921345 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555580-966mx"] Mar 12 17:00:01 crc kubenswrapper[5184]: I0312 17:00:01.563832 5184 generic.go:358] "Generic (PLEG): container finished" podID="901d21c1-a7c3-4a8a-8c75-2724992dd87b" containerID="2d9a6338b8eaac62ca5081e86fbdbff62cd4162277428dd48d25e02af1f72a38" exitCode=0 Mar 12 17:00:01 crc kubenswrapper[5184]: I0312 17:00:01.563960 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555580-lmckm" event={"ID":"901d21c1-a7c3-4a8a-8c75-2724992dd87b","Type":"ContainerDied","Data":"2d9a6338b8eaac62ca5081e86fbdbff62cd4162277428dd48d25e02af1f72a38"} Mar 12 17:00:01 crc kubenswrapper[5184]: I0312 17:00:01.564544 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555580-lmckm" event={"ID":"901d21c1-a7c3-4a8a-8c75-2724992dd87b","Type":"ContainerStarted","Data":"74389a50c407325427c44b4b8c5e041eeb2791dca340607ea297ca5312711e82"} Mar 12 17:00:01 crc kubenswrapper[5184]: I0312 17:00:01.566421 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555580-966mx" event={"ID":"4a087d4e-ec76-454a-8d5f-74144f387a03","Type":"ContainerStarted","Data":"60c077b670ff5dc50f86d8957f1a3b5211b8bfc9aeb2f018284d593c603ba6b1"} Mar 12 17:00:02 crc kubenswrapper[5184]: I0312 17:00:02.836169 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555580-lmckm" Mar 12 17:00:02 crc kubenswrapper[5184]: I0312 17:00:02.948533 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/901d21c1-a7c3-4a8a-8c75-2724992dd87b-config-volume\") pod \"901d21c1-a7c3-4a8a-8c75-2724992dd87b\" (UID: \"901d21c1-a7c3-4a8a-8c75-2724992dd87b\") " Mar 12 17:00:02 crc kubenswrapper[5184]: I0312 17:00:02.948773 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhgtv\" (UniqueName: \"kubernetes.io/projected/901d21c1-a7c3-4a8a-8c75-2724992dd87b-kube-api-access-zhgtv\") pod \"901d21c1-a7c3-4a8a-8c75-2724992dd87b\" (UID: \"901d21c1-a7c3-4a8a-8c75-2724992dd87b\") " Mar 12 17:00:02 crc kubenswrapper[5184]: I0312 17:00:02.948828 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/901d21c1-a7c3-4a8a-8c75-2724992dd87b-secret-volume\") pod \"901d21c1-a7c3-4a8a-8c75-2724992dd87b\" (UID: \"901d21c1-a7c3-4a8a-8c75-2724992dd87b\") " Mar 12 17:00:02 crc kubenswrapper[5184]: I0312 17:00:02.949431 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/901d21c1-a7c3-4a8a-8c75-2724992dd87b-config-volume" (OuterVolumeSpecName: "config-volume") pod "901d21c1-a7c3-4a8a-8c75-2724992dd87b" (UID: "901d21c1-a7c3-4a8a-8c75-2724992dd87b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:00:02 crc kubenswrapper[5184]: I0312 17:00:02.958821 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/901d21c1-a7c3-4a8a-8c75-2724992dd87b-kube-api-access-zhgtv" (OuterVolumeSpecName: "kube-api-access-zhgtv") pod "901d21c1-a7c3-4a8a-8c75-2724992dd87b" (UID: "901d21c1-a7c3-4a8a-8c75-2724992dd87b"). 
InnerVolumeSpecName "kube-api-access-zhgtv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:00:02 crc kubenswrapper[5184]: I0312 17:00:02.959645 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/901d21c1-a7c3-4a8a-8c75-2724992dd87b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "901d21c1-a7c3-4a8a-8c75-2724992dd87b" (UID: "901d21c1-a7c3-4a8a-8c75-2724992dd87b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:00:03 crc kubenswrapper[5184]: I0312 17:00:03.050159 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zhgtv\" (UniqueName: \"kubernetes.io/projected/901d21c1-a7c3-4a8a-8c75-2724992dd87b-kube-api-access-zhgtv\") on node \"crc\" DevicePath \"\"" Mar 12 17:00:03 crc kubenswrapper[5184]: I0312 17:00:03.050212 5184 reconciler_common.go:299] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/901d21c1-a7c3-4a8a-8c75-2724992dd87b-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 12 17:00:03 crc kubenswrapper[5184]: I0312 17:00:03.050232 5184 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/901d21c1-a7c3-4a8a-8c75-2724992dd87b-config-volume\") on node \"crc\" DevicePath \"\"" Mar 12 17:00:03 crc kubenswrapper[5184]: I0312 17:00:03.582981 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555580-lmckm" Mar 12 17:00:03 crc kubenswrapper[5184]: I0312 17:00:03.583039 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555580-lmckm" event={"ID":"901d21c1-a7c3-4a8a-8c75-2724992dd87b","Type":"ContainerDied","Data":"74389a50c407325427c44b4b8c5e041eeb2791dca340607ea297ca5312711e82"} Mar 12 17:00:03 crc kubenswrapper[5184]: I0312 17:00:03.583772 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74389a50c407325427c44b4b8c5e041eeb2791dca340607ea297ca5312711e82" Mar 12 17:00:04 crc kubenswrapper[5184]: I0312 17:00:04.591022 5184 generic.go:358] "Generic (PLEG): container finished" podID="4a087d4e-ec76-454a-8d5f-74144f387a03" containerID="243558ef2a97890d490b80f1334c16b699081b563491e7f2f132f0374e30649c" exitCode=0 Mar 12 17:00:04 crc kubenswrapper[5184]: I0312 17:00:04.591160 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555580-966mx" event={"ID":"4a087d4e-ec76-454a-8d5f-74144f387a03","Type":"ContainerDied","Data":"243558ef2a97890d490b80f1334c16b699081b563491e7f2f132f0374e30649c"} Mar 12 17:00:05 crc kubenswrapper[5184]: I0312 17:00:05.883558 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555580-966mx" Mar 12 17:00:05 crc kubenswrapper[5184]: I0312 17:00:05.990207 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lt697\" (UniqueName: \"kubernetes.io/projected/4a087d4e-ec76-454a-8d5f-74144f387a03-kube-api-access-lt697\") pod \"4a087d4e-ec76-454a-8d5f-74144f387a03\" (UID: \"4a087d4e-ec76-454a-8d5f-74144f387a03\") " Mar 12 17:00:05 crc kubenswrapper[5184]: I0312 17:00:05.998279 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a087d4e-ec76-454a-8d5f-74144f387a03-kube-api-access-lt697" (OuterVolumeSpecName: "kube-api-access-lt697") pod "4a087d4e-ec76-454a-8d5f-74144f387a03" (UID: "4a087d4e-ec76-454a-8d5f-74144f387a03"). InnerVolumeSpecName "kube-api-access-lt697". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:00:06 crc kubenswrapper[5184]: I0312 17:00:06.091620 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lt697\" (UniqueName: \"kubernetes.io/projected/4a087d4e-ec76-454a-8d5f-74144f387a03-kube-api-access-lt697\") on node \"crc\" DevicePath \"\"" Mar 12 17:00:06 crc kubenswrapper[5184]: I0312 17:00:06.611799 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555580-966mx" event={"ID":"4a087d4e-ec76-454a-8d5f-74144f387a03","Type":"ContainerDied","Data":"60c077b670ff5dc50f86d8957f1a3b5211b8bfc9aeb2f018284d593c603ba6b1"} Mar 12 17:00:06 crc kubenswrapper[5184]: I0312 17:00:06.611872 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60c077b670ff5dc50f86d8957f1a3b5211b8bfc9aeb2f018284d593c603ba6b1" Mar 12 17:00:06 crc kubenswrapper[5184]: I0312 17:00:06.611970 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555580-966mx" Mar 12 17:00:06 crc kubenswrapper[5184]: I0312 17:00:06.977181 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555574-n2v5k"] Mar 12 17:00:06 crc kubenswrapper[5184]: I0312 17:00:06.986073 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555574-n2v5k"] Mar 12 17:00:08 crc kubenswrapper[5184]: I0312 17:00:08.412611 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6301ddeb-2d36-4015-b622-c1fc9acaeac4" path="/var/lib/kubelet/pods/6301ddeb-2d36-4015-b622-c1fc9acaeac4/volumes" Mar 12 17:00:58 crc kubenswrapper[5184]: I0312 17:00:58.680080 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log" Mar 12 17:00:58 crc kubenswrapper[5184]: I0312 17:00:58.684910 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log" Mar 12 17:00:58 crc kubenswrapper[5184]: I0312 17:00:58.718959 5184 scope.go:117] "RemoveContainer" containerID="a1457d37fd42b339caff33900bb0fc56e005536745fa570b93a0435c8f9b4f8b" Mar 12 17:01:26 crc kubenswrapper[5184]: I0312 17:01:26.797276 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5d9d95bf5b-47lwd"] Mar 12 17:01:26 crc kubenswrapper[5184]: I0312 17:01:26.798212 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4a087d4e-ec76-454a-8d5f-74144f387a03" containerName="oc" Mar 12 17:01:26 crc kubenswrapper[5184]: I0312 17:01:26.798225 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a087d4e-ec76-454a-8d5f-74144f387a03" containerName="oc" Mar 12 17:01:26 crc kubenswrapper[5184]: I0312 17:01:26.798252 5184 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="901d21c1-a7c3-4a8a-8c75-2724992dd87b" containerName="collect-profiles" Mar 12 17:01:26 crc kubenswrapper[5184]: I0312 17:01:26.798258 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="901d21c1-a7c3-4a8a-8c75-2724992dd87b" containerName="collect-profiles" Mar 12 17:01:26 crc kubenswrapper[5184]: I0312 17:01:26.798360 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="901d21c1-a7c3-4a8a-8c75-2724992dd87b" containerName="collect-profiles" Mar 12 17:01:26 crc kubenswrapper[5184]: I0312 17:01:26.798384 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="4a087d4e-ec76-454a-8d5f-74144f387a03" containerName="oc" Mar 12 17:01:26 crc kubenswrapper[5184]: I0312 17:01:26.801295 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5d9d95bf5b-47lwd" Mar 12 17:01:26 crc kubenswrapper[5184]: I0312 17:01:26.837802 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5d9d95bf5b-47lwd"] Mar 12 17:01:26 crc kubenswrapper[5184]: I0312 17:01:26.940343 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7c987b06-e4de-4f80-9540-a8598ca5eed1-registry-certificates\") pod \"image-registry-5d9d95bf5b-47lwd\" (UID: \"7c987b06-e4de-4f80-9540-a8598ca5eed1\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-47lwd" Mar 12 17:01:26 crc kubenswrapper[5184]: I0312 17:01:26.940406 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7c987b06-e4de-4f80-9540-a8598ca5eed1-registry-tls\") pod \"image-registry-5d9d95bf5b-47lwd\" (UID: \"7c987b06-e4de-4f80-9540-a8598ca5eed1\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-47lwd" Mar 12 
17:01:26 crc kubenswrapper[5184]: I0312 17:01:26.940431 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-5d9d95bf5b-47lwd\" (UID: \"7c987b06-e4de-4f80-9540-a8598ca5eed1\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-47lwd" Mar 12 17:01:26 crc kubenswrapper[5184]: I0312 17:01:26.940605 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7c987b06-e4de-4f80-9540-a8598ca5eed1-bound-sa-token\") pod \"image-registry-5d9d95bf5b-47lwd\" (UID: \"7c987b06-e4de-4f80-9540-a8598ca5eed1\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-47lwd" Mar 12 17:01:26 crc kubenswrapper[5184]: I0312 17:01:26.940646 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7c987b06-e4de-4f80-9540-a8598ca5eed1-installation-pull-secrets\") pod \"image-registry-5d9d95bf5b-47lwd\" (UID: \"7c987b06-e4de-4f80-9540-a8598ca5eed1\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-47lwd" Mar 12 17:01:26 crc kubenswrapper[5184]: I0312 17:01:26.940693 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxkdg\" (UniqueName: \"kubernetes.io/projected/7c987b06-e4de-4f80-9540-a8598ca5eed1-kube-api-access-vxkdg\") pod \"image-registry-5d9d95bf5b-47lwd\" (UID: \"7c987b06-e4de-4f80-9540-a8598ca5eed1\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-47lwd" Mar 12 17:01:26 crc kubenswrapper[5184]: I0312 17:01:26.940752 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/7c987b06-e4de-4f80-9540-a8598ca5eed1-trusted-ca\") pod \"image-registry-5d9d95bf5b-47lwd\" (UID: \"7c987b06-e4de-4f80-9540-a8598ca5eed1\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-47lwd" Mar 12 17:01:26 crc kubenswrapper[5184]: I0312 17:01:26.940781 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7c987b06-e4de-4f80-9540-a8598ca5eed1-ca-trust-extracted\") pod \"image-registry-5d9d95bf5b-47lwd\" (UID: \"7c987b06-e4de-4f80-9540-a8598ca5eed1\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-47lwd" Mar 12 17:01:26 crc kubenswrapper[5184]: I0312 17:01:26.968124 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-5d9d95bf5b-47lwd\" (UID: \"7c987b06-e4de-4f80-9540-a8598ca5eed1\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-47lwd" Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.042673 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vxkdg\" (UniqueName: \"kubernetes.io/projected/7c987b06-e4de-4f80-9540-a8598ca5eed1-kube-api-access-vxkdg\") pod \"image-registry-5d9d95bf5b-47lwd\" (UID: \"7c987b06-e4de-4f80-9540-a8598ca5eed1\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-47lwd" Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.042801 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7c987b06-e4de-4f80-9540-a8598ca5eed1-trusted-ca\") pod \"image-registry-5d9d95bf5b-47lwd\" (UID: \"7c987b06-e4de-4f80-9540-a8598ca5eed1\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-47lwd" Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.042837 
5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7c987b06-e4de-4f80-9540-a8598ca5eed1-ca-trust-extracted\") pod \"image-registry-5d9d95bf5b-47lwd\" (UID: \"7c987b06-e4de-4f80-9540-a8598ca5eed1\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-47lwd" Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.042927 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7c987b06-e4de-4f80-9540-a8598ca5eed1-registry-certificates\") pod \"image-registry-5d9d95bf5b-47lwd\" (UID: \"7c987b06-e4de-4f80-9540-a8598ca5eed1\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-47lwd" Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.042976 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7c987b06-e4de-4f80-9540-a8598ca5eed1-registry-tls\") pod \"image-registry-5d9d95bf5b-47lwd\" (UID: \"7c987b06-e4de-4f80-9540-a8598ca5eed1\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-47lwd" Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.043042 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7c987b06-e4de-4f80-9540-a8598ca5eed1-bound-sa-token\") pod \"image-registry-5d9d95bf5b-47lwd\" (UID: \"7c987b06-e4de-4f80-9540-a8598ca5eed1\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-47lwd" Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.043263 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7c987b06-e4de-4f80-9540-a8598ca5eed1-installation-pull-secrets\") pod \"image-registry-5d9d95bf5b-47lwd\" (UID: \"7c987b06-e4de-4f80-9540-a8598ca5eed1\") " 
pod="openshift-image-registry/image-registry-5d9d95bf5b-47lwd" Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.043742 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7c987b06-e4de-4f80-9540-a8598ca5eed1-ca-trust-extracted\") pod \"image-registry-5d9d95bf5b-47lwd\" (UID: \"7c987b06-e4de-4f80-9540-a8598ca5eed1\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-47lwd" Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.043939 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7c987b06-e4de-4f80-9540-a8598ca5eed1-trusted-ca\") pod \"image-registry-5d9d95bf5b-47lwd\" (UID: \"7c987b06-e4de-4f80-9540-a8598ca5eed1\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-47lwd" Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.044478 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7c987b06-e4de-4f80-9540-a8598ca5eed1-registry-certificates\") pod \"image-registry-5d9d95bf5b-47lwd\" (UID: \"7c987b06-e4de-4f80-9540-a8598ca5eed1\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-47lwd" Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.050293 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7c987b06-e4de-4f80-9540-a8598ca5eed1-installation-pull-secrets\") pod \"image-registry-5d9d95bf5b-47lwd\" (UID: \"7c987b06-e4de-4f80-9540-a8598ca5eed1\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-47lwd" Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.050410 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7c987b06-e4de-4f80-9540-a8598ca5eed1-registry-tls\") pod \"image-registry-5d9d95bf5b-47lwd\" (UID: 
\"7c987b06-e4de-4f80-9540-a8598ca5eed1\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-47lwd" Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.063757 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7c987b06-e4de-4f80-9540-a8598ca5eed1-bound-sa-token\") pod \"image-registry-5d9d95bf5b-47lwd\" (UID: \"7c987b06-e4de-4f80-9540-a8598ca5eed1\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-47lwd" Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.065503 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxkdg\" (UniqueName: \"kubernetes.io/projected/7c987b06-e4de-4f80-9540-a8598ca5eed1-kube-api-access-vxkdg\") pod \"image-registry-5d9d95bf5b-47lwd\" (UID: \"7c987b06-e4de-4f80-9540-a8598ca5eed1\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-47lwd" Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.116638 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5d9d95bf5b-47lwd" Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.371426 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f9fdd5dd5-hmzwt"] Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.386116 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f9fdd5dd5-hmzwt" Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.389353 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.389418 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-hlwft\"" Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.393115 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.393747 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5d9d95bf5b-47lwd"] Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.397435 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-7b8b89f89d-wb6fn"] Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.402814 5184 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.403294 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-7b8b89f89d-wb6fn" Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.409985 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-mvlg5\"" Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.416592 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f9fdd5dd5-hmzwt"] Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.424365 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7b8b89f89d-wb6fn"] Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.430467 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-769f6b94cb-n4sng"] Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.437272 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-769f6b94cb-n4sng" Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.439733 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-769f6b94cb-n4sng"] Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.440946 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-gcwbl\"" Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.464138 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6l6k\" (UniqueName: \"kubernetes.io/projected/0bcd5938-7f0c-4d2d-8f96-ef9933012381-kube-api-access-p6l6k\") pod \"cert-manager-cainjector-7f9fdd5dd5-hmzwt\" (UID: \"0bcd5938-7f0c-4d2d-8f96-ef9933012381\") " pod="cert-manager/cert-manager-cainjector-7f9fdd5dd5-hmzwt" Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.464202 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lv2r\" 
(UniqueName: \"kubernetes.io/projected/f433a5f6-b857-462d-9896-fcbf044da648-kube-api-access-2lv2r\") pod \"cert-manager-webhook-769f6b94cb-n4sng\" (UID: \"f433a5f6-b857-462d-9896-fcbf044da648\") " pod="cert-manager/cert-manager-webhook-769f6b94cb-n4sng" Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.464257 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2pv7\" (UniqueName: \"kubernetes.io/projected/c25e7c99-ce20-4a74-84fc-fb24c01c931b-kube-api-access-c2pv7\") pod \"cert-manager-7b8b89f89d-wb6fn\" (UID: \"c25e7c99-ce20-4a74-84fc-fb24c01c931b\") " pod="cert-manager/cert-manager-7b8b89f89d-wb6fn" Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.565880 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p6l6k\" (UniqueName: \"kubernetes.io/projected/0bcd5938-7f0c-4d2d-8f96-ef9933012381-kube-api-access-p6l6k\") pod \"cert-manager-cainjector-7f9fdd5dd5-hmzwt\" (UID: \"0bcd5938-7f0c-4d2d-8f96-ef9933012381\") " pod="cert-manager/cert-manager-cainjector-7f9fdd5dd5-hmzwt" Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.565945 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2lv2r\" (UniqueName: \"kubernetes.io/projected/f433a5f6-b857-462d-9896-fcbf044da648-kube-api-access-2lv2r\") pod \"cert-manager-webhook-769f6b94cb-n4sng\" (UID: \"f433a5f6-b857-462d-9896-fcbf044da648\") " pod="cert-manager/cert-manager-webhook-769f6b94cb-n4sng" Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.565981 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c2pv7\" (UniqueName: \"kubernetes.io/projected/c25e7c99-ce20-4a74-84fc-fb24c01c931b-kube-api-access-c2pv7\") pod \"cert-manager-7b8b89f89d-wb6fn\" (UID: \"c25e7c99-ce20-4a74-84fc-fb24c01c931b\") " pod="cert-manager/cert-manager-7b8b89f89d-wb6fn" Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 
17:01:27.584980 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6l6k\" (UniqueName: \"kubernetes.io/projected/0bcd5938-7f0c-4d2d-8f96-ef9933012381-kube-api-access-p6l6k\") pod \"cert-manager-cainjector-7f9fdd5dd5-hmzwt\" (UID: \"0bcd5938-7f0c-4d2d-8f96-ef9933012381\") " pod="cert-manager/cert-manager-cainjector-7f9fdd5dd5-hmzwt" Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.586572 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lv2r\" (UniqueName: \"kubernetes.io/projected/f433a5f6-b857-462d-9896-fcbf044da648-kube-api-access-2lv2r\") pod \"cert-manager-webhook-769f6b94cb-n4sng\" (UID: \"f433a5f6-b857-462d-9896-fcbf044da648\") " pod="cert-manager/cert-manager-webhook-769f6b94cb-n4sng" Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.586886 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2pv7\" (UniqueName: \"kubernetes.io/projected/c25e7c99-ce20-4a74-84fc-fb24c01c931b-kube-api-access-c2pv7\") pod \"cert-manager-7b8b89f89d-wb6fn\" (UID: \"c25e7c99-ce20-4a74-84fc-fb24c01c931b\") " pod="cert-manager/cert-manager-7b8b89f89d-wb6fn" Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.705508 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f9fdd5dd5-hmzwt" Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.724957 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-7b8b89f89d-wb6fn" Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.753660 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-769f6b94cb-n4sng" Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.919031 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f9fdd5dd5-hmzwt"] Mar 12 17:01:27 crc kubenswrapper[5184]: W0312 17:01:27.924406 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0bcd5938_7f0c_4d2d_8f96_ef9933012381.slice/crio-5cde12cec6555c31df84b2368e778eaf6719212725ead1d03b41f5704dfc971e WatchSource:0}: Error finding container 5cde12cec6555c31df84b2368e778eaf6719212725ead1d03b41f5704dfc971e: Status 404 returned error can't find the container with id 5cde12cec6555c31df84b2368e778eaf6719212725ead1d03b41f5704dfc971e Mar 12 17:01:27 crc kubenswrapper[5184]: I0312 17:01:27.962142 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7b8b89f89d-wb6fn"] Mar 12 17:01:27 crc kubenswrapper[5184]: W0312 17:01:27.974364 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc25e7c99_ce20_4a74_84fc_fb24c01c931b.slice/crio-c85f1aab70ea033354b1bccd745599c494afa51b138a36aa200a0882d7e7c7fb WatchSource:0}: Error finding container c85f1aab70ea033354b1bccd745599c494afa51b138a36aa200a0882d7e7c7fb: Status 404 returned error can't find the container with id c85f1aab70ea033354b1bccd745599c494afa51b138a36aa200a0882d7e7c7fb Mar 12 17:01:28 crc kubenswrapper[5184]: I0312 17:01:28.000550 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-769f6b94cb-n4sng"] Mar 12 17:01:28 crc kubenswrapper[5184]: W0312 17:01:28.009132 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf433a5f6_b857_462d_9896_fcbf044da648.slice/crio-6ab18c55927de2551ac30bcc9e52965c86422b6234b84363fd41112665bd9e04 
WatchSource:0}: Error finding container 6ab18c55927de2551ac30bcc9e52965c86422b6234b84363fd41112665bd9e04: Status 404 returned error can't find the container with id 6ab18c55927de2551ac30bcc9e52965c86422b6234b84363fd41112665bd9e04 Mar 12 17:01:28 crc kubenswrapper[5184]: I0312 17:01:28.186897 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7b8b89f89d-wb6fn" event={"ID":"c25e7c99-ce20-4a74-84fc-fb24c01c931b","Type":"ContainerStarted","Data":"c85f1aab70ea033354b1bccd745599c494afa51b138a36aa200a0882d7e7c7fb"} Mar 12 17:01:28 crc kubenswrapper[5184]: I0312 17:01:28.188694 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5d9d95bf5b-47lwd" event={"ID":"7c987b06-e4de-4f80-9540-a8598ca5eed1","Type":"ContainerStarted","Data":"21a5ce31929f6095a00b7cd8c0ac9f789fb45e6b545fe8ef2b24232e6e58b295"} Mar 12 17:01:28 crc kubenswrapper[5184]: I0312 17:01:28.188750 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5d9d95bf5b-47lwd" event={"ID":"7c987b06-e4de-4f80-9540-a8598ca5eed1","Type":"ContainerStarted","Data":"31cd00e16ae66dfd198c18f605b3c9deb6c8f304f4eb236c8c40aad24ba3fb6e"} Mar 12 17:01:28 crc kubenswrapper[5184]: I0312 17:01:28.190338 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f9fdd5dd5-hmzwt" event={"ID":"0bcd5938-7f0c-4d2d-8f96-ef9933012381","Type":"ContainerStarted","Data":"5cde12cec6555c31df84b2368e778eaf6719212725ead1d03b41f5704dfc971e"} Mar 12 17:01:28 crc kubenswrapper[5184]: I0312 17:01:28.191645 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-769f6b94cb-n4sng" event={"ID":"f433a5f6-b857-462d-9896-fcbf044da648","Type":"ContainerStarted","Data":"6ab18c55927de2551ac30bcc9e52965c86422b6234b84363fd41112665bd9e04"} Mar 12 17:01:28 crc kubenswrapper[5184]: I0312 17:01:28.209823 5184 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-image-registry/image-registry-5d9d95bf5b-47lwd" podStartSLOduration=2.209800884 podStartE2EDuration="2.209800884s" podCreationTimestamp="2026-03-12 17:01:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:01:28.207941874 +0000 UTC m=+630.749253223" watchObservedRunningTime="2026-03-12 17:01:28.209800884 +0000 UTC m=+630.751112223" Mar 12 17:01:29 crc kubenswrapper[5184]: I0312 17:01:29.197251 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5d9d95bf5b-47lwd" Mar 12 17:01:31 crc kubenswrapper[5184]: I0312 17:01:31.207988 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7b8b89f89d-wb6fn" event={"ID":"c25e7c99-ce20-4a74-84fc-fb24c01c931b","Type":"ContainerStarted","Data":"2d1199a8d47b776c6318c54b18c3ffeef4bfb9a70293556e21d2150311d13e9b"} Mar 12 17:01:31 crc kubenswrapper[5184]: I0312 17:01:31.210719 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f9fdd5dd5-hmzwt" event={"ID":"0bcd5938-7f0c-4d2d-8f96-ef9933012381","Type":"ContainerStarted","Data":"86c39fac0ea12e7251a968f78e8a92f40bab443a0e05952c404499536ef109c9"} Mar 12 17:01:31 crc kubenswrapper[5184]: I0312 17:01:31.229108 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-7b8b89f89d-wb6fn" podStartSLOduration=1.3246386829999999 podStartE2EDuration="4.229086186s" podCreationTimestamp="2026-03-12 17:01:27 +0000 UTC" firstStartedPulling="2026-03-12 17:01:27.977756769 +0000 UTC m=+630.519068108" lastFinishedPulling="2026-03-12 17:01:30.882204242 +0000 UTC m=+633.423515611" observedRunningTime="2026-03-12 17:01:31.225832222 +0000 UTC m=+633.767143641" watchObservedRunningTime="2026-03-12 17:01:31.229086186 +0000 UTC m=+633.770397525" Mar 12 17:01:31 crc kubenswrapper[5184]: I0312 
17:01:31.241186 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f9fdd5dd5-hmzwt" podStartSLOduration=1.27388594 podStartE2EDuration="4.24116753s" podCreationTimestamp="2026-03-12 17:01:27 +0000 UTC" firstStartedPulling="2026-03-12 17:01:27.926676616 +0000 UTC m=+630.467987955" lastFinishedPulling="2026-03-12 17:01:30.893958176 +0000 UTC m=+633.435269545" observedRunningTime="2026-03-12 17:01:31.238928739 +0000 UTC m=+633.780240088" watchObservedRunningTime="2026-03-12 17:01:31.24116753 +0000 UTC m=+633.782478879" Mar 12 17:01:32 crc kubenswrapper[5184]: I0312 17:01:32.220634 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-769f6b94cb-n4sng" event={"ID":"f433a5f6-b857-462d-9896-fcbf044da648","Type":"ContainerStarted","Data":"1389e4e8e9257a393d51c79a093452b173c939a2f7131568888673e49785659b"} Mar 12 17:01:32 crc kubenswrapper[5184]: I0312 17:01:32.220754 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-769f6b94cb-n4sng" Mar 12 17:01:37 crc kubenswrapper[5184]: I0312 17:01:37.561677 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-769f6b94cb-n4sng" podStartSLOduration=6.63700202 podStartE2EDuration="10.561645235s" podCreationTimestamp="2026-03-12 17:01:27 +0000 UTC" firstStartedPulling="2026-03-12 17:01:28.014653941 +0000 UTC m=+630.555965270" lastFinishedPulling="2026-03-12 17:01:31.939297146 +0000 UTC m=+634.480608485" observedRunningTime="2026-03-12 17:01:32.246550031 +0000 UTC m=+634.787861410" watchObservedRunningTime="2026-03-12 17:01:37.561645235 +0000 UTC m=+640.102956614" Mar 12 17:01:37 crc kubenswrapper[5184]: I0312 17:01:37.566942 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-wqfhs"] Mar 12 17:01:37 crc kubenswrapper[5184]: I0312 17:01:37.567363 
5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-wqfhs" podUID="417740d6-e9c9-4fa8-9811-c6704b5b5692" containerName="kube-rbac-proxy" containerID="cri-o://0decdc2957500fde0ebec04f2af81d8693c23a402cc1e67269ddab6d50d45e90" gracePeriod=30 Mar 12 17:01:37 crc kubenswrapper[5184]: I0312 17:01:37.567540 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-wqfhs" podUID="417740d6-e9c9-4fa8-9811-c6704b5b5692" containerName="ovnkube-cluster-manager" containerID="cri-o://26e6ed010b3a45f9e22b53a634f57881a78d7fcc479f407d448349cc1784ad10" gracePeriod=30 Mar 12 17:01:37 crc kubenswrapper[5184]: I0312 17:01:37.762987 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-6bpj2"] Mar 12 17:01:37 crc kubenswrapper[5184]: I0312 17:01:37.763890 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" podUID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" containerName="nbdb" containerID="cri-o://b5551614d52dd3e27aff5fc716fc02737aaff91dccc1b0d3189d3551acee14bd" gracePeriod=30 Mar 12 17:01:37 crc kubenswrapper[5184]: I0312 17:01:37.764034 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" podUID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://137805551ee1701e7b11914be119573edd67e16a5d17f27e8db1afd4193c704c" gracePeriod=30 Mar 12 17:01:37 crc kubenswrapper[5184]: I0312 17:01:37.764034 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" podUID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" containerName="northd" containerID="cri-o://6434122f7c641858f5b536148b7b16ffbae6899c622187bdf0cf6950a6443cda" gracePeriod=30 Mar 12 
17:01:37 crc kubenswrapper[5184]: I0312 17:01:37.764034 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" podUID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" containerName="sbdb" containerID="cri-o://1b21854960b562aaf97c4c6926185e03be4f730eeb15c267c9390b7742c5ab5f" gracePeriod=30 Mar 12 17:01:37 crc kubenswrapper[5184]: I0312 17:01:37.764134 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" podUID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" containerName="kube-rbac-proxy-node" containerID="cri-o://0842b1614675208f999dbcbfd017fda8250915476eccfdae1fb82b967f386042" gracePeriod=30 Mar 12 17:01:37 crc kubenswrapper[5184]: I0312 17:01:37.764101 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" podUID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" containerName="ovn-acl-logging" containerID="cri-o://7f9bcef2ca3e2408e97837aedca5bd6c0be5c1f90a4ef1c715600d6c0e5e4efe" gracePeriod=30 Mar 12 17:01:37 crc kubenswrapper[5184]: I0312 17:01:37.764916 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" podUID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" containerName="ovn-controller" containerID="cri-o://bd6d5062f62d0471be109f59bfefc90111d17b18d4af16fedb005bd7ae2e6e40" gracePeriod=30 Mar 12 17:01:37 crc kubenswrapper[5184]: I0312 17:01:37.801688 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-wqfhs" Mar 12 17:01:37 crc kubenswrapper[5184]: I0312 17:01:37.810843 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" podUID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" containerName="ovnkube-controller" containerID="cri-o://22829e8a929348996fb7656dc6de7050f92b0f8399db8f10683c6d95cc1d8c42" gracePeriod=30 Mar 12 17:01:37 crc kubenswrapper[5184]: I0312 17:01:37.843887 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-n9gf7"] Mar 12 17:01:37 crc kubenswrapper[5184]: I0312 17:01:37.845076 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="417740d6-e9c9-4fa8-9811-c6704b5b5692" containerName="ovnkube-cluster-manager" Mar 12 17:01:37 crc kubenswrapper[5184]: I0312 17:01:37.845226 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="417740d6-e9c9-4fa8-9811-c6704b5b5692" containerName="ovnkube-cluster-manager" Mar 12 17:01:37 crc kubenswrapper[5184]: I0312 17:01:37.845363 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="417740d6-e9c9-4fa8-9811-c6704b5b5692" containerName="kube-rbac-proxy" Mar 12 17:01:37 crc kubenswrapper[5184]: I0312 17:01:37.845510 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="417740d6-e9c9-4fa8-9811-c6704b5b5692" containerName="kube-rbac-proxy" Mar 12 17:01:37 crc kubenswrapper[5184]: I0312 17:01:37.845803 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="417740d6-e9c9-4fa8-9811-c6704b5b5692" containerName="ovnkube-cluster-manager" Mar 12 17:01:37 crc kubenswrapper[5184]: I0312 17:01:37.845949 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="417740d6-e9c9-4fa8-9811-c6704b5b5692" containerName="kube-rbac-proxy" Mar 12 17:01:37 crc kubenswrapper[5184]: I0312 17:01:37.851068 5184 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-n9gf7" Mar 12 17:01:37 crc kubenswrapper[5184]: I0312 17:01:37.934047 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/417740d6-e9c9-4fa8-9811-c6704b5b5692-env-overrides\") pod \"417740d6-e9c9-4fa8-9811-c6704b5b5692\" (UID: \"417740d6-e9c9-4fa8-9811-c6704b5b5692\") " Mar 12 17:01:37 crc kubenswrapper[5184]: I0312 17:01:37.934342 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/417740d6-e9c9-4fa8-9811-c6704b5b5692-ovnkube-config\") pod \"417740d6-e9c9-4fa8-9811-c6704b5b5692\" (UID: \"417740d6-e9c9-4fa8-9811-c6704b5b5692\") " Mar 12 17:01:37 crc kubenswrapper[5184]: I0312 17:01:37.934600 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wf2p5\" (UniqueName: \"kubernetes.io/projected/417740d6-e9c9-4fa8-9811-c6704b5b5692-kube-api-access-wf2p5\") pod \"417740d6-e9c9-4fa8-9811-c6704b5b5692\" (UID: \"417740d6-e9c9-4fa8-9811-c6704b5b5692\") " Mar 12 17:01:37 crc kubenswrapper[5184]: I0312 17:01:37.934652 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/417740d6-e9c9-4fa8-9811-c6704b5b5692-ovn-control-plane-metrics-cert\") pod \"417740d6-e9c9-4fa8-9811-c6704b5b5692\" (UID: \"417740d6-e9c9-4fa8-9811-c6704b5b5692\") " Mar 12 17:01:37 crc kubenswrapper[5184]: I0312 17:01:37.934876 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/417740d6-e9c9-4fa8-9811-c6704b5b5692-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "417740d6-e9c9-4fa8-9811-c6704b5b5692" (UID: "417740d6-e9c9-4fa8-9811-c6704b5b5692"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:01:37 crc kubenswrapper[5184]: I0312 17:01:37.934899 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/417740d6-e9c9-4fa8-9811-c6704b5b5692-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "417740d6-e9c9-4fa8-9811-c6704b5b5692" (UID: "417740d6-e9c9-4fa8-9811-c6704b5b5692"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:01:37 crc kubenswrapper[5184]: I0312 17:01:37.934955 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/79ba7bc0-9f82-4b02-9f50-f949c46f52a6-ovnkube-config\") pod \"ovnkube-control-plane-97c9b6c48-n9gf7\" (UID: \"79ba7bc0-9f82-4b02-9f50-f949c46f52a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-n9gf7" Mar 12 17:01:37 crc kubenswrapper[5184]: I0312 17:01:37.934978 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x24v\" (UniqueName: \"kubernetes.io/projected/79ba7bc0-9f82-4b02-9f50-f949c46f52a6-kube-api-access-5x24v\") pod \"ovnkube-control-plane-97c9b6c48-n9gf7\" (UID: \"79ba7bc0-9f82-4b02-9f50-f949c46f52a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-n9gf7" Mar 12 17:01:37 crc kubenswrapper[5184]: I0312 17:01:37.935032 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/79ba7bc0-9f82-4b02-9f50-f949c46f52a6-env-overrides\") pod \"ovnkube-control-plane-97c9b6c48-n9gf7\" (UID: \"79ba7bc0-9f82-4b02-9f50-f949c46f52a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-n9gf7" Mar 12 17:01:37 crc kubenswrapper[5184]: I0312 17:01:37.935423 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/79ba7bc0-9f82-4b02-9f50-f949c46f52a6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-97c9b6c48-n9gf7\" (UID: \"79ba7bc0-9f82-4b02-9f50-f949c46f52a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-n9gf7" Mar 12 17:01:37 crc kubenswrapper[5184]: I0312 17:01:37.935592 5184 reconciler_common.go:299] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/417740d6-e9c9-4fa8-9811-c6704b5b5692-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 12 17:01:37 crc kubenswrapper[5184]: I0312 17:01:37.935609 5184 reconciler_common.go:299] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/417740d6-e9c9-4fa8-9811-c6704b5b5692-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 12 17:01:37 crc kubenswrapper[5184]: I0312 17:01:37.942115 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/417740d6-e9c9-4fa8-9811-c6704b5b5692-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "417740d6-e9c9-4fa8-9811-c6704b5b5692" (UID: "417740d6-e9c9-4fa8-9811-c6704b5b5692"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:01:37 crc kubenswrapper[5184]: I0312 17:01:37.943812 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/417740d6-e9c9-4fa8-9811-c6704b5b5692-kube-api-access-wf2p5" (OuterVolumeSpecName: "kube-api-access-wf2p5") pod "417740d6-e9c9-4fa8-9811-c6704b5b5692" (UID: "417740d6-e9c9-4fa8-9811-c6704b5b5692"). InnerVolumeSpecName "kube-api-access-wf2p5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.036399 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/79ba7bc0-9f82-4b02-9f50-f949c46f52a6-ovnkube-config\") pod \"ovnkube-control-plane-97c9b6c48-n9gf7\" (UID: \"79ba7bc0-9f82-4b02-9f50-f949c46f52a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-n9gf7" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.036435 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5x24v\" (UniqueName: \"kubernetes.io/projected/79ba7bc0-9f82-4b02-9f50-f949c46f52a6-kube-api-access-5x24v\") pod \"ovnkube-control-plane-97c9b6c48-n9gf7\" (UID: \"79ba7bc0-9f82-4b02-9f50-f949c46f52a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-n9gf7" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.036467 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/79ba7bc0-9f82-4b02-9f50-f949c46f52a6-env-overrides\") pod \"ovnkube-control-plane-97c9b6c48-n9gf7\" (UID: \"79ba7bc0-9f82-4b02-9f50-f949c46f52a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-n9gf7" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.036496 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/79ba7bc0-9f82-4b02-9f50-f949c46f52a6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-97c9b6c48-n9gf7\" (UID: \"79ba7bc0-9f82-4b02-9f50-f949c46f52a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-n9gf7" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.036536 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wf2p5\" (UniqueName: 
\"kubernetes.io/projected/417740d6-e9c9-4fa8-9811-c6704b5b5692-kube-api-access-wf2p5\") on node \"crc\" DevicePath \"\"" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.036546 5184 reconciler_common.go:299] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/417740d6-e9c9-4fa8-9811-c6704b5b5692-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.037173 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/79ba7bc0-9f82-4b02-9f50-f949c46f52a6-ovnkube-config\") pod \"ovnkube-control-plane-97c9b6c48-n9gf7\" (UID: \"79ba7bc0-9f82-4b02-9f50-f949c46f52a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-n9gf7" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.037557 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/79ba7bc0-9f82-4b02-9f50-f949c46f52a6-env-overrides\") pod \"ovnkube-control-plane-97c9b6c48-n9gf7\" (UID: \"79ba7bc0-9f82-4b02-9f50-f949c46f52a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-n9gf7" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.040242 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/79ba7bc0-9f82-4b02-9f50-f949c46f52a6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-97c9b6c48-n9gf7\" (UID: \"79ba7bc0-9f82-4b02-9f50-f949c46f52a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-n9gf7" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.057732 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x24v\" (UniqueName: \"kubernetes.io/projected/79ba7bc0-9f82-4b02-9f50-f949c46f52a6-kube-api-access-5x24v\") pod \"ovnkube-control-plane-97c9b6c48-n9gf7\" 
(UID: \"79ba7bc0-9f82-4b02-9f50-f949c46f52a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-n9gf7" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.060035 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bpj2_a92c8326-e582-4692-8b35-c5d5dbc1ff6c/ovn-acl-logging/0.log" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.060487 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bpj2_a92c8326-e582-4692-8b35-c5d5dbc1ff6c/ovn-controller/0.log" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.060893 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.116602 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jcj28"] Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.117149 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" containerName="kubecfg-setup" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.117168 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" containerName="kubecfg-setup" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.117183 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" containerName="ovnkube-controller" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.117189 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" containerName="ovnkube-controller" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.117197 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" containerName="northd" Mar 12 17:01:38 crc 
kubenswrapper[5184]: I0312 17:01:38.117203 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" containerName="northd" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.117210 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" containerName="nbdb" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.117215 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" containerName="nbdb" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.117224 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" containerName="kube-rbac-proxy-ovn-metrics" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.117232 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" containerName="kube-rbac-proxy-ovn-metrics" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.117243 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" containerName="sbdb" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.117248 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" containerName="sbdb" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.117257 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" containerName="ovn-acl-logging" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.117262 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" containerName="ovn-acl-logging" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.117271 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" 
containerName="ovn-controller" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.117276 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" containerName="ovn-controller" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.117286 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" containerName="kube-rbac-proxy-node" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.117291 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" containerName="kube-rbac-proxy-node" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.117411 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" containerName="ovn-acl-logging" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.117422 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" containerName="ovn-controller" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.117430 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" containerName="sbdb" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.117437 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" containerName="northd" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.117444 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" containerName="nbdb" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.117451 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" containerName="ovnkube-controller" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.117459 5184 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" containerName="kube-rbac-proxy-node" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.117467 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" containerName="kube-rbac-proxy-ovn-metrics" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.122874 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.137998 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-etc-openvswitch\") pod \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.138039 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-host-cni-bin\") pod \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.138083 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-var-lib-openvswitch\") pod \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.138121 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qz4cp\" (UniqueName: \"kubernetes.io/projected/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-kube-api-access-qz4cp\") pod \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.138156 5184 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-run-openvswitch\") pod \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.138188 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-node-log\") pod \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.138239 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-host-run-ovn-kubernetes\") pod \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.138263 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-ovn-node-metrics-cert\") pod \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.138306 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-run-ovn\") pod \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.138335 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-env-overrides\") pod 
\"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.138350 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-host-kubelet\") pod \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.138403 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-host-cni-netd\") pod \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.138416 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-run-systemd\") pod \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.138469 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-log-socket\") pod \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.138504 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-systemd-units\") pod \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.138549 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-ovnkube-script-lib\") pod \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.138563 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.138585 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-ovnkube-config\") pod \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.138625 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-host-run-netns\") pod \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.138649 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-host-slash\") pod \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\" (UID: \"a92c8326-e582-4692-8b35-c5d5dbc1ff6c\") " Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.138662 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "a92c8326-e582-4692-8b35-c5d5dbc1ff6c" (UID: 
"a92c8326-e582-4692-8b35-c5d5dbc1ff6c"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.138709 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "a92c8326-e582-4692-8b35-c5d5dbc1ff6c" (UID: "a92c8326-e582-4692-8b35-c5d5dbc1ff6c"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.138727 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "a92c8326-e582-4692-8b35-c5d5dbc1ff6c" (UID: "a92c8326-e582-4692-8b35-c5d5dbc1ff6c"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.138786 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-host-slash" (OuterVolumeSpecName: "host-slash") pod "a92c8326-e582-4692-8b35-c5d5dbc1ff6c" (UID: "a92c8326-e582-4692-8b35-c5d5dbc1ff6c"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.138992 5184 reconciler_common.go:299] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.139004 5184 reconciler_common.go:299] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-host-slash\") on node \"crc\" DevicePath \"\"" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.139014 5184 reconciler_common.go:299] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.139024 5184 reconciler_common.go:299] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.139045 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "a92c8326-e582-4692-8b35-c5d5dbc1ff6c" (UID: "a92c8326-e582-4692-8b35-c5d5dbc1ff6c"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.139452 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "a92c8326-e582-4692-8b35-c5d5dbc1ff6c" (UID: "a92c8326-e582-4692-8b35-c5d5dbc1ff6c"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.139508 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "a92c8326-e582-4692-8b35-c5d5dbc1ff6c" (UID: "a92c8326-e582-4692-8b35-c5d5dbc1ff6c"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.139528 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "a92c8326-e582-4692-8b35-c5d5dbc1ff6c" (UID: "a92c8326-e582-4692-8b35-c5d5dbc1ff6c"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.141302 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "a92c8326-e582-4692-8b35-c5d5dbc1ff6c" (UID: "a92c8326-e582-4692-8b35-c5d5dbc1ff6c"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.141338 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-node-log" (OuterVolumeSpecName: "node-log") pod "a92c8326-e582-4692-8b35-c5d5dbc1ff6c" (UID: "a92c8326-e582-4692-8b35-c5d5dbc1ff6c"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.141462 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "a92c8326-e582-4692-8b35-c5d5dbc1ff6c" (UID: "a92c8326-e582-4692-8b35-c5d5dbc1ff6c"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.141455 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-log-socket" (OuterVolumeSpecName: "log-socket") pod "a92c8326-e582-4692-8b35-c5d5dbc1ff6c" (UID: "a92c8326-e582-4692-8b35-c5d5dbc1ff6c"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.141499 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "a92c8326-e582-4692-8b35-c5d5dbc1ff6c" (UID: "a92c8326-e582-4692-8b35-c5d5dbc1ff6c"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.141967 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "a92c8326-e582-4692-8b35-c5d5dbc1ff6c" (UID: "a92c8326-e582-4692-8b35-c5d5dbc1ff6c"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.142007 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "a92c8326-e582-4692-8b35-c5d5dbc1ff6c" (UID: "a92c8326-e582-4692-8b35-c5d5dbc1ff6c"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.142474 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "a92c8326-e582-4692-8b35-c5d5dbc1ff6c" (UID: "a92c8326-e582-4692-8b35-c5d5dbc1ff6c"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.143169 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "a92c8326-e582-4692-8b35-c5d5dbc1ff6c" (UID: "a92c8326-e582-4692-8b35-c5d5dbc1ff6c"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.143291 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-kube-api-access-qz4cp" (OuterVolumeSpecName: "kube-api-access-qz4cp") pod "a92c8326-e582-4692-8b35-c5d5dbc1ff6c" (UID: "a92c8326-e582-4692-8b35-c5d5dbc1ff6c"). InnerVolumeSpecName "kube-api-access-qz4cp". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.146722 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "a92c8326-e582-4692-8b35-c5d5dbc1ff6c" (UID: "a92c8326-e582-4692-8b35-c5d5dbc1ff6c"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.154805 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "a92c8326-e582-4692-8b35-c5d5dbc1ff6c" (UID: "a92c8326-e582-4692-8b35-c5d5dbc1ff6c"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.218712 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-n9gf7" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.233113 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-769f6b94cb-n4sng" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.240067 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-node-log\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.240271 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdqm2\" (UniqueName: \"kubernetes.io/projected/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-kube-api-access-bdqm2\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.240474 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-var-lib-openvswitch\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.240631 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc 
kubenswrapper[5184]: I0312 17:01:38.240787 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-env-overrides\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.240927 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-host-kubelet\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.241097 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-host-slash\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.241349 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-systemd-units\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.241614 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-host-run-netns\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc 
kubenswrapper[5184]: I0312 17:01:38.241980 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-host-cni-netd\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.242156 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-ovnkube-config\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.242343 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-run-ovn\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.242583 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-log-socket\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.242734 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-run-systemd\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc 
kubenswrapper[5184]: I0312 17:01:38.242887 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-etc-openvswitch\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.243104 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-host-cni-bin\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.243321 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-host-run-ovn-kubernetes\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.243488 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-ovn-node-metrics-cert\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.243652 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-run-openvswitch\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.243896 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-ovnkube-script-lib\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.244130 5184 reconciler_common.go:299] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.244178 5184 reconciler_common.go:299] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.244205 5184 reconciler_common.go:299] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.244236 5184 reconciler_common.go:299] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.244262 5184 reconciler_common.go:299] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.244286 5184 reconciler_common.go:299] "Volume detached for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.244317 5184 reconciler_common.go:299] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-log-socket\") on node \"crc\" DevicePath \"\"" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.244344 5184 reconciler_common.go:299] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.244370 5184 reconciler_common.go:299] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.244451 5184 reconciler_common.go:299] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.244478 5184 reconciler_common.go:299] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.244504 5184 reconciler_common.go:299] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.244532 5184 reconciler_common.go:299] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.244558 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qz4cp\" (UniqueName: \"kubernetes.io/projected/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-kube-api-access-qz4cp\") on node \"crc\" DevicePath \"\"" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.244583 5184 reconciler_common.go:299] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.244608 5184 reconciler_common.go:299] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a92c8326-e582-4692-8b35-c5d5dbc1ff6c-node-log\") on node \"crc\" DevicePath \"\"" Mar 12 17:01:38 crc kubenswrapper[5184]: W0312 17:01:38.257073 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79ba7bc0_9f82_4b02_9f50_f949c46f52a6.slice/crio-1feb42e11399512c477ddcd5c3f44784766a5d7e1f3c5b90f27c8806b7e23b8f WatchSource:0}: Error finding container 1feb42e11399512c477ddcd5c3f44784766a5d7e1f3c5b90f27c8806b7e23b8f: Status 404 returned error can't find the container with id 1feb42e11399512c477ddcd5c3f44784766a5d7e1f3c5b90f27c8806b7e23b8f Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.287130 5184 generic.go:358] "Generic (PLEG): container finished" podID="417740d6-e9c9-4fa8-9811-c6704b5b5692" containerID="26e6ed010b3a45f9e22b53a634f57881a78d7fcc479f407d448349cc1784ad10" exitCode=0 Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.287167 5184 generic.go:358] "Generic (PLEG): container finished" podID="417740d6-e9c9-4fa8-9811-c6704b5b5692" containerID="0decdc2957500fde0ebec04f2af81d8693c23a402cc1e67269ddab6d50d45e90" exitCode=0 Mar 12 17:01:38 crc 
kubenswrapper[5184]: I0312 17:01:38.287202 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-wqfhs" event={"ID":"417740d6-e9c9-4fa8-9811-c6704b5b5692","Type":"ContainerDied","Data":"26e6ed010b3a45f9e22b53a634f57881a78d7fcc479f407d448349cc1784ad10"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.287232 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-wqfhs" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.287298 5184 scope.go:117] "RemoveContainer" containerID="26e6ed010b3a45f9e22b53a634f57881a78d7fcc479f407d448349cc1784ad10" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.287278 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-wqfhs" event={"ID":"417740d6-e9c9-4fa8-9811-c6704b5b5692","Type":"ContainerDied","Data":"0decdc2957500fde0ebec04f2af81d8693c23a402cc1e67269ddab6d50d45e90"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.287480 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-wqfhs" event={"ID":"417740d6-e9c9-4fa8-9811-c6704b5b5692","Type":"ContainerDied","Data":"6d4c192358714a12a140a09e2a1717ce35ef7ba0d88bbbf965f3d237341e2cea"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.293304 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bpj2_a92c8326-e582-4692-8b35-c5d5dbc1ff6c/ovn-acl-logging/0.log" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.295237 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bpj2_a92c8326-e582-4692-8b35-c5d5dbc1ff6c/ovn-controller/0.log" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.296640 5184 generic.go:358] "Generic (PLEG): container finished" 
podID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" containerID="22829e8a929348996fb7656dc6de7050f92b0f8399db8f10683c6d95cc1d8c42" exitCode=0 Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.296688 5184 generic.go:358] "Generic (PLEG): container finished" podID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" containerID="1b21854960b562aaf97c4c6926185e03be4f730eeb15c267c9390b7742c5ab5f" exitCode=0 Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.296710 5184 generic.go:358] "Generic (PLEG): container finished" podID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" containerID="b5551614d52dd3e27aff5fc716fc02737aaff91dccc1b0d3189d3551acee14bd" exitCode=0 Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.296728 5184 generic.go:358] "Generic (PLEG): container finished" podID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" containerID="6434122f7c641858f5b536148b7b16ffbae6899c622187bdf0cf6950a6443cda" exitCode=0 Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.296745 5184 generic.go:358] "Generic (PLEG): container finished" podID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" containerID="137805551ee1701e7b11914be119573edd67e16a5d17f27e8db1afd4193c704c" exitCode=0 Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.296764 5184 generic.go:358] "Generic (PLEG): container finished" podID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" containerID="0842b1614675208f999dbcbfd017fda8250915476eccfdae1fb82b967f386042" exitCode=0 Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.296779 5184 generic.go:358] "Generic (PLEG): container finished" podID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" containerID="7f9bcef2ca3e2408e97837aedca5bd6c0be5c1f90a4ef1c715600d6c0e5e4efe" exitCode=143 Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.296796 5184 generic.go:358] "Generic (PLEG): container finished" podID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" containerID="bd6d5062f62d0471be109f59bfefc90111d17b18d4af16fedb005bd7ae2e6e40" exitCode=143 Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.296787 5184 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.296809 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" event={"ID":"a92c8326-e582-4692-8b35-c5d5dbc1ff6c","Type":"ContainerDied","Data":"22829e8a929348996fb7656dc6de7050f92b0f8399db8f10683c6d95cc1d8c42"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.296932 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" event={"ID":"a92c8326-e582-4692-8b35-c5d5dbc1ff6c","Type":"ContainerDied","Data":"1b21854960b562aaf97c4c6926185e03be4f730eeb15c267c9390b7742c5ab5f"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.296956 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" event={"ID":"a92c8326-e582-4692-8b35-c5d5dbc1ff6c","Type":"ContainerDied","Data":"b5551614d52dd3e27aff5fc716fc02737aaff91dccc1b0d3189d3551acee14bd"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.296977 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" event={"ID":"a92c8326-e582-4692-8b35-c5d5dbc1ff6c","Type":"ContainerDied","Data":"6434122f7c641858f5b536148b7b16ffbae6899c622187bdf0cf6950a6443cda"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.296996 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" event={"ID":"a92c8326-e582-4692-8b35-c5d5dbc1ff6c","Type":"ContainerDied","Data":"137805551ee1701e7b11914be119573edd67e16a5d17f27e8db1afd4193c704c"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.297024 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" 
event={"ID":"a92c8326-e582-4692-8b35-c5d5dbc1ff6c","Type":"ContainerDied","Data":"0842b1614675208f999dbcbfd017fda8250915476eccfdae1fb82b967f386042"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.297045 5184 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"22829e8a929348996fb7656dc6de7050f92b0f8399db8f10683c6d95cc1d8c42"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.297061 5184 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1b21854960b562aaf97c4c6926185e03be4f730eeb15c267c9390b7742c5ab5f"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.297074 5184 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b5551614d52dd3e27aff5fc716fc02737aaff91dccc1b0d3189d3551acee14bd"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.297084 5184 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6434122f7c641858f5b536148b7b16ffbae6899c622187bdf0cf6950a6443cda"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.297095 5184 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"137805551ee1701e7b11914be119573edd67e16a5d17f27e8db1afd4193c704c"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.297105 5184 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0842b1614675208f999dbcbfd017fda8250915476eccfdae1fb82b967f386042"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.297115 5184 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7f9bcef2ca3e2408e97837aedca5bd6c0be5c1f90a4ef1c715600d6c0e5e4efe"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.297125 5184 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"bd6d5062f62d0471be109f59bfefc90111d17b18d4af16fedb005bd7ae2e6e40"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.297134 5184 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a19a0220ad87c86b0c2e0d6f93d7b2e4b08fc8eb5904bcd2909fd4891bda4551"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.297148 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" event={"ID":"a92c8326-e582-4692-8b35-c5d5dbc1ff6c","Type":"ContainerDied","Data":"7f9bcef2ca3e2408e97837aedca5bd6c0be5c1f90a4ef1c715600d6c0e5e4efe"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.297163 5184 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"22829e8a929348996fb7656dc6de7050f92b0f8399db8f10683c6d95cc1d8c42"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.297175 5184 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1b21854960b562aaf97c4c6926185e03be4f730eeb15c267c9390b7742c5ab5f"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.297185 5184 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b5551614d52dd3e27aff5fc716fc02737aaff91dccc1b0d3189d3551acee14bd"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.297195 5184 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6434122f7c641858f5b536148b7b16ffbae6899c622187bdf0cf6950a6443cda"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.297205 5184 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"137805551ee1701e7b11914be119573edd67e16a5d17f27e8db1afd4193c704c"} Mar 12 17:01:38 crc kubenswrapper[5184]: 
I0312 17:01:38.297214 5184 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0842b1614675208f999dbcbfd017fda8250915476eccfdae1fb82b967f386042"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.297224 5184 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7f9bcef2ca3e2408e97837aedca5bd6c0be5c1f90a4ef1c715600d6c0e5e4efe"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.297234 5184 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bd6d5062f62d0471be109f59bfefc90111d17b18d4af16fedb005bd7ae2e6e40"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.297243 5184 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a19a0220ad87c86b0c2e0d6f93d7b2e4b08fc8eb5904bcd2909fd4891bda4551"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.297259 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" event={"ID":"a92c8326-e582-4692-8b35-c5d5dbc1ff6c","Type":"ContainerDied","Data":"bd6d5062f62d0471be109f59bfefc90111d17b18d4af16fedb005bd7ae2e6e40"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.297273 5184 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"22829e8a929348996fb7656dc6de7050f92b0f8399db8f10683c6d95cc1d8c42"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.297285 5184 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1b21854960b562aaf97c4c6926185e03be4f730eeb15c267c9390b7742c5ab5f"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.297294 5184 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"b5551614d52dd3e27aff5fc716fc02737aaff91dccc1b0d3189d3551acee14bd"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.297304 5184 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6434122f7c641858f5b536148b7b16ffbae6899c622187bdf0cf6950a6443cda"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.297314 5184 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"137805551ee1701e7b11914be119573edd67e16a5d17f27e8db1afd4193c704c"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.297323 5184 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0842b1614675208f999dbcbfd017fda8250915476eccfdae1fb82b967f386042"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.297332 5184 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7f9bcef2ca3e2408e97837aedca5bd6c0be5c1f90a4ef1c715600d6c0e5e4efe"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.297343 5184 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bd6d5062f62d0471be109f59bfefc90111d17b18d4af16fedb005bd7ae2e6e40"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.297352 5184 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a19a0220ad87c86b0c2e0d6f93d7b2e4b08fc8eb5904bcd2909fd4891bda4551"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.297365 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bpj2" event={"ID":"a92c8326-e582-4692-8b35-c5d5dbc1ff6c","Type":"ContainerDied","Data":"8a4ae9c85a6a7f907c79ddfdbd329c4856ddbf46112ed3682519102c877c78d1"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.297409 5184 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"22829e8a929348996fb7656dc6de7050f92b0f8399db8f10683c6d95cc1d8c42"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.297421 5184 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1b21854960b562aaf97c4c6926185e03be4f730eeb15c267c9390b7742c5ab5f"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.297430 5184 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b5551614d52dd3e27aff5fc716fc02737aaff91dccc1b0d3189d3551acee14bd"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.297440 5184 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6434122f7c641858f5b536148b7b16ffbae6899c622187bdf0cf6950a6443cda"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.297450 5184 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"137805551ee1701e7b11914be119573edd67e16a5d17f27e8db1afd4193c704c"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.297460 5184 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0842b1614675208f999dbcbfd017fda8250915476eccfdae1fb82b967f386042"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.297469 5184 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7f9bcef2ca3e2408e97837aedca5bd6c0be5c1f90a4ef1c715600d6c0e5e4efe"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.297478 5184 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bd6d5062f62d0471be109f59bfefc90111d17b18d4af16fedb005bd7ae2e6e40"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.297488 5184 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a19a0220ad87c86b0c2e0d6f93d7b2e4b08fc8eb5904bcd2909fd4891bda4551"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.301813 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-n9gf7" event={"ID":"79ba7bc0-9f82-4b02-9f50-f949c46f52a6","Type":"ContainerStarted","Data":"1feb42e11399512c477ddcd5c3f44784766a5d7e1f3c5b90f27c8806b7e23b8f"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.303802 5184 scope.go:117] "RemoveContainer" containerID="0decdc2957500fde0ebec04f2af81d8693c23a402cc1e67269ddab6d50d45e90" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.305264 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-99gtj_542903c2-fc88-4085-979a-db3766958392/kube-multus/0.log" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.305308 5184 generic.go:358] "Generic (PLEG): container finished" podID="542903c2-fc88-4085-979a-db3766958392" containerID="1c3df7e5ebfd17fac7029a70d11086adf8244115be119b9f83d90982ffede7fd" exitCode=2 Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.305439 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-99gtj" event={"ID":"542903c2-fc88-4085-979a-db3766958392","Type":"ContainerDied","Data":"1c3df7e5ebfd17fac7029a70d11086adf8244115be119b9f83d90982ffede7fd"} Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.306035 5184 scope.go:117] "RemoveContainer" containerID="1c3df7e5ebfd17fac7029a70d11086adf8244115be119b9f83d90982ffede7fd" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.326118 5184 scope.go:117] "RemoveContainer" containerID="26e6ed010b3a45f9e22b53a634f57881a78d7fcc479f407d448349cc1784ad10" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.326303 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-wqfhs"] Mar 12 17:01:38 crc kubenswrapper[5184]: E0312 17:01:38.329797 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26e6ed010b3a45f9e22b53a634f57881a78d7fcc479f407d448349cc1784ad10\": container with ID starting with 26e6ed010b3a45f9e22b53a634f57881a78d7fcc479f407d448349cc1784ad10 not found: ID does not exist" containerID="26e6ed010b3a45f9e22b53a634f57881a78d7fcc479f407d448349cc1784ad10" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.329834 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-wqfhs"] Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.329836 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26e6ed010b3a45f9e22b53a634f57881a78d7fcc479f407d448349cc1784ad10"} err="failed to get container status \"26e6ed010b3a45f9e22b53a634f57881a78d7fcc479f407d448349cc1784ad10\": rpc error: code = NotFound desc = could not find container \"26e6ed010b3a45f9e22b53a634f57881a78d7fcc479f407d448349cc1784ad10\": container with ID starting with 26e6ed010b3a45f9e22b53a634f57881a78d7fcc479f407d448349cc1784ad10 not found: ID does not exist" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.329881 5184 scope.go:117] "RemoveContainer" containerID="0decdc2957500fde0ebec04f2af81d8693c23a402cc1e67269ddab6d50d45e90" Mar 12 17:01:38 crc kubenswrapper[5184]: E0312 17:01:38.330921 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0decdc2957500fde0ebec04f2af81d8693c23a402cc1e67269ddab6d50d45e90\": container with ID starting with 0decdc2957500fde0ebec04f2af81d8693c23a402cc1e67269ddab6d50d45e90 not found: ID does not exist" containerID="0decdc2957500fde0ebec04f2af81d8693c23a402cc1e67269ddab6d50d45e90" Mar 12 17:01:38 crc 
kubenswrapper[5184]: I0312 17:01:38.330947 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0decdc2957500fde0ebec04f2af81d8693c23a402cc1e67269ddab6d50d45e90"} err="failed to get container status \"0decdc2957500fde0ebec04f2af81d8693c23a402cc1e67269ddab6d50d45e90\": rpc error: code = NotFound desc = could not find container \"0decdc2957500fde0ebec04f2af81d8693c23a402cc1e67269ddab6d50d45e90\": container with ID starting with 0decdc2957500fde0ebec04f2af81d8693c23a402cc1e67269ddab6d50d45e90 not found: ID does not exist" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.330964 5184 scope.go:117] "RemoveContainer" containerID="26e6ed010b3a45f9e22b53a634f57881a78d7fcc479f407d448349cc1784ad10" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.331357 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26e6ed010b3a45f9e22b53a634f57881a78d7fcc479f407d448349cc1784ad10"} err="failed to get container status \"26e6ed010b3a45f9e22b53a634f57881a78d7fcc479f407d448349cc1784ad10\": rpc error: code = NotFound desc = could not find container \"26e6ed010b3a45f9e22b53a634f57881a78d7fcc479f407d448349cc1784ad10\": container with ID starting with 26e6ed010b3a45f9e22b53a634f57881a78d7fcc479f407d448349cc1784ad10 not found: ID does not exist" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.331401 5184 scope.go:117] "RemoveContainer" containerID="0decdc2957500fde0ebec04f2af81d8693c23a402cc1e67269ddab6d50d45e90" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.331652 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0decdc2957500fde0ebec04f2af81d8693c23a402cc1e67269ddab6d50d45e90"} err="failed to get container status \"0decdc2957500fde0ebec04f2af81d8693c23a402cc1e67269ddab6d50d45e90\": rpc error: code = NotFound desc = could not find container \"0decdc2957500fde0ebec04f2af81d8693c23a402cc1e67269ddab6d50d45e90\": container 
with ID starting with 0decdc2957500fde0ebec04f2af81d8693c23a402cc1e67269ddab6d50d45e90 not found: ID does not exist" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.331677 5184 scope.go:117] "RemoveContainer" containerID="22829e8a929348996fb7656dc6de7050f92b0f8399db8f10683c6d95cc1d8c42" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.345587 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-var-lib-openvswitch\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.345628 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.345661 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-env-overrides\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.345682 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-host-kubelet\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.345693 5184 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-var-lib-openvswitch\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.345722 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-host-slash\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.345771 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-host-slash\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.345830 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.345906 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-systemd-units\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.345945 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-host-run-netns\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.345976 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-host-cni-netd\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.345996 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-ovnkube-config\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.346345 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-systemd-units\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.346403 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-host-kubelet\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.346423 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-run-ovn\") pod 
\"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.346469 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-log-socket\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.346499 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-run-systemd\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.346531 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-etc-openvswitch\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.346559 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-host-cni-bin\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.346615 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-host-run-ovn-kubernetes\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.346619 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-run-systemd\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.346646 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-ovn-node-metrics-cert\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.346685 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-log-socket\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.346689 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-run-openvswitch\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.346730 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-run-openvswitch\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 
17:01:38.346556 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-host-run-netns\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.346775 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-ovnkube-script-lib\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.346788 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-run-ovn\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.346852 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-host-run-ovn-kubernetes\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.346870 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-host-cni-bin\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.346886 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-etc-openvswitch\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.346915 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-env-overrides\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.346935 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-node-log\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.346962 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-node-log\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.347123 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bdqm2\" (UniqueName: \"kubernetes.io/projected/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-kube-api-access-bdqm2\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.347641 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-ovnkube-config\") 
pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.347994 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-host-cni-netd\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.348354 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-ovnkube-script-lib\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.351750 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-ovn-node-metrics-cert\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.361836 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-6bpj2"] Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.362012 5184 scope.go:117] "RemoveContainer" containerID="1b21854960b562aaf97c4c6926185e03be4f730eeb15c267c9390b7742c5ab5f" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.370517 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-6bpj2"] Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.370807 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdqm2\" (UniqueName: 
\"kubernetes.io/projected/6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588-kube-api-access-bdqm2\") pod \"ovnkube-node-jcj28\" (UID: \"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588\") " pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.379626 5184 scope.go:117] "RemoveContainer" containerID="b5551614d52dd3e27aff5fc716fc02737aaff91dccc1b0d3189d3551acee14bd" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.401072 5184 scope.go:117] "RemoveContainer" containerID="6434122f7c641858f5b536148b7b16ffbae6899c622187bdf0cf6950a6443cda" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.409919 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="417740d6-e9c9-4fa8-9811-c6704b5b5692" path="/var/lib/kubelet/pods/417740d6-e9c9-4fa8-9811-c6704b5b5692/volumes" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.410823 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a92c8326-e582-4692-8b35-c5d5dbc1ff6c" path="/var/lib/kubelet/pods/a92c8326-e582-4692-8b35-c5d5dbc1ff6c/volumes" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.431102 5184 scope.go:117] "RemoveContainer" containerID="137805551ee1701e7b11914be119573edd67e16a5d17f27e8db1afd4193c704c" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.436550 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.460395 5184 scope.go:117] "RemoveContainer" containerID="0842b1614675208f999dbcbfd017fda8250915476eccfdae1fb82b967f386042" Mar 12 17:01:38 crc kubenswrapper[5184]: W0312 17:01:38.471037 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e7aa7b1_1a5d_48ab_9ed4_5e545bff3588.slice/crio-f0bebe3ac78814dec43bdf01d29853c16189638185236296e099c0d06fecd338 WatchSource:0}: Error finding container f0bebe3ac78814dec43bdf01d29853c16189638185236296e099c0d06fecd338: Status 404 returned error can't find the container with id f0bebe3ac78814dec43bdf01d29853c16189638185236296e099c0d06fecd338 Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.477356 5184 scope.go:117] "RemoveContainer" containerID="7f9bcef2ca3e2408e97837aedca5bd6c0be5c1f90a4ef1c715600d6c0e5e4efe" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.495356 5184 scope.go:117] "RemoveContainer" containerID="bd6d5062f62d0471be109f59bfefc90111d17b18d4af16fedb005bd7ae2e6e40" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.520133 5184 scope.go:117] "RemoveContainer" containerID="a19a0220ad87c86b0c2e0d6f93d7b2e4b08fc8eb5904bcd2909fd4891bda4551" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.536944 5184 scope.go:117] "RemoveContainer" containerID="22829e8a929348996fb7656dc6de7050f92b0f8399db8f10683c6d95cc1d8c42" Mar 12 17:01:38 crc kubenswrapper[5184]: E0312 17:01:38.537298 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22829e8a929348996fb7656dc6de7050f92b0f8399db8f10683c6d95cc1d8c42\": container with ID starting with 22829e8a929348996fb7656dc6de7050f92b0f8399db8f10683c6d95cc1d8c42 not found: ID does not exist" containerID="22829e8a929348996fb7656dc6de7050f92b0f8399db8f10683c6d95cc1d8c42" Mar 12 17:01:38 crc 
kubenswrapper[5184]: I0312 17:01:38.537339 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22829e8a929348996fb7656dc6de7050f92b0f8399db8f10683c6d95cc1d8c42"} err="failed to get container status \"22829e8a929348996fb7656dc6de7050f92b0f8399db8f10683c6d95cc1d8c42\": rpc error: code = NotFound desc = could not find container \"22829e8a929348996fb7656dc6de7050f92b0f8399db8f10683c6d95cc1d8c42\": container with ID starting with 22829e8a929348996fb7656dc6de7050f92b0f8399db8f10683c6d95cc1d8c42 not found: ID does not exist" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.537359 5184 scope.go:117] "RemoveContainer" containerID="1b21854960b562aaf97c4c6926185e03be4f730eeb15c267c9390b7742c5ab5f" Mar 12 17:01:38 crc kubenswrapper[5184]: E0312 17:01:38.537941 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b21854960b562aaf97c4c6926185e03be4f730eeb15c267c9390b7742c5ab5f\": container with ID starting with 1b21854960b562aaf97c4c6926185e03be4f730eeb15c267c9390b7742c5ab5f not found: ID does not exist" containerID="1b21854960b562aaf97c4c6926185e03be4f730eeb15c267c9390b7742c5ab5f" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.537964 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b21854960b562aaf97c4c6926185e03be4f730eeb15c267c9390b7742c5ab5f"} err="failed to get container status \"1b21854960b562aaf97c4c6926185e03be4f730eeb15c267c9390b7742c5ab5f\": rpc error: code = NotFound desc = could not find container \"1b21854960b562aaf97c4c6926185e03be4f730eeb15c267c9390b7742c5ab5f\": container with ID starting with 1b21854960b562aaf97c4c6926185e03be4f730eeb15c267c9390b7742c5ab5f not found: ID does not exist" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.537978 5184 scope.go:117] "RemoveContainer" containerID="b5551614d52dd3e27aff5fc716fc02737aaff91dccc1b0d3189d3551acee14bd" Mar 12 
17:01:38 crc kubenswrapper[5184]: E0312 17:01:38.538425 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5551614d52dd3e27aff5fc716fc02737aaff91dccc1b0d3189d3551acee14bd\": container with ID starting with b5551614d52dd3e27aff5fc716fc02737aaff91dccc1b0d3189d3551acee14bd not found: ID does not exist" containerID="b5551614d52dd3e27aff5fc716fc02737aaff91dccc1b0d3189d3551acee14bd" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.538486 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5551614d52dd3e27aff5fc716fc02737aaff91dccc1b0d3189d3551acee14bd"} err="failed to get container status \"b5551614d52dd3e27aff5fc716fc02737aaff91dccc1b0d3189d3551acee14bd\": rpc error: code = NotFound desc = could not find container \"b5551614d52dd3e27aff5fc716fc02737aaff91dccc1b0d3189d3551acee14bd\": container with ID starting with b5551614d52dd3e27aff5fc716fc02737aaff91dccc1b0d3189d3551acee14bd not found: ID does not exist" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.538525 5184 scope.go:117] "RemoveContainer" containerID="6434122f7c641858f5b536148b7b16ffbae6899c622187bdf0cf6950a6443cda" Mar 12 17:01:38 crc kubenswrapper[5184]: E0312 17:01:38.539169 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6434122f7c641858f5b536148b7b16ffbae6899c622187bdf0cf6950a6443cda\": container with ID starting with 6434122f7c641858f5b536148b7b16ffbae6899c622187bdf0cf6950a6443cda not found: ID does not exist" containerID="6434122f7c641858f5b536148b7b16ffbae6899c622187bdf0cf6950a6443cda" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.539193 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6434122f7c641858f5b536148b7b16ffbae6899c622187bdf0cf6950a6443cda"} err="failed to get container status 
\"6434122f7c641858f5b536148b7b16ffbae6899c622187bdf0cf6950a6443cda\": rpc error: code = NotFound desc = could not find container \"6434122f7c641858f5b536148b7b16ffbae6899c622187bdf0cf6950a6443cda\": container with ID starting with 6434122f7c641858f5b536148b7b16ffbae6899c622187bdf0cf6950a6443cda not found: ID does not exist" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.539206 5184 scope.go:117] "RemoveContainer" containerID="137805551ee1701e7b11914be119573edd67e16a5d17f27e8db1afd4193c704c" Mar 12 17:01:38 crc kubenswrapper[5184]: E0312 17:01:38.540829 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"137805551ee1701e7b11914be119573edd67e16a5d17f27e8db1afd4193c704c\": container with ID starting with 137805551ee1701e7b11914be119573edd67e16a5d17f27e8db1afd4193c704c not found: ID does not exist" containerID="137805551ee1701e7b11914be119573edd67e16a5d17f27e8db1afd4193c704c" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.540858 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"137805551ee1701e7b11914be119573edd67e16a5d17f27e8db1afd4193c704c"} err="failed to get container status \"137805551ee1701e7b11914be119573edd67e16a5d17f27e8db1afd4193c704c\": rpc error: code = NotFound desc = could not find container \"137805551ee1701e7b11914be119573edd67e16a5d17f27e8db1afd4193c704c\": container with ID starting with 137805551ee1701e7b11914be119573edd67e16a5d17f27e8db1afd4193c704c not found: ID does not exist" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.540876 5184 scope.go:117] "RemoveContainer" containerID="0842b1614675208f999dbcbfd017fda8250915476eccfdae1fb82b967f386042" Mar 12 17:01:38 crc kubenswrapper[5184]: E0312 17:01:38.541115 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0842b1614675208f999dbcbfd017fda8250915476eccfdae1fb82b967f386042\": container with ID starting with 0842b1614675208f999dbcbfd017fda8250915476eccfdae1fb82b967f386042 not found: ID does not exist" containerID="0842b1614675208f999dbcbfd017fda8250915476eccfdae1fb82b967f386042" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.541140 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0842b1614675208f999dbcbfd017fda8250915476eccfdae1fb82b967f386042"} err="failed to get container status \"0842b1614675208f999dbcbfd017fda8250915476eccfdae1fb82b967f386042\": rpc error: code = NotFound desc = could not find container \"0842b1614675208f999dbcbfd017fda8250915476eccfdae1fb82b967f386042\": container with ID starting with 0842b1614675208f999dbcbfd017fda8250915476eccfdae1fb82b967f386042 not found: ID does not exist" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.541156 5184 scope.go:117] "RemoveContainer" containerID="7f9bcef2ca3e2408e97837aedca5bd6c0be5c1f90a4ef1c715600d6c0e5e4efe" Mar 12 17:01:38 crc kubenswrapper[5184]: E0312 17:01:38.541517 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f9bcef2ca3e2408e97837aedca5bd6c0be5c1f90a4ef1c715600d6c0e5e4efe\": container with ID starting with 7f9bcef2ca3e2408e97837aedca5bd6c0be5c1f90a4ef1c715600d6c0e5e4efe not found: ID does not exist" containerID="7f9bcef2ca3e2408e97837aedca5bd6c0be5c1f90a4ef1c715600d6c0e5e4efe" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.541540 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f9bcef2ca3e2408e97837aedca5bd6c0be5c1f90a4ef1c715600d6c0e5e4efe"} err="failed to get container status \"7f9bcef2ca3e2408e97837aedca5bd6c0be5c1f90a4ef1c715600d6c0e5e4efe\": rpc error: code = NotFound desc = could not find container \"7f9bcef2ca3e2408e97837aedca5bd6c0be5c1f90a4ef1c715600d6c0e5e4efe\": container with ID 
starting with 7f9bcef2ca3e2408e97837aedca5bd6c0be5c1f90a4ef1c715600d6c0e5e4efe not found: ID does not exist" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.541554 5184 scope.go:117] "RemoveContainer" containerID="bd6d5062f62d0471be109f59bfefc90111d17b18d4af16fedb005bd7ae2e6e40" Mar 12 17:01:38 crc kubenswrapper[5184]: E0312 17:01:38.541877 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd6d5062f62d0471be109f59bfefc90111d17b18d4af16fedb005bd7ae2e6e40\": container with ID starting with bd6d5062f62d0471be109f59bfefc90111d17b18d4af16fedb005bd7ae2e6e40 not found: ID does not exist" containerID="bd6d5062f62d0471be109f59bfefc90111d17b18d4af16fedb005bd7ae2e6e40" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.541894 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd6d5062f62d0471be109f59bfefc90111d17b18d4af16fedb005bd7ae2e6e40"} err="failed to get container status \"bd6d5062f62d0471be109f59bfefc90111d17b18d4af16fedb005bd7ae2e6e40\": rpc error: code = NotFound desc = could not find container \"bd6d5062f62d0471be109f59bfefc90111d17b18d4af16fedb005bd7ae2e6e40\": container with ID starting with bd6d5062f62d0471be109f59bfefc90111d17b18d4af16fedb005bd7ae2e6e40 not found: ID does not exist" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.541905 5184 scope.go:117] "RemoveContainer" containerID="a19a0220ad87c86b0c2e0d6f93d7b2e4b08fc8eb5904bcd2909fd4891bda4551" Mar 12 17:01:38 crc kubenswrapper[5184]: E0312 17:01:38.542152 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a19a0220ad87c86b0c2e0d6f93d7b2e4b08fc8eb5904bcd2909fd4891bda4551\": container with ID starting with a19a0220ad87c86b0c2e0d6f93d7b2e4b08fc8eb5904bcd2909fd4891bda4551 not found: ID does not exist" containerID="a19a0220ad87c86b0c2e0d6f93d7b2e4b08fc8eb5904bcd2909fd4891bda4551" Mar 12 
17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.542179 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a19a0220ad87c86b0c2e0d6f93d7b2e4b08fc8eb5904bcd2909fd4891bda4551"} err="failed to get container status \"a19a0220ad87c86b0c2e0d6f93d7b2e4b08fc8eb5904bcd2909fd4891bda4551\": rpc error: code = NotFound desc = could not find container \"a19a0220ad87c86b0c2e0d6f93d7b2e4b08fc8eb5904bcd2909fd4891bda4551\": container with ID starting with a19a0220ad87c86b0c2e0d6f93d7b2e4b08fc8eb5904bcd2909fd4891bda4551 not found: ID does not exist" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.542195 5184 scope.go:117] "RemoveContainer" containerID="22829e8a929348996fb7656dc6de7050f92b0f8399db8f10683c6d95cc1d8c42" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.542403 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22829e8a929348996fb7656dc6de7050f92b0f8399db8f10683c6d95cc1d8c42"} err="failed to get container status \"22829e8a929348996fb7656dc6de7050f92b0f8399db8f10683c6d95cc1d8c42\": rpc error: code = NotFound desc = could not find container \"22829e8a929348996fb7656dc6de7050f92b0f8399db8f10683c6d95cc1d8c42\": container with ID starting with 22829e8a929348996fb7656dc6de7050f92b0f8399db8f10683c6d95cc1d8c42 not found: ID does not exist" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.542428 5184 scope.go:117] "RemoveContainer" containerID="1b21854960b562aaf97c4c6926185e03be4f730eeb15c267c9390b7742c5ab5f" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.542611 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b21854960b562aaf97c4c6926185e03be4f730eeb15c267c9390b7742c5ab5f"} err="failed to get container status \"1b21854960b562aaf97c4c6926185e03be4f730eeb15c267c9390b7742c5ab5f\": rpc error: code = NotFound desc = could not find container 
\"1b21854960b562aaf97c4c6926185e03be4f730eeb15c267c9390b7742c5ab5f\": container with ID starting with 1b21854960b562aaf97c4c6926185e03be4f730eeb15c267c9390b7742c5ab5f not found: ID does not exist" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.542625 5184 scope.go:117] "RemoveContainer" containerID="b5551614d52dd3e27aff5fc716fc02737aaff91dccc1b0d3189d3551acee14bd" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.542811 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5551614d52dd3e27aff5fc716fc02737aaff91dccc1b0d3189d3551acee14bd"} err="failed to get container status \"b5551614d52dd3e27aff5fc716fc02737aaff91dccc1b0d3189d3551acee14bd\": rpc error: code = NotFound desc = could not find container \"b5551614d52dd3e27aff5fc716fc02737aaff91dccc1b0d3189d3551acee14bd\": container with ID starting with b5551614d52dd3e27aff5fc716fc02737aaff91dccc1b0d3189d3551acee14bd not found: ID does not exist" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.542831 5184 scope.go:117] "RemoveContainer" containerID="6434122f7c641858f5b536148b7b16ffbae6899c622187bdf0cf6950a6443cda" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.543058 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6434122f7c641858f5b536148b7b16ffbae6899c622187bdf0cf6950a6443cda"} err="failed to get container status \"6434122f7c641858f5b536148b7b16ffbae6899c622187bdf0cf6950a6443cda\": rpc error: code = NotFound desc = could not find container \"6434122f7c641858f5b536148b7b16ffbae6899c622187bdf0cf6950a6443cda\": container with ID starting with 6434122f7c641858f5b536148b7b16ffbae6899c622187bdf0cf6950a6443cda not found: ID does not exist" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.543071 5184 scope.go:117] "RemoveContainer" containerID="137805551ee1701e7b11914be119573edd67e16a5d17f27e8db1afd4193c704c" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.543308 5184 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"137805551ee1701e7b11914be119573edd67e16a5d17f27e8db1afd4193c704c"} err="failed to get container status \"137805551ee1701e7b11914be119573edd67e16a5d17f27e8db1afd4193c704c\": rpc error: code = NotFound desc = could not find container \"137805551ee1701e7b11914be119573edd67e16a5d17f27e8db1afd4193c704c\": container with ID starting with 137805551ee1701e7b11914be119573edd67e16a5d17f27e8db1afd4193c704c not found: ID does not exist" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.543328 5184 scope.go:117] "RemoveContainer" containerID="0842b1614675208f999dbcbfd017fda8250915476eccfdae1fb82b967f386042" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.543772 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0842b1614675208f999dbcbfd017fda8250915476eccfdae1fb82b967f386042"} err="failed to get container status \"0842b1614675208f999dbcbfd017fda8250915476eccfdae1fb82b967f386042\": rpc error: code = NotFound desc = could not find container \"0842b1614675208f999dbcbfd017fda8250915476eccfdae1fb82b967f386042\": container with ID starting with 0842b1614675208f999dbcbfd017fda8250915476eccfdae1fb82b967f386042 not found: ID does not exist" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.543789 5184 scope.go:117] "RemoveContainer" containerID="7f9bcef2ca3e2408e97837aedca5bd6c0be5c1f90a4ef1c715600d6c0e5e4efe" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.543986 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f9bcef2ca3e2408e97837aedca5bd6c0be5c1f90a4ef1c715600d6c0e5e4efe"} err="failed to get container status \"7f9bcef2ca3e2408e97837aedca5bd6c0be5c1f90a4ef1c715600d6c0e5e4efe\": rpc error: code = NotFound desc = could not find container \"7f9bcef2ca3e2408e97837aedca5bd6c0be5c1f90a4ef1c715600d6c0e5e4efe\": container with ID starting with 
7f9bcef2ca3e2408e97837aedca5bd6c0be5c1f90a4ef1c715600d6c0e5e4efe not found: ID does not exist" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.544008 5184 scope.go:117] "RemoveContainer" containerID="bd6d5062f62d0471be109f59bfefc90111d17b18d4af16fedb005bd7ae2e6e40" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.544189 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd6d5062f62d0471be109f59bfefc90111d17b18d4af16fedb005bd7ae2e6e40"} err="failed to get container status \"bd6d5062f62d0471be109f59bfefc90111d17b18d4af16fedb005bd7ae2e6e40\": rpc error: code = NotFound desc = could not find container \"bd6d5062f62d0471be109f59bfefc90111d17b18d4af16fedb005bd7ae2e6e40\": container with ID starting with bd6d5062f62d0471be109f59bfefc90111d17b18d4af16fedb005bd7ae2e6e40 not found: ID does not exist" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.544201 5184 scope.go:117] "RemoveContainer" containerID="a19a0220ad87c86b0c2e0d6f93d7b2e4b08fc8eb5904bcd2909fd4891bda4551" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.544424 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a19a0220ad87c86b0c2e0d6f93d7b2e4b08fc8eb5904bcd2909fd4891bda4551"} err="failed to get container status \"a19a0220ad87c86b0c2e0d6f93d7b2e4b08fc8eb5904bcd2909fd4891bda4551\": rpc error: code = NotFound desc = could not find container \"a19a0220ad87c86b0c2e0d6f93d7b2e4b08fc8eb5904bcd2909fd4891bda4551\": container with ID starting with a19a0220ad87c86b0c2e0d6f93d7b2e4b08fc8eb5904bcd2909fd4891bda4551 not found: ID does not exist" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.544446 5184 scope.go:117] "RemoveContainer" containerID="22829e8a929348996fb7656dc6de7050f92b0f8399db8f10683c6d95cc1d8c42" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.544612 5184 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"22829e8a929348996fb7656dc6de7050f92b0f8399db8f10683c6d95cc1d8c42"} err="failed to get container status \"22829e8a929348996fb7656dc6de7050f92b0f8399db8f10683c6d95cc1d8c42\": rpc error: code = NotFound desc = could not find container \"22829e8a929348996fb7656dc6de7050f92b0f8399db8f10683c6d95cc1d8c42\": container with ID starting with 22829e8a929348996fb7656dc6de7050f92b0f8399db8f10683c6d95cc1d8c42 not found: ID does not exist" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.544623 5184 scope.go:117] "RemoveContainer" containerID="1b21854960b562aaf97c4c6926185e03be4f730eeb15c267c9390b7742c5ab5f" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.544775 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b21854960b562aaf97c4c6926185e03be4f730eeb15c267c9390b7742c5ab5f"} err="failed to get container status \"1b21854960b562aaf97c4c6926185e03be4f730eeb15c267c9390b7742c5ab5f\": rpc error: code = NotFound desc = could not find container \"1b21854960b562aaf97c4c6926185e03be4f730eeb15c267c9390b7742c5ab5f\": container with ID starting with 1b21854960b562aaf97c4c6926185e03be4f730eeb15c267c9390b7742c5ab5f not found: ID does not exist" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.544787 5184 scope.go:117] "RemoveContainer" containerID="b5551614d52dd3e27aff5fc716fc02737aaff91dccc1b0d3189d3551acee14bd" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.544939 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5551614d52dd3e27aff5fc716fc02737aaff91dccc1b0d3189d3551acee14bd"} err="failed to get container status \"b5551614d52dd3e27aff5fc716fc02737aaff91dccc1b0d3189d3551acee14bd\": rpc error: code = NotFound desc = could not find container \"b5551614d52dd3e27aff5fc716fc02737aaff91dccc1b0d3189d3551acee14bd\": container with ID starting with b5551614d52dd3e27aff5fc716fc02737aaff91dccc1b0d3189d3551acee14bd not found: ID does not 
exist" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.544950 5184 scope.go:117] "RemoveContainer" containerID="6434122f7c641858f5b536148b7b16ffbae6899c622187bdf0cf6950a6443cda" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.545585 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6434122f7c641858f5b536148b7b16ffbae6899c622187bdf0cf6950a6443cda"} err="failed to get container status \"6434122f7c641858f5b536148b7b16ffbae6899c622187bdf0cf6950a6443cda\": rpc error: code = NotFound desc = could not find container \"6434122f7c641858f5b536148b7b16ffbae6899c622187bdf0cf6950a6443cda\": container with ID starting with 6434122f7c641858f5b536148b7b16ffbae6899c622187bdf0cf6950a6443cda not found: ID does not exist" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.545681 5184 scope.go:117] "RemoveContainer" containerID="137805551ee1701e7b11914be119573edd67e16a5d17f27e8db1afd4193c704c" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.546277 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"137805551ee1701e7b11914be119573edd67e16a5d17f27e8db1afd4193c704c"} err="failed to get container status \"137805551ee1701e7b11914be119573edd67e16a5d17f27e8db1afd4193c704c\": rpc error: code = NotFound desc = could not find container \"137805551ee1701e7b11914be119573edd67e16a5d17f27e8db1afd4193c704c\": container with ID starting with 137805551ee1701e7b11914be119573edd67e16a5d17f27e8db1afd4193c704c not found: ID does not exist" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.546305 5184 scope.go:117] "RemoveContainer" containerID="0842b1614675208f999dbcbfd017fda8250915476eccfdae1fb82b967f386042" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.551041 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0842b1614675208f999dbcbfd017fda8250915476eccfdae1fb82b967f386042"} err="failed to get container status 
\"0842b1614675208f999dbcbfd017fda8250915476eccfdae1fb82b967f386042\": rpc error: code = NotFound desc = could not find container \"0842b1614675208f999dbcbfd017fda8250915476eccfdae1fb82b967f386042\": container with ID starting with 0842b1614675208f999dbcbfd017fda8250915476eccfdae1fb82b967f386042 not found: ID does not exist" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.551080 5184 scope.go:117] "RemoveContainer" containerID="7f9bcef2ca3e2408e97837aedca5bd6c0be5c1f90a4ef1c715600d6c0e5e4efe" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.551442 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f9bcef2ca3e2408e97837aedca5bd6c0be5c1f90a4ef1c715600d6c0e5e4efe"} err="failed to get container status \"7f9bcef2ca3e2408e97837aedca5bd6c0be5c1f90a4ef1c715600d6c0e5e4efe\": rpc error: code = NotFound desc = could not find container \"7f9bcef2ca3e2408e97837aedca5bd6c0be5c1f90a4ef1c715600d6c0e5e4efe\": container with ID starting with 7f9bcef2ca3e2408e97837aedca5bd6c0be5c1f90a4ef1c715600d6c0e5e4efe not found: ID does not exist" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.551465 5184 scope.go:117] "RemoveContainer" containerID="bd6d5062f62d0471be109f59bfefc90111d17b18d4af16fedb005bd7ae2e6e40" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.551771 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd6d5062f62d0471be109f59bfefc90111d17b18d4af16fedb005bd7ae2e6e40"} err="failed to get container status \"bd6d5062f62d0471be109f59bfefc90111d17b18d4af16fedb005bd7ae2e6e40\": rpc error: code = NotFound desc = could not find container \"bd6d5062f62d0471be109f59bfefc90111d17b18d4af16fedb005bd7ae2e6e40\": container with ID starting with bd6d5062f62d0471be109f59bfefc90111d17b18d4af16fedb005bd7ae2e6e40 not found: ID does not exist" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.551796 5184 scope.go:117] "RemoveContainer" 
containerID="a19a0220ad87c86b0c2e0d6f93d7b2e4b08fc8eb5904bcd2909fd4891bda4551" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.552252 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a19a0220ad87c86b0c2e0d6f93d7b2e4b08fc8eb5904bcd2909fd4891bda4551"} err="failed to get container status \"a19a0220ad87c86b0c2e0d6f93d7b2e4b08fc8eb5904bcd2909fd4891bda4551\": rpc error: code = NotFound desc = could not find container \"a19a0220ad87c86b0c2e0d6f93d7b2e4b08fc8eb5904bcd2909fd4891bda4551\": container with ID starting with a19a0220ad87c86b0c2e0d6f93d7b2e4b08fc8eb5904bcd2909fd4891bda4551 not found: ID does not exist" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.552349 5184 scope.go:117] "RemoveContainer" containerID="22829e8a929348996fb7656dc6de7050f92b0f8399db8f10683c6d95cc1d8c42" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.552920 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22829e8a929348996fb7656dc6de7050f92b0f8399db8f10683c6d95cc1d8c42"} err="failed to get container status \"22829e8a929348996fb7656dc6de7050f92b0f8399db8f10683c6d95cc1d8c42\": rpc error: code = NotFound desc = could not find container \"22829e8a929348996fb7656dc6de7050f92b0f8399db8f10683c6d95cc1d8c42\": container with ID starting with 22829e8a929348996fb7656dc6de7050f92b0f8399db8f10683c6d95cc1d8c42 not found: ID does not exist" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.552949 5184 scope.go:117] "RemoveContainer" containerID="1b21854960b562aaf97c4c6926185e03be4f730eeb15c267c9390b7742c5ab5f" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.553418 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b21854960b562aaf97c4c6926185e03be4f730eeb15c267c9390b7742c5ab5f"} err="failed to get container status \"1b21854960b562aaf97c4c6926185e03be4f730eeb15c267c9390b7742c5ab5f\": rpc error: code = NotFound desc = could 
not find container \"1b21854960b562aaf97c4c6926185e03be4f730eeb15c267c9390b7742c5ab5f\": container with ID starting with 1b21854960b562aaf97c4c6926185e03be4f730eeb15c267c9390b7742c5ab5f not found: ID does not exist" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.553449 5184 scope.go:117] "RemoveContainer" containerID="b5551614d52dd3e27aff5fc716fc02737aaff91dccc1b0d3189d3551acee14bd" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.553999 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5551614d52dd3e27aff5fc716fc02737aaff91dccc1b0d3189d3551acee14bd"} err="failed to get container status \"b5551614d52dd3e27aff5fc716fc02737aaff91dccc1b0d3189d3551acee14bd\": rpc error: code = NotFound desc = could not find container \"b5551614d52dd3e27aff5fc716fc02737aaff91dccc1b0d3189d3551acee14bd\": container with ID starting with b5551614d52dd3e27aff5fc716fc02737aaff91dccc1b0d3189d3551acee14bd not found: ID does not exist" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.554038 5184 scope.go:117] "RemoveContainer" containerID="6434122f7c641858f5b536148b7b16ffbae6899c622187bdf0cf6950a6443cda" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.554781 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6434122f7c641858f5b536148b7b16ffbae6899c622187bdf0cf6950a6443cda"} err="failed to get container status \"6434122f7c641858f5b536148b7b16ffbae6899c622187bdf0cf6950a6443cda\": rpc error: code = NotFound desc = could not find container \"6434122f7c641858f5b536148b7b16ffbae6899c622187bdf0cf6950a6443cda\": container with ID starting with 6434122f7c641858f5b536148b7b16ffbae6899c622187bdf0cf6950a6443cda not found: ID does not exist" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.554808 5184 scope.go:117] "RemoveContainer" containerID="137805551ee1701e7b11914be119573edd67e16a5d17f27e8db1afd4193c704c" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 
17:01:38.555641 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"137805551ee1701e7b11914be119573edd67e16a5d17f27e8db1afd4193c704c"} err="failed to get container status \"137805551ee1701e7b11914be119573edd67e16a5d17f27e8db1afd4193c704c\": rpc error: code = NotFound desc = could not find container \"137805551ee1701e7b11914be119573edd67e16a5d17f27e8db1afd4193c704c\": container with ID starting with 137805551ee1701e7b11914be119573edd67e16a5d17f27e8db1afd4193c704c not found: ID does not exist" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.555666 5184 scope.go:117] "RemoveContainer" containerID="0842b1614675208f999dbcbfd017fda8250915476eccfdae1fb82b967f386042" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.556227 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0842b1614675208f999dbcbfd017fda8250915476eccfdae1fb82b967f386042"} err="failed to get container status \"0842b1614675208f999dbcbfd017fda8250915476eccfdae1fb82b967f386042\": rpc error: code = NotFound desc = could not find container \"0842b1614675208f999dbcbfd017fda8250915476eccfdae1fb82b967f386042\": container with ID starting with 0842b1614675208f999dbcbfd017fda8250915476eccfdae1fb82b967f386042 not found: ID does not exist" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.556267 5184 scope.go:117] "RemoveContainer" containerID="7f9bcef2ca3e2408e97837aedca5bd6c0be5c1f90a4ef1c715600d6c0e5e4efe" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.556910 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f9bcef2ca3e2408e97837aedca5bd6c0be5c1f90a4ef1c715600d6c0e5e4efe"} err="failed to get container status \"7f9bcef2ca3e2408e97837aedca5bd6c0be5c1f90a4ef1c715600d6c0e5e4efe\": rpc error: code = NotFound desc = could not find container \"7f9bcef2ca3e2408e97837aedca5bd6c0be5c1f90a4ef1c715600d6c0e5e4efe\": container with ID starting with 
7f9bcef2ca3e2408e97837aedca5bd6c0be5c1f90a4ef1c715600d6c0e5e4efe not found: ID does not exist" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.556953 5184 scope.go:117] "RemoveContainer" containerID="bd6d5062f62d0471be109f59bfefc90111d17b18d4af16fedb005bd7ae2e6e40" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.557423 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd6d5062f62d0471be109f59bfefc90111d17b18d4af16fedb005bd7ae2e6e40"} err="failed to get container status \"bd6d5062f62d0471be109f59bfefc90111d17b18d4af16fedb005bd7ae2e6e40\": rpc error: code = NotFound desc = could not find container \"bd6d5062f62d0471be109f59bfefc90111d17b18d4af16fedb005bd7ae2e6e40\": container with ID starting with bd6d5062f62d0471be109f59bfefc90111d17b18d4af16fedb005bd7ae2e6e40 not found: ID does not exist" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.557448 5184 scope.go:117] "RemoveContainer" containerID="a19a0220ad87c86b0c2e0d6f93d7b2e4b08fc8eb5904bcd2909fd4891bda4551" Mar 12 17:01:38 crc kubenswrapper[5184]: I0312 17:01:38.557796 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a19a0220ad87c86b0c2e0d6f93d7b2e4b08fc8eb5904bcd2909fd4891bda4551"} err="failed to get container status \"a19a0220ad87c86b0c2e0d6f93d7b2e4b08fc8eb5904bcd2909fd4891bda4551\": rpc error: code = NotFound desc = could not find container \"a19a0220ad87c86b0c2e0d6f93d7b2e4b08fc8eb5904bcd2909fd4891bda4551\": container with ID starting with a19a0220ad87c86b0c2e0d6f93d7b2e4b08fc8eb5904bcd2909fd4891bda4551 not found: ID does not exist" Mar 12 17:01:39 crc kubenswrapper[5184]: I0312 17:01:39.319110 5184 generic.go:358] "Generic (PLEG): container finished" podID="6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588" containerID="0c2de20ede527d030e980b33cd5d5efb25bf62c96f26e844c9a6e14664e54237" exitCode=0 Mar 12 17:01:39 crc kubenswrapper[5184]: I0312 17:01:39.319223 5184 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" event={"ID":"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588","Type":"ContainerDied","Data":"0c2de20ede527d030e980b33cd5d5efb25bf62c96f26e844c9a6e14664e54237"} Mar 12 17:01:39 crc kubenswrapper[5184]: I0312 17:01:39.319274 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" event={"ID":"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588","Type":"ContainerStarted","Data":"f0bebe3ac78814dec43bdf01d29853c16189638185236296e099c0d06fecd338"} Mar 12 17:01:39 crc kubenswrapper[5184]: I0312 17:01:39.330262 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-n9gf7" event={"ID":"79ba7bc0-9f82-4b02-9f50-f949c46f52a6","Type":"ContainerStarted","Data":"37d83fdfe1ee7c50e9a6cb9f83a7fca926ff5726a2bb608a4f358c2f156c8f0e"} Mar 12 17:01:39 crc kubenswrapper[5184]: I0312 17:01:39.330327 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-n9gf7" event={"ID":"79ba7bc0-9f82-4b02-9f50-f949c46f52a6","Type":"ContainerStarted","Data":"fc1f1874992a134e7212f289b50957db0447965edfa00c27b58fb16997bd35b3"} Mar 12 17:01:39 crc kubenswrapper[5184]: I0312 17:01:39.338339 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-99gtj_542903c2-fc88-4085-979a-db3766958392/kube-multus/0.log" Mar 12 17:01:39 crc kubenswrapper[5184]: I0312 17:01:39.338424 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-99gtj" event={"ID":"542903c2-fc88-4085-979a-db3766958392","Type":"ContainerStarted","Data":"99435138c107a44be2a37a4680cfc8abfb6adfa8c538a4feee9b7832f9c3c4d1"} Mar 12 17:01:39 crc kubenswrapper[5184]: I0312 17:01:39.426988 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-n9gf7" podStartSLOduration=2.426955285 
podStartE2EDuration="2.426955285s" podCreationTimestamp="2026-03-12 17:01:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:01:39.422629627 +0000 UTC m=+641.963940996" watchObservedRunningTime="2026-03-12 17:01:39.426955285 +0000 UTC m=+641.968266654" Mar 12 17:01:40 crc kubenswrapper[5184]: I0312 17:01:40.349071 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" event={"ID":"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588","Type":"ContainerStarted","Data":"52fa0d6bb9da5179c95377523bb400f5e686a69220aa7f0e6eabc5de8e71e811"} Mar 12 17:01:40 crc kubenswrapper[5184]: I0312 17:01:40.349537 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" event={"ID":"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588","Type":"ContainerStarted","Data":"ccd81d21e431395db408ab10a5eaff99235a8fc186746615f2ab01ef24a221c9"} Mar 12 17:01:40 crc kubenswrapper[5184]: I0312 17:01:40.349563 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" event={"ID":"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588","Type":"ContainerStarted","Data":"1294b4e04d58d28914c55627e2e7d9c3de62f46308946cd691f04a2970de7bd8"} Mar 12 17:01:40 crc kubenswrapper[5184]: I0312 17:01:40.349584 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" event={"ID":"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588","Type":"ContainerStarted","Data":"6a5a9add2927f2e61e50d0f96f51b6ac8aa8b72ed8261548bbaecfd6cd058d6a"} Mar 12 17:01:40 crc kubenswrapper[5184]: I0312 17:01:40.349603 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" event={"ID":"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588","Type":"ContainerStarted","Data":"bd96d82ce7b8dc9881c10c77dbcaac2c44a345594ac684a6b6b303bbf2740a1b"} Mar 12 17:01:40 crc kubenswrapper[5184]: 
I0312 17:01:40.349620 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" event={"ID":"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588","Type":"ContainerStarted","Data":"06030a0e9f88aef9b677a210f3300d143bc9b3940004ebb2d454ba7fff990683"} Mar 12 17:01:43 crc kubenswrapper[5184]: I0312 17:01:43.377490 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" event={"ID":"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588","Type":"ContainerStarted","Data":"42737e1ac878bcacb579547bc5aa39204531f6df53e502ce30a1777f5bdb4bac"} Mar 12 17:01:45 crc kubenswrapper[5184]: I0312 17:01:45.409722 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" event={"ID":"6e7aa7b1-1a5d-48ab-9ed4-5e545bff3588","Type":"ContainerStarted","Data":"c52b1bd9f531c02cb324c008c1af6ab198263ea09646437c6e8acd7ba2425080"} Mar 12 17:01:45 crc kubenswrapper[5184]: I0312 17:01:45.410367 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:45 crc kubenswrapper[5184]: I0312 17:01:45.410430 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:45 crc kubenswrapper[5184]: I0312 17:01:45.410448 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:45 crc kubenswrapper[5184]: I0312 17:01:45.462027 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:45 crc kubenswrapper[5184]: I0312 17:01:45.474479 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:01:45 crc kubenswrapper[5184]: I0312 17:01:45.474884 5184 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" podStartSLOduration=7.474857806 podStartE2EDuration="7.474857806s" podCreationTimestamp="2026-03-12 17:01:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:01:45.468066899 +0000 UTC m=+648.009378288" watchObservedRunningTime="2026-03-12 17:01:45.474857806 +0000 UTC m=+648.016169165" Mar 12 17:01:50 crc kubenswrapper[5184]: I0312 17:01:50.224322 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5d9d95bf5b-47lwd" Mar 12 17:01:50 crc kubenswrapper[5184]: I0312 17:01:50.317292 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-66587d64c8-dlsx9"] Mar 12 17:01:50 crc kubenswrapper[5184]: I0312 17:01:50.742680 5184 patch_prober.go:28] interesting pod/machine-config-daemon-cp7pt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 17:01:50 crc kubenswrapper[5184]: I0312 17:01:50.743070 5184 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 17:02:00 crc kubenswrapper[5184]: I0312 17:02:00.133421 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555582-vm6nb"] Mar 12 17:02:00 crc kubenswrapper[5184]: I0312 17:02:00.150581 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555582-vm6nb"] Mar 12 17:02:00 crc kubenswrapper[5184]: I0312 17:02:00.150730 5184 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555582-vm6nb" Mar 12 17:02:00 crc kubenswrapper[5184]: I0312 17:02:00.155565 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\"" Mar 12 17:02:00 crc kubenswrapper[5184]: I0312 17:02:00.157416 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-f4gpz\"" Mar 12 17:02:00 crc kubenswrapper[5184]: I0312 17:02:00.157513 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\"" Mar 12 17:02:00 crc kubenswrapper[5184]: I0312 17:02:00.286445 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmd2d\" (UniqueName: \"kubernetes.io/projected/92eba052-9e58-4d2a-a414-529b609345da-kube-api-access-qmd2d\") pod \"auto-csr-approver-29555582-vm6nb\" (UID: \"92eba052-9e58-4d2a-a414-529b609345da\") " pod="openshift-infra/auto-csr-approver-29555582-vm6nb" Mar 12 17:02:00 crc kubenswrapper[5184]: I0312 17:02:00.388519 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qmd2d\" (UniqueName: \"kubernetes.io/projected/92eba052-9e58-4d2a-a414-529b609345da-kube-api-access-qmd2d\") pod \"auto-csr-approver-29555582-vm6nb\" (UID: \"92eba052-9e58-4d2a-a414-529b609345da\") " pod="openshift-infra/auto-csr-approver-29555582-vm6nb" Mar 12 17:02:00 crc kubenswrapper[5184]: I0312 17:02:00.438769 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmd2d\" (UniqueName: \"kubernetes.io/projected/92eba052-9e58-4d2a-a414-529b609345da-kube-api-access-qmd2d\") pod \"auto-csr-approver-29555582-vm6nb\" (UID: \"92eba052-9e58-4d2a-a414-529b609345da\") " pod="openshift-infra/auto-csr-approver-29555582-vm6nb" Mar 12 17:02:00 crc 
kubenswrapper[5184]: I0312 17:02:00.470839 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555582-vm6nb" Mar 12 17:02:00 crc kubenswrapper[5184]: I0312 17:02:00.962398 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555582-vm6nb"] Mar 12 17:02:01 crc kubenswrapper[5184]: I0312 17:02:01.559267 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555582-vm6nb" event={"ID":"92eba052-9e58-4d2a-a414-529b609345da","Type":"ContainerStarted","Data":"82c7336be46c2d504d225dd85919751c2dc398a46d5a4669ee2c65064f659b1c"} Mar 12 17:02:02 crc kubenswrapper[5184]: I0312 17:02:02.570862 5184 generic.go:358] "Generic (PLEG): container finished" podID="92eba052-9e58-4d2a-a414-529b609345da" containerID="af482bb2174d12ea7babb868743ac582cb8e3bc181395e34bde3fe0d1712213f" exitCode=0 Mar 12 17:02:02 crc kubenswrapper[5184]: I0312 17:02:02.571018 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555582-vm6nb" event={"ID":"92eba052-9e58-4d2a-a414-529b609345da","Type":"ContainerDied","Data":"af482bb2174d12ea7babb868743ac582cb8e3bc181395e34bde3fe0d1712213f"} Mar 12 17:02:03 crc kubenswrapper[5184]: I0312 17:02:03.859210 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555582-vm6nb" Mar 12 17:02:03 crc kubenswrapper[5184]: I0312 17:02:03.946778 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmd2d\" (UniqueName: \"kubernetes.io/projected/92eba052-9e58-4d2a-a414-529b609345da-kube-api-access-qmd2d\") pod \"92eba052-9e58-4d2a-a414-529b609345da\" (UID: \"92eba052-9e58-4d2a-a414-529b609345da\") " Mar 12 17:02:03 crc kubenswrapper[5184]: I0312 17:02:03.967682 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92eba052-9e58-4d2a-a414-529b609345da-kube-api-access-qmd2d" (OuterVolumeSpecName: "kube-api-access-qmd2d") pod "92eba052-9e58-4d2a-a414-529b609345da" (UID: "92eba052-9e58-4d2a-a414-529b609345da"). InnerVolumeSpecName "kube-api-access-qmd2d". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:02:04 crc kubenswrapper[5184]: I0312 17:02:04.049207 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qmd2d\" (UniqueName: \"kubernetes.io/projected/92eba052-9e58-4d2a-a414-529b609345da-kube-api-access-qmd2d\") on node \"crc\" DevicePath \"\"" Mar 12 17:02:04 crc kubenswrapper[5184]: I0312 17:02:04.591530 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555582-vm6nb" event={"ID":"92eba052-9e58-4d2a-a414-529b609345da","Type":"ContainerDied","Data":"82c7336be46c2d504d225dd85919751c2dc398a46d5a4669ee2c65064f659b1c"} Mar 12 17:02:04 crc kubenswrapper[5184]: I0312 17:02:04.591881 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82c7336be46c2d504d225dd85919751c2dc398a46d5a4669ee2c65064f659b1c" Mar 12 17:02:04 crc kubenswrapper[5184]: I0312 17:02:04.591542 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555582-vm6nb" Mar 12 17:02:04 crc kubenswrapper[5184]: I0312 17:02:04.938784 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555576-8pcks"] Mar 12 17:02:04 crc kubenswrapper[5184]: I0312 17:02:04.946157 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555576-8pcks"] Mar 12 17:02:06 crc kubenswrapper[5184]: I0312 17:02:06.412048 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4b9baef-c7b2-4789-a606-0b76f4a575c4" path="/var/lib/kubelet/pods/b4b9baef-c7b2-4789-a606-0b76f4a575c4/volumes" Mar 12 17:02:15 crc kubenswrapper[5184]: I0312 17:02:15.356340 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" podUID="82e2099d-a6d8-488e-8144-b2ed728725e2" containerName="registry" containerID="cri-o://e32c6befe877e79950bc18dd6b9d090f73d6591d5932b62afd6e308cc060228f" gracePeriod=30 Mar 12 17:02:15 crc kubenswrapper[5184]: I0312 17:02:15.670519 5184 generic.go:358] "Generic (PLEG): container finished" podID="82e2099d-a6d8-488e-8144-b2ed728725e2" containerID="e32c6befe877e79950bc18dd6b9d090f73d6591d5932b62afd6e308cc060228f" exitCode=0 Mar 12 17:02:15 crc kubenswrapper[5184]: I0312 17:02:15.670677 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" event={"ID":"82e2099d-a6d8-488e-8144-b2ed728725e2","Type":"ContainerDied","Data":"e32c6befe877e79950bc18dd6b9d090f73d6591d5932b62afd6e308cc060228f"} Mar 12 17:02:15 crc kubenswrapper[5184]: I0312 17:02:15.671153 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" event={"ID":"82e2099d-a6d8-488e-8144-b2ed728725e2","Type":"ContainerDied","Data":"7da9fa4af14f36d725116a220038f3598cc50c5c0345f1430e6a864cfecd920e"} Mar 12 17:02:15 crc 
kubenswrapper[5184]: I0312 17:02:15.671183 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7da9fa4af14f36d725116a220038f3598cc50c5c0345f1430e6a864cfecd920e" Mar 12 17:02:15 crc kubenswrapper[5184]: I0312 17:02:15.705981 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 17:02:15 crc kubenswrapper[5184]: I0312 17:02:15.827863 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/82e2099d-a6d8-488e-8144-b2ed728725e2-installation-pull-secrets\") pod \"82e2099d-a6d8-488e-8144-b2ed728725e2\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " Mar 12 17:02:15 crc kubenswrapper[5184]: I0312 17:02:15.827923 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/82e2099d-a6d8-488e-8144-b2ed728725e2-registry-certificates\") pod \"82e2099d-a6d8-488e-8144-b2ed728725e2\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " Mar 12 17:02:15 crc kubenswrapper[5184]: I0312 17:02:15.827974 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/82e2099d-a6d8-488e-8144-b2ed728725e2-bound-sa-token\") pod \"82e2099d-a6d8-488e-8144-b2ed728725e2\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " Mar 12 17:02:15 crc kubenswrapper[5184]: I0312 17:02:15.828021 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tt7xx\" (UniqueName: \"kubernetes.io/projected/82e2099d-a6d8-488e-8144-b2ed728725e2-kube-api-access-tt7xx\") pod \"82e2099d-a6d8-488e-8144-b2ed728725e2\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " Mar 12 17:02:15 crc kubenswrapper[5184]: I0312 17:02:15.828052 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume 
started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/82e2099d-a6d8-488e-8144-b2ed728725e2-trusted-ca\") pod \"82e2099d-a6d8-488e-8144-b2ed728725e2\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " Mar 12 17:02:15 crc kubenswrapper[5184]: I0312 17:02:15.828078 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/82e2099d-a6d8-488e-8144-b2ed728725e2-ca-trust-extracted\") pod \"82e2099d-a6d8-488e-8144-b2ed728725e2\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " Mar 12 17:02:15 crc kubenswrapper[5184]: I0312 17:02:15.828118 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/82e2099d-a6d8-488e-8144-b2ed728725e2-registry-tls\") pod \"82e2099d-a6d8-488e-8144-b2ed728725e2\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " Mar 12 17:02:15 crc kubenswrapper[5184]: I0312 17:02:15.828328 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"82e2099d-a6d8-488e-8144-b2ed728725e2\" (UID: \"82e2099d-a6d8-488e-8144-b2ed728725e2\") " Mar 12 17:02:15 crc kubenswrapper[5184]: I0312 17:02:15.828806 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82e2099d-a6d8-488e-8144-b2ed728725e2-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "82e2099d-a6d8-488e-8144-b2ed728725e2" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:02:15 crc kubenswrapper[5184]: I0312 17:02:15.829346 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82e2099d-a6d8-488e-8144-b2ed728725e2-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "82e2099d-a6d8-488e-8144-b2ed728725e2" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:02:15 crc kubenswrapper[5184]: I0312 17:02:15.834426 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82e2099d-a6d8-488e-8144-b2ed728725e2-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "82e2099d-a6d8-488e-8144-b2ed728725e2" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:02:15 crc kubenswrapper[5184]: I0312 17:02:15.836485 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82e2099d-a6d8-488e-8144-b2ed728725e2-kube-api-access-tt7xx" (OuterVolumeSpecName: "kube-api-access-tt7xx") pod "82e2099d-a6d8-488e-8144-b2ed728725e2" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2"). InnerVolumeSpecName "kube-api-access-tt7xx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:02:15 crc kubenswrapper[5184]: I0312 17:02:15.836654 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82e2099d-a6d8-488e-8144-b2ed728725e2-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "82e2099d-a6d8-488e-8144-b2ed728725e2" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:02:15 crc kubenswrapper[5184]: I0312 17:02:15.836959 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82e2099d-a6d8-488e-8144-b2ed728725e2-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "82e2099d-a6d8-488e-8144-b2ed728725e2" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:02:15 crc kubenswrapper[5184]: I0312 17:02:15.840470 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (OuterVolumeSpecName: "registry-storage") pod "82e2099d-a6d8-488e-8144-b2ed728725e2" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2"). InnerVolumeSpecName "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2". PluginName "kubernetes.io/csi", VolumeGIDValue "" Mar 12 17:02:15 crc kubenswrapper[5184]: I0312 17:02:15.849924 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82e2099d-a6d8-488e-8144-b2ed728725e2-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "82e2099d-a6d8-488e-8144-b2ed728725e2" (UID: "82e2099d-a6d8-488e-8144-b2ed728725e2"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:02:15 crc kubenswrapper[5184]: I0312 17:02:15.929737 5184 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/82e2099d-a6d8-488e-8144-b2ed728725e2-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 12 17:02:15 crc kubenswrapper[5184]: I0312 17:02:15.930038 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tt7xx\" (UniqueName: \"kubernetes.io/projected/82e2099d-a6d8-488e-8144-b2ed728725e2-kube-api-access-tt7xx\") on node \"crc\" DevicePath \"\"" Mar 12 17:02:15 crc kubenswrapper[5184]: I0312 17:02:15.930055 5184 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/82e2099d-a6d8-488e-8144-b2ed728725e2-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 12 17:02:15 crc kubenswrapper[5184]: I0312 17:02:15.930064 5184 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/82e2099d-a6d8-488e-8144-b2ed728725e2-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 12 17:02:15 crc kubenswrapper[5184]: I0312 17:02:15.930072 5184 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/82e2099d-a6d8-488e-8144-b2ed728725e2-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 12 17:02:15 crc kubenswrapper[5184]: I0312 17:02:15.930080 5184 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/82e2099d-a6d8-488e-8144-b2ed728725e2-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 12 17:02:15 crc kubenswrapper[5184]: I0312 17:02:15.930089 5184 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/82e2099d-a6d8-488e-8144-b2ed728725e2-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 12 17:02:16 crc 
kubenswrapper[5184]: I0312 17:02:16.073918 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d1f2cf7c6f4ecb15e69cc0ed07a53b6f169b7a9d46d563b1a9827dff837qpzv"] Mar 12 17:02:16 crc kubenswrapper[5184]: I0312 17:02:16.074595 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="82e2099d-a6d8-488e-8144-b2ed728725e2" containerName="registry" Mar 12 17:02:16 crc kubenswrapper[5184]: I0312 17:02:16.074617 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="82e2099d-a6d8-488e-8144-b2ed728725e2" containerName="registry" Mar 12 17:02:16 crc kubenswrapper[5184]: I0312 17:02:16.074643 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92eba052-9e58-4d2a-a414-529b609345da" containerName="oc" Mar 12 17:02:16 crc kubenswrapper[5184]: I0312 17:02:16.074653 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="92eba052-9e58-4d2a-a414-529b609345da" containerName="oc" Mar 12 17:02:16 crc kubenswrapper[5184]: I0312 17:02:16.074787 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="82e2099d-a6d8-488e-8144-b2ed728725e2" containerName="registry" Mar 12 17:02:16 crc kubenswrapper[5184]: I0312 17:02:16.074807 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="92eba052-9e58-4d2a-a414-529b609345da" containerName="oc" Mar 12 17:02:16 crc kubenswrapper[5184]: I0312 17:02:16.087337 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d1f2cf7c6f4ecb15e69cc0ed07a53b6f169b7a9d46d563b1a9827dff837qpzv"] Mar 12 17:02:16 crc kubenswrapper[5184]: I0312 17:02:16.087520 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d1f2cf7c6f4ecb15e69cc0ed07a53b6f169b7a9d46d563b1a9827dff837qpzv" Mar 12 17:02:16 crc kubenswrapper[5184]: I0312 17:02:16.094940 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-b2ccr\"" Mar 12 17:02:16 crc kubenswrapper[5184]: I0312 17:02:16.233576 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/929a96db-d440-4095-bfed-bc35b90447eb-bundle\") pod \"d1f2cf7c6f4ecb15e69cc0ed07a53b6f169b7a9d46d563b1a9827dff837qpzv\" (UID: \"929a96db-d440-4095-bfed-bc35b90447eb\") " pod="openshift-marketplace/d1f2cf7c6f4ecb15e69cc0ed07a53b6f169b7a9d46d563b1a9827dff837qpzv" Mar 12 17:02:16 crc kubenswrapper[5184]: I0312 17:02:16.233925 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/929a96db-d440-4095-bfed-bc35b90447eb-util\") pod \"d1f2cf7c6f4ecb15e69cc0ed07a53b6f169b7a9d46d563b1a9827dff837qpzv\" (UID: \"929a96db-d440-4095-bfed-bc35b90447eb\") " pod="openshift-marketplace/d1f2cf7c6f4ecb15e69cc0ed07a53b6f169b7a9d46d563b1a9827dff837qpzv" Mar 12 17:02:16 crc kubenswrapper[5184]: I0312 17:02:16.234208 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqbnx\" (UniqueName: \"kubernetes.io/projected/929a96db-d440-4095-bfed-bc35b90447eb-kube-api-access-rqbnx\") pod \"d1f2cf7c6f4ecb15e69cc0ed07a53b6f169b7a9d46d563b1a9827dff837qpzv\" (UID: \"929a96db-d440-4095-bfed-bc35b90447eb\") " pod="openshift-marketplace/d1f2cf7c6f4ecb15e69cc0ed07a53b6f169b7a9d46d563b1a9827dff837qpzv" Mar 12 17:02:16 crc kubenswrapper[5184]: I0312 17:02:16.335589 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/929a96db-d440-4095-bfed-bc35b90447eb-bundle\") pod \"d1f2cf7c6f4ecb15e69cc0ed07a53b6f169b7a9d46d563b1a9827dff837qpzv\" (UID: \"929a96db-d440-4095-bfed-bc35b90447eb\") " pod="openshift-marketplace/d1f2cf7c6f4ecb15e69cc0ed07a53b6f169b7a9d46d563b1a9827dff837qpzv" Mar 12 17:02:16 crc kubenswrapper[5184]: I0312 17:02:16.335930 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/929a96db-d440-4095-bfed-bc35b90447eb-util\") pod \"d1f2cf7c6f4ecb15e69cc0ed07a53b6f169b7a9d46d563b1a9827dff837qpzv\" (UID: \"929a96db-d440-4095-bfed-bc35b90447eb\") " pod="openshift-marketplace/d1f2cf7c6f4ecb15e69cc0ed07a53b6f169b7a9d46d563b1a9827dff837qpzv" Mar 12 17:02:16 crc kubenswrapper[5184]: I0312 17:02:16.336027 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rqbnx\" (UniqueName: \"kubernetes.io/projected/929a96db-d440-4095-bfed-bc35b90447eb-kube-api-access-rqbnx\") pod \"d1f2cf7c6f4ecb15e69cc0ed07a53b6f169b7a9d46d563b1a9827dff837qpzv\" (UID: \"929a96db-d440-4095-bfed-bc35b90447eb\") " pod="openshift-marketplace/d1f2cf7c6f4ecb15e69cc0ed07a53b6f169b7a9d46d563b1a9827dff837qpzv" Mar 12 17:02:16 crc kubenswrapper[5184]: I0312 17:02:16.336914 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/929a96db-d440-4095-bfed-bc35b90447eb-util\") pod \"d1f2cf7c6f4ecb15e69cc0ed07a53b6f169b7a9d46d563b1a9827dff837qpzv\" (UID: \"929a96db-d440-4095-bfed-bc35b90447eb\") " pod="openshift-marketplace/d1f2cf7c6f4ecb15e69cc0ed07a53b6f169b7a9d46d563b1a9827dff837qpzv" Mar 12 17:02:16 crc kubenswrapper[5184]: I0312 17:02:16.337187 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/929a96db-d440-4095-bfed-bc35b90447eb-bundle\") pod \"d1f2cf7c6f4ecb15e69cc0ed07a53b6f169b7a9d46d563b1a9827dff837qpzv\" (UID: 
\"929a96db-d440-4095-bfed-bc35b90447eb\") " pod="openshift-marketplace/d1f2cf7c6f4ecb15e69cc0ed07a53b6f169b7a9d46d563b1a9827dff837qpzv" Mar 12 17:02:16 crc kubenswrapper[5184]: I0312 17:02:16.358474 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqbnx\" (UniqueName: \"kubernetes.io/projected/929a96db-d440-4095-bfed-bc35b90447eb-kube-api-access-rqbnx\") pod \"d1f2cf7c6f4ecb15e69cc0ed07a53b6f169b7a9d46d563b1a9827dff837qpzv\" (UID: \"929a96db-d440-4095-bfed-bc35b90447eb\") " pod="openshift-marketplace/d1f2cf7c6f4ecb15e69cc0ed07a53b6f169b7a9d46d563b1a9827dff837qpzv" Mar 12 17:02:16 crc kubenswrapper[5184]: I0312 17:02:16.417510 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d1f2cf7c6f4ecb15e69cc0ed07a53b6f169b7a9d46d563b1a9827dff837qpzv" Mar 12 17:02:16 crc kubenswrapper[5184]: I0312 17:02:16.648525 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d1f2cf7c6f4ecb15e69cc0ed07a53b6f169b7a9d46d563b1a9827dff837qpzv"] Mar 12 17:02:16 crc kubenswrapper[5184]: I0312 17:02:16.685631 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d1f2cf7c6f4ecb15e69cc0ed07a53b6f169b7a9d46d563b1a9827dff837qpzv" event={"ID":"929a96db-d440-4095-bfed-bc35b90447eb","Type":"ContainerStarted","Data":"42dcd7d63355dab012993fe14181a076849101a7462a24414372261675c34c74"} Mar 12 17:02:16 crc kubenswrapper[5184]: I0312 17:02:16.685719 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66587d64c8-dlsx9" Mar 12 17:02:16 crc kubenswrapper[5184]: I0312 17:02:16.722230 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-66587d64c8-dlsx9"] Mar 12 17:02:16 crc kubenswrapper[5184]: I0312 17:02:16.730889 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-66587d64c8-dlsx9"] Mar 12 17:02:17 crc kubenswrapper[5184]: I0312 17:02:17.486105 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jcj28" Mar 12 17:02:17 crc kubenswrapper[5184]: I0312 17:02:17.691790 5184 generic.go:358] "Generic (PLEG): container finished" podID="929a96db-d440-4095-bfed-bc35b90447eb" containerID="6c52a65629d6a83860bb6864ccc9d8f9b44c88c09d70aa75e7902e09d53fea5d" exitCode=0 Mar 12 17:02:17 crc kubenswrapper[5184]: I0312 17:02:17.691935 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d1f2cf7c6f4ecb15e69cc0ed07a53b6f169b7a9d46d563b1a9827dff837qpzv" event={"ID":"929a96db-d440-4095-bfed-bc35b90447eb","Type":"ContainerDied","Data":"6c52a65629d6a83860bb6864ccc9d8f9b44c88c09d70aa75e7902e09d53fea5d"} Mar 12 17:02:18 crc kubenswrapper[5184]: I0312 17:02:18.411219 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82e2099d-a6d8-488e-8144-b2ed728725e2" path="/var/lib/kubelet/pods/82e2099d-a6d8-488e-8144-b2ed728725e2/volumes" Mar 12 17:02:19 crc kubenswrapper[5184]: I0312 17:02:19.713460 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d1f2cf7c6f4ecb15e69cc0ed07a53b6f169b7a9d46d563b1a9827dff837qpzv" event={"ID":"929a96db-d440-4095-bfed-bc35b90447eb","Type":"ContainerStarted","Data":"d9d4c22411890e2ec5e756267b7890e0cf16039168e4fe97ff037cf88e0c887c"} Mar 12 17:02:20 crc kubenswrapper[5184]: I0312 17:02:20.723672 5184 generic.go:358] "Generic (PLEG): container finished" 
podID="929a96db-d440-4095-bfed-bc35b90447eb" containerID="d9d4c22411890e2ec5e756267b7890e0cf16039168e4fe97ff037cf88e0c887c" exitCode=0 Mar 12 17:02:20 crc kubenswrapper[5184]: I0312 17:02:20.724079 5184 generic.go:358] "Generic (PLEG): container finished" podID="929a96db-d440-4095-bfed-bc35b90447eb" containerID="f83af0237fa73600139b6a5a4e31259d9cf114778dbd033406163803fcb9325a" exitCode=0 Mar 12 17:02:20 crc kubenswrapper[5184]: I0312 17:02:20.723780 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d1f2cf7c6f4ecb15e69cc0ed07a53b6f169b7a9d46d563b1a9827dff837qpzv" event={"ID":"929a96db-d440-4095-bfed-bc35b90447eb","Type":"ContainerDied","Data":"d9d4c22411890e2ec5e756267b7890e0cf16039168e4fe97ff037cf88e0c887c"} Mar 12 17:02:20 crc kubenswrapper[5184]: I0312 17:02:20.724216 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d1f2cf7c6f4ecb15e69cc0ed07a53b6f169b7a9d46d563b1a9827dff837qpzv" event={"ID":"929a96db-d440-4095-bfed-bc35b90447eb","Type":"ContainerDied","Data":"f83af0237fa73600139b6a5a4e31259d9cf114778dbd033406163803fcb9325a"} Mar 12 17:02:20 crc kubenswrapper[5184]: I0312 17:02:20.743213 5184 patch_prober.go:28] interesting pod/machine-config-daemon-cp7pt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 17:02:20 crc kubenswrapper[5184]: I0312 17:02:20.743426 5184 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 17:02:22 crc kubenswrapper[5184]: I0312 17:02:22.096089 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d1f2cf7c6f4ecb15e69cc0ed07a53b6f169b7a9d46d563b1a9827dff837qpzv" Mar 12 17:02:22 crc kubenswrapper[5184]: I0312 17:02:22.126594 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqbnx\" (UniqueName: \"kubernetes.io/projected/929a96db-d440-4095-bfed-bc35b90447eb-kube-api-access-rqbnx\") pod \"929a96db-d440-4095-bfed-bc35b90447eb\" (UID: \"929a96db-d440-4095-bfed-bc35b90447eb\") " Mar 12 17:02:22 crc kubenswrapper[5184]: I0312 17:02:22.126734 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/929a96db-d440-4095-bfed-bc35b90447eb-bundle\") pod \"929a96db-d440-4095-bfed-bc35b90447eb\" (UID: \"929a96db-d440-4095-bfed-bc35b90447eb\") " Mar 12 17:02:22 crc kubenswrapper[5184]: I0312 17:02:22.128532 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/929a96db-d440-4095-bfed-bc35b90447eb-bundle" (OuterVolumeSpecName: "bundle") pod "929a96db-d440-4095-bfed-bc35b90447eb" (UID: "929a96db-d440-4095-bfed-bc35b90447eb"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:02:22 crc kubenswrapper[5184]: I0312 17:02:22.132945 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/929a96db-d440-4095-bfed-bc35b90447eb-kube-api-access-rqbnx" (OuterVolumeSpecName: "kube-api-access-rqbnx") pod "929a96db-d440-4095-bfed-bc35b90447eb" (UID: "929a96db-d440-4095-bfed-bc35b90447eb"). InnerVolumeSpecName "kube-api-access-rqbnx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:02:22 crc kubenswrapper[5184]: I0312 17:02:22.228417 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/929a96db-d440-4095-bfed-bc35b90447eb-util\") pod \"929a96db-d440-4095-bfed-bc35b90447eb\" (UID: \"929a96db-d440-4095-bfed-bc35b90447eb\") " Mar 12 17:02:22 crc kubenswrapper[5184]: I0312 17:02:22.228731 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rqbnx\" (UniqueName: \"kubernetes.io/projected/929a96db-d440-4095-bfed-bc35b90447eb-kube-api-access-rqbnx\") on node \"crc\" DevicePath \"\"" Mar 12 17:02:22 crc kubenswrapper[5184]: I0312 17:02:22.228763 5184 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/929a96db-d440-4095-bfed-bc35b90447eb-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 17:02:22 crc kubenswrapper[5184]: I0312 17:02:22.545178 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/929a96db-d440-4095-bfed-bc35b90447eb-util" (OuterVolumeSpecName: "util") pod "929a96db-d440-4095-bfed-bc35b90447eb" (UID: "929a96db-d440-4095-bfed-bc35b90447eb"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:02:22 crc kubenswrapper[5184]: I0312 17:02:22.635322 5184 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/929a96db-d440-4095-bfed-bc35b90447eb-util\") on node \"crc\" DevicePath \"\"" Mar 12 17:02:22 crc kubenswrapper[5184]: I0312 17:02:22.743496 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d1f2cf7c6f4ecb15e69cc0ed07a53b6f169b7a9d46d563b1a9827dff837qpzv" event={"ID":"929a96db-d440-4095-bfed-bc35b90447eb","Type":"ContainerDied","Data":"42dcd7d63355dab012993fe14181a076849101a7462a24414372261675c34c74"} Mar 12 17:02:22 crc kubenswrapper[5184]: I0312 17:02:22.743554 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42dcd7d63355dab012993fe14181a076849101a7462a24414372261675c34c74" Mar 12 17:02:22 crc kubenswrapper[5184]: I0312 17:02:22.743622 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d1f2cf7c6f4ecb15e69cc0ed07a53b6f169b7a9d46d563b1a9827dff837qpzv" Mar 12 17:02:27 crc kubenswrapper[5184]: I0312 17:02:27.674400 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-54b58fcbc5-sdfcf"] Mar 12 17:02:27 crc kubenswrapper[5184]: I0312 17:02:27.675669 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="929a96db-d440-4095-bfed-bc35b90447eb" containerName="util" Mar 12 17:02:27 crc kubenswrapper[5184]: I0312 17:02:27.675690 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="929a96db-d440-4095-bfed-bc35b90447eb" containerName="util" Mar 12 17:02:27 crc kubenswrapper[5184]: I0312 17:02:27.675712 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="929a96db-d440-4095-bfed-bc35b90447eb" containerName="pull" Mar 12 17:02:27 crc kubenswrapper[5184]: I0312 17:02:27.675721 5184 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="929a96db-d440-4095-bfed-bc35b90447eb" containerName="pull" Mar 12 17:02:27 crc kubenswrapper[5184]: I0312 17:02:27.675749 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="929a96db-d440-4095-bfed-bc35b90447eb" containerName="extract" Mar 12 17:02:27 crc kubenswrapper[5184]: I0312 17:02:27.675761 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="929a96db-d440-4095-bfed-bc35b90447eb" containerName="extract" Mar 12 17:02:27 crc kubenswrapper[5184]: I0312 17:02:27.675920 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="929a96db-d440-4095-bfed-bc35b90447eb" containerName="extract" Mar 12 17:02:27 crc kubenswrapper[5184]: I0312 17:02:27.681470 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-54b58fcbc5-sdfcf" Mar 12 17:02:27 crc kubenswrapper[5184]: I0312 17:02:27.686208 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-54b58fcbc5-sdfcf"] Mar 12 17:02:27 crc kubenswrapper[5184]: I0312 17:02:27.687441 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-nmstate\"/\"openshift-service-ca.crt\"" Mar 12 17:02:27 crc kubenswrapper[5184]: I0312 17:02:27.687812 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-nmstate\"/\"nmstate-operator-dockercfg-t8hm8\"" Mar 12 17:02:27 crc kubenswrapper[5184]: I0312 17:02:27.687613 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-nmstate\"/\"kube-root-ca.crt\"" Mar 12 17:02:27 crc kubenswrapper[5184]: I0312 17:02:27.834730 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt247\" (UniqueName: \"kubernetes.io/projected/614e8fcc-0a65-404c-a92c-b0a2834e4d92-kube-api-access-bt247\") pod \"nmstate-operator-54b58fcbc5-sdfcf\" (UID: 
\"614e8fcc-0a65-404c-a92c-b0a2834e4d92\") " pod="openshift-nmstate/nmstate-operator-54b58fcbc5-sdfcf" Mar 12 17:02:27 crc kubenswrapper[5184]: I0312 17:02:27.936300 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bt247\" (UniqueName: \"kubernetes.io/projected/614e8fcc-0a65-404c-a92c-b0a2834e4d92-kube-api-access-bt247\") pod \"nmstate-operator-54b58fcbc5-sdfcf\" (UID: \"614e8fcc-0a65-404c-a92c-b0a2834e4d92\") " pod="openshift-nmstate/nmstate-operator-54b58fcbc5-sdfcf" Mar 12 17:02:27 crc kubenswrapper[5184]: I0312 17:02:27.970475 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt247\" (UniqueName: \"kubernetes.io/projected/614e8fcc-0a65-404c-a92c-b0a2834e4d92-kube-api-access-bt247\") pod \"nmstate-operator-54b58fcbc5-sdfcf\" (UID: \"614e8fcc-0a65-404c-a92c-b0a2834e4d92\") " pod="openshift-nmstate/nmstate-operator-54b58fcbc5-sdfcf" Mar 12 17:02:28 crc kubenswrapper[5184]: I0312 17:02:28.044076 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-54b58fcbc5-sdfcf" Mar 12 17:02:28 crc kubenswrapper[5184]: I0312 17:02:28.314836 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-54b58fcbc5-sdfcf"] Mar 12 17:02:28 crc kubenswrapper[5184]: I0312 17:02:28.793014 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-54b58fcbc5-sdfcf" event={"ID":"614e8fcc-0a65-404c-a92c-b0a2834e4d92","Type":"ContainerStarted","Data":"5d9483b435ab3f24d540897d16d5f481556e4b343be9e8c44cf3880cdf6e332c"} Mar 12 17:02:34 crc kubenswrapper[5184]: I0312 17:02:34.825687 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-54b58fcbc5-sdfcf" event={"ID":"614e8fcc-0a65-404c-a92c-b0a2834e4d92","Type":"ContainerStarted","Data":"b061c7aedacda174f73e358a69dfee2f9cf7a1f30917a34f074466b8e2c3842d"} Mar 12 17:02:34 crc kubenswrapper[5184]: I0312 17:02:34.827632 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-nmstate/nmstate-operator-54b58fcbc5-sdfcf" Mar 12 17:02:34 crc kubenswrapper[5184]: I0312 17:02:34.850605 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-54b58fcbc5-sdfcf" podStartSLOduration=1.9495469029999999 podStartE2EDuration="7.850586751s" podCreationTimestamp="2026-03-12 17:02:27 +0000 UTC" firstStartedPulling="2026-03-12 17:02:28.322284946 +0000 UTC m=+690.863596285" lastFinishedPulling="2026-03-12 17:02:34.223324784 +0000 UTC m=+696.764636133" observedRunningTime="2026-03-12 17:02:34.849261329 +0000 UTC m=+697.390572668" watchObservedRunningTime="2026-03-12 17:02:34.850586751 +0000 UTC m=+697.391898100" Mar 12 17:02:45 crc kubenswrapper[5184]: I0312 17:02:45.841365 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-operator-54b58fcbc5-sdfcf" Mar 12 17:02:46 crc kubenswrapper[5184]: I0312 
17:02:46.926539 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f888697b-qtghx"] Mar 12 17:02:46 crc kubenswrapper[5184]: I0312 17:02:46.992346 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f888697b-qtghx"] Mar 12 17:02:46 crc kubenswrapper[5184]: I0312 17:02:46.992432 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-78fdd78d8b-fxzg8"] Mar 12 17:02:46 crc kubenswrapper[5184]: I0312 17:02:46.993480 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f888697b-qtghx" Mar 12 17:02:46 crc kubenswrapper[5184]: I0312 17:02:46.996998 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-288w4"] Mar 12 17:02:46 crc kubenswrapper[5184]: I0312 17:02:46.997132 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-78fdd78d8b-fxzg8" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.001557 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-nmstate\"/\"nmstate-handler-dockercfg-j5lh4\"" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.001707 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-nmstate\"/\"openshift-nmstate-webhook\"" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.002021 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-78fdd78d8b-fxzg8"] Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.002125 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-288w4" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.095449 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-74686bb6b4-ljbdg"] Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.099239 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-74686bb6b4-ljbdg" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.107821 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-nmstate\"/\"plugin-serving-cert\"" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.107877 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-nmstate\"/\"nginx-conf\"" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.108158 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-nmstate\"/\"default-dockercfg-md25h\"" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.113147 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-74686bb6b4-ljbdg"] Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.119117 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9e52025d-b566-4965-8664-9d14f8f05dc6-tls-key-pair\") pod \"nmstate-webhook-78fdd78d8b-fxzg8\" (UID: \"9e52025d-b566-4965-8664-9d14f8f05dc6\") " pod="openshift-nmstate/nmstate-webhook-78fdd78d8b-fxzg8" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.119166 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/89ad2c51-212e-4a3a-882d-f7c2aeb04a94-ovs-socket\") pod \"nmstate-handler-288w4\" (UID: \"89ad2c51-212e-4a3a-882d-f7c2aeb04a94\") " 
pod="openshift-nmstate/nmstate-handler-288w4" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.119234 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/89ad2c51-212e-4a3a-882d-f7c2aeb04a94-dbus-socket\") pod \"nmstate-handler-288w4\" (UID: \"89ad2c51-212e-4a3a-882d-f7c2aeb04a94\") " pod="openshift-nmstate/nmstate-handler-288w4" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.119303 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpsnw\" (UniqueName: \"kubernetes.io/projected/0aa67d09-261d-4bd2-8341-b81d6d2a3caa-kube-api-access-cpsnw\") pod \"nmstate-metrics-7f888697b-qtghx\" (UID: \"0aa67d09-261d-4bd2-8341-b81d6d2a3caa\") " pod="openshift-nmstate/nmstate-metrics-7f888697b-qtghx" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.119323 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7rxn\" (UniqueName: \"kubernetes.io/projected/89ad2c51-212e-4a3a-882d-f7c2aeb04a94-kube-api-access-n7rxn\") pod \"nmstate-handler-288w4\" (UID: \"89ad2c51-212e-4a3a-882d-f7c2aeb04a94\") " pod="openshift-nmstate/nmstate-handler-288w4" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.119349 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx888\" (UniqueName: \"kubernetes.io/projected/9e52025d-b566-4965-8664-9d14f8f05dc6-kube-api-access-xx888\") pod \"nmstate-webhook-78fdd78d8b-fxzg8\" (UID: \"9e52025d-b566-4965-8664-9d14f8f05dc6\") " pod="openshift-nmstate/nmstate-webhook-78fdd78d8b-fxzg8" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.119450 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: 
\"kubernetes.io/host-path/89ad2c51-212e-4a3a-882d-f7c2aeb04a94-nmstate-lock\") pod \"nmstate-handler-288w4\" (UID: \"89ad2c51-212e-4a3a-882d-f7c2aeb04a94\") " pod="openshift-nmstate/nmstate-handler-288w4" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.220844 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/89ad2c51-212e-4a3a-882d-f7c2aeb04a94-nmstate-lock\") pod \"nmstate-handler-288w4\" (UID: \"89ad2c51-212e-4a3a-882d-f7c2aeb04a94\") " pod="openshift-nmstate/nmstate-handler-288w4" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.220894 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9e52025d-b566-4965-8664-9d14f8f05dc6-tls-key-pair\") pod \"nmstate-webhook-78fdd78d8b-fxzg8\" (UID: \"9e52025d-b566-4965-8664-9d14f8f05dc6\") " pod="openshift-nmstate/nmstate-webhook-78fdd78d8b-fxzg8" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.220919 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/89ad2c51-212e-4a3a-882d-f7c2aeb04a94-ovs-socket\") pod \"nmstate-handler-288w4\" (UID: \"89ad2c51-212e-4a3a-882d-f7c2aeb04a94\") " pod="openshift-nmstate/nmstate-handler-288w4" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.220951 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b8fc9b5b-64a1-410e-aa92-aec4333f8965-plugin-serving-cert\") pod \"nmstate-console-plugin-74686bb6b4-ljbdg\" (UID: \"b8fc9b5b-64a1-410e-aa92-aec4333f8965\") " pod="openshift-nmstate/nmstate-console-plugin-74686bb6b4-ljbdg" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.220995 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: 
\"kubernetes.io/host-path/89ad2c51-212e-4a3a-882d-f7c2aeb04a94-dbus-socket\") pod \"nmstate-handler-288w4\" (UID: \"89ad2c51-212e-4a3a-882d-f7c2aeb04a94\") " pod="openshift-nmstate/nmstate-handler-288w4" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.221077 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cpsnw\" (UniqueName: \"kubernetes.io/projected/0aa67d09-261d-4bd2-8341-b81d6d2a3caa-kube-api-access-cpsnw\") pod \"nmstate-metrics-7f888697b-qtghx\" (UID: \"0aa67d09-261d-4bd2-8341-b81d6d2a3caa\") " pod="openshift-nmstate/nmstate-metrics-7f888697b-qtghx" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.221102 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7rxn\" (UniqueName: \"kubernetes.io/projected/89ad2c51-212e-4a3a-882d-f7c2aeb04a94-kube-api-access-n7rxn\") pod \"nmstate-handler-288w4\" (UID: \"89ad2c51-212e-4a3a-882d-f7c2aeb04a94\") " pod="openshift-nmstate/nmstate-handler-288w4" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.221124 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b8fc9b5b-64a1-410e-aa92-aec4333f8965-nginx-conf\") pod \"nmstate-console-plugin-74686bb6b4-ljbdg\" (UID: \"b8fc9b5b-64a1-410e-aa92-aec4333f8965\") " pod="openshift-nmstate/nmstate-console-plugin-74686bb6b4-ljbdg" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.221155 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xx888\" (UniqueName: \"kubernetes.io/projected/9e52025d-b566-4965-8664-9d14f8f05dc6-kube-api-access-xx888\") pod \"nmstate-webhook-78fdd78d8b-fxzg8\" (UID: \"9e52025d-b566-4965-8664-9d14f8f05dc6\") " pod="openshift-nmstate/nmstate-webhook-78fdd78d8b-fxzg8" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.221195 5184 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m98c\" (UniqueName: \"kubernetes.io/projected/b8fc9b5b-64a1-410e-aa92-aec4333f8965-kube-api-access-4m98c\") pod \"nmstate-console-plugin-74686bb6b4-ljbdg\" (UID: \"b8fc9b5b-64a1-410e-aa92-aec4333f8965\") " pod="openshift-nmstate/nmstate-console-plugin-74686bb6b4-ljbdg" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.221567 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/89ad2c51-212e-4a3a-882d-f7c2aeb04a94-dbus-socket\") pod \"nmstate-handler-288w4\" (UID: \"89ad2c51-212e-4a3a-882d-f7c2aeb04a94\") " pod="openshift-nmstate/nmstate-handler-288w4" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.221613 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/89ad2c51-212e-4a3a-882d-f7c2aeb04a94-nmstate-lock\") pod \"nmstate-handler-288w4\" (UID: \"89ad2c51-212e-4a3a-882d-f7c2aeb04a94\") " pod="openshift-nmstate/nmstate-handler-288w4" Mar 12 17:02:47 crc kubenswrapper[5184]: E0312 17:02:47.221680 5184 secret.go:189] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 12 17:02:47 crc kubenswrapper[5184]: E0312 17:02:47.221740 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e52025d-b566-4965-8664-9d14f8f05dc6-tls-key-pair podName:9e52025d-b566-4965-8664-9d14f8f05dc6 nodeName:}" failed. No retries permitted until 2026-03-12 17:02:47.72172082 +0000 UTC m=+710.263032159 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/9e52025d-b566-4965-8664-9d14f8f05dc6-tls-key-pair") pod "nmstate-webhook-78fdd78d8b-fxzg8" (UID: "9e52025d-b566-4965-8664-9d14f8f05dc6") : secret "openshift-nmstate-webhook" not found Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.221865 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/89ad2c51-212e-4a3a-882d-f7c2aeb04a94-ovs-socket\") pod \"nmstate-handler-288w4\" (UID: \"89ad2c51-212e-4a3a-882d-f7c2aeb04a94\") " pod="openshift-nmstate/nmstate-handler-288w4" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.250076 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7rxn\" (UniqueName: \"kubernetes.io/projected/89ad2c51-212e-4a3a-882d-f7c2aeb04a94-kube-api-access-n7rxn\") pod \"nmstate-handler-288w4\" (UID: \"89ad2c51-212e-4a3a-882d-f7c2aeb04a94\") " pod="openshift-nmstate/nmstate-handler-288w4" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.252746 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpsnw\" (UniqueName: \"kubernetes.io/projected/0aa67d09-261d-4bd2-8341-b81d6d2a3caa-kube-api-access-cpsnw\") pod \"nmstate-metrics-7f888697b-qtghx\" (UID: \"0aa67d09-261d-4bd2-8341-b81d6d2a3caa\") " pod="openshift-nmstate/nmstate-metrics-7f888697b-qtghx" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.253397 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx888\" (UniqueName: \"kubernetes.io/projected/9e52025d-b566-4965-8664-9d14f8f05dc6-kube-api-access-xx888\") pod \"nmstate-webhook-78fdd78d8b-fxzg8\" (UID: \"9e52025d-b566-4965-8664-9d14f8f05dc6\") " pod="openshift-nmstate/nmstate-webhook-78fdd78d8b-fxzg8" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.288411 5184 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-console/console-648f7d9d85-w2xjs"] Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.294721 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-648f7d9d85-w2xjs" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.298209 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-648f7d9d85-w2xjs"] Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.320582 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f888697b-qtghx" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.322159 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b8fc9b5b-64a1-410e-aa92-aec4333f8965-plugin-serving-cert\") pod \"nmstate-console-plugin-74686bb6b4-ljbdg\" (UID: \"b8fc9b5b-64a1-410e-aa92-aec4333f8965\") " pod="openshift-nmstate/nmstate-console-plugin-74686bb6b4-ljbdg" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.322246 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b8fc9b5b-64a1-410e-aa92-aec4333f8965-nginx-conf\") pod \"nmstate-console-plugin-74686bb6b4-ljbdg\" (UID: \"b8fc9b5b-64a1-410e-aa92-aec4333f8965\") " pod="openshift-nmstate/nmstate-console-plugin-74686bb6b4-ljbdg" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.322280 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4m98c\" (UniqueName: \"kubernetes.io/projected/b8fc9b5b-64a1-410e-aa92-aec4333f8965-kube-api-access-4m98c\") pod \"nmstate-console-plugin-74686bb6b4-ljbdg\" (UID: \"b8fc9b5b-64a1-410e-aa92-aec4333f8965\") " pod="openshift-nmstate/nmstate-console-plugin-74686bb6b4-ljbdg" Mar 12 17:02:47 crc kubenswrapper[5184]: E0312 17:02:47.322305 5184 secret.go:189] Couldn't get secret 
openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Mar 12 17:02:47 crc kubenswrapper[5184]: E0312 17:02:47.322411 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8fc9b5b-64a1-410e-aa92-aec4333f8965-plugin-serving-cert podName:b8fc9b5b-64a1-410e-aa92-aec4333f8965 nodeName:}" failed. No retries permitted until 2026-03-12 17:02:47.822392299 +0000 UTC m=+710.363703638 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/b8fc9b5b-64a1-410e-aa92-aec4333f8965-plugin-serving-cert") pod "nmstate-console-plugin-74686bb6b4-ljbdg" (UID: "b8fc9b5b-64a1-410e-aa92-aec4333f8965") : secret "plugin-serving-cert" not found Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.323206 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b8fc9b5b-64a1-410e-aa92-aec4333f8965-nginx-conf\") pod \"nmstate-console-plugin-74686bb6b4-ljbdg\" (UID: \"b8fc9b5b-64a1-410e-aa92-aec4333f8965\") " pod="openshift-nmstate/nmstate-console-plugin-74686bb6b4-ljbdg" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.343854 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-288w4" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.347130 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m98c\" (UniqueName: \"kubernetes.io/projected/b8fc9b5b-64a1-410e-aa92-aec4333f8965-kube-api-access-4m98c\") pod \"nmstate-console-plugin-74686bb6b4-ljbdg\" (UID: \"b8fc9b5b-64a1-410e-aa92-aec4333f8965\") " pod="openshift-nmstate/nmstate-console-plugin-74686bb6b4-ljbdg" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.426622 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/117e471f-f53d-4df8-912c-e799e00fef4a-console-serving-cert\") pod \"console-648f7d9d85-w2xjs\" (UID: \"117e471f-f53d-4df8-912c-e799e00fef4a\") " pod="openshift-console/console-648f7d9d85-w2xjs" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.426679 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r9p4\" (UniqueName: \"kubernetes.io/projected/117e471f-f53d-4df8-912c-e799e00fef4a-kube-api-access-9r9p4\") pod \"console-648f7d9d85-w2xjs\" (UID: \"117e471f-f53d-4df8-912c-e799e00fef4a\") " pod="openshift-console/console-648f7d9d85-w2xjs" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.426728 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/117e471f-f53d-4df8-912c-e799e00fef4a-service-ca\") pod \"console-648f7d9d85-w2xjs\" (UID: \"117e471f-f53d-4df8-912c-e799e00fef4a\") " pod="openshift-console/console-648f7d9d85-w2xjs" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.426746 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/117e471f-f53d-4df8-912c-e799e00fef4a-trusted-ca-bundle\") pod \"console-648f7d9d85-w2xjs\" (UID: \"117e471f-f53d-4df8-912c-e799e00fef4a\") " pod="openshift-console/console-648f7d9d85-w2xjs" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.426760 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/117e471f-f53d-4df8-912c-e799e00fef4a-console-config\") pod \"console-648f7d9d85-w2xjs\" (UID: \"117e471f-f53d-4df8-912c-e799e00fef4a\") " pod="openshift-console/console-648f7d9d85-w2xjs" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.426815 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/117e471f-f53d-4df8-912c-e799e00fef4a-oauth-serving-cert\") pod \"console-648f7d9d85-w2xjs\" (UID: \"117e471f-f53d-4df8-912c-e799e00fef4a\") " pod="openshift-console/console-648f7d9d85-w2xjs" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.426854 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/117e471f-f53d-4df8-912c-e799e00fef4a-console-oauth-config\") pod \"console-648f7d9d85-w2xjs\" (UID: \"117e471f-f53d-4df8-912c-e799e00fef4a\") " pod="openshift-console/console-648f7d9d85-w2xjs" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.528002 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/117e471f-f53d-4df8-912c-e799e00fef4a-oauth-serving-cert\") pod \"console-648f7d9d85-w2xjs\" (UID: \"117e471f-f53d-4df8-912c-e799e00fef4a\") " pod="openshift-console/console-648f7d9d85-w2xjs" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.528356 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/117e471f-f53d-4df8-912c-e799e00fef4a-console-oauth-config\") pod \"console-648f7d9d85-w2xjs\" (UID: \"117e471f-f53d-4df8-912c-e799e00fef4a\") " pod="openshift-console/console-648f7d9d85-w2xjs" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.528468 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/117e471f-f53d-4df8-912c-e799e00fef4a-console-serving-cert\") pod \"console-648f7d9d85-w2xjs\" (UID: \"117e471f-f53d-4df8-912c-e799e00fef4a\") " pod="openshift-console/console-648f7d9d85-w2xjs" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.528534 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9r9p4\" (UniqueName: \"kubernetes.io/projected/117e471f-f53d-4df8-912c-e799e00fef4a-kube-api-access-9r9p4\") pod \"console-648f7d9d85-w2xjs\" (UID: \"117e471f-f53d-4df8-912c-e799e00fef4a\") " pod="openshift-console/console-648f7d9d85-w2xjs" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.528582 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/117e471f-f53d-4df8-912c-e799e00fef4a-service-ca\") pod \"console-648f7d9d85-w2xjs\" (UID: \"117e471f-f53d-4df8-912c-e799e00fef4a\") " pod="openshift-console/console-648f7d9d85-w2xjs" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.528603 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/117e471f-f53d-4df8-912c-e799e00fef4a-trusted-ca-bundle\") pod \"console-648f7d9d85-w2xjs\" (UID: \"117e471f-f53d-4df8-912c-e799e00fef4a\") " pod="openshift-console/console-648f7d9d85-w2xjs" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.528625 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" 
(UniqueName: \"kubernetes.io/configmap/117e471f-f53d-4df8-912c-e799e00fef4a-console-config\") pod \"console-648f7d9d85-w2xjs\" (UID: \"117e471f-f53d-4df8-912c-e799e00fef4a\") " pod="openshift-console/console-648f7d9d85-w2xjs" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.528852 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/117e471f-f53d-4df8-912c-e799e00fef4a-oauth-serving-cert\") pod \"console-648f7d9d85-w2xjs\" (UID: \"117e471f-f53d-4df8-912c-e799e00fef4a\") " pod="openshift-console/console-648f7d9d85-w2xjs" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.529412 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/117e471f-f53d-4df8-912c-e799e00fef4a-service-ca\") pod \"console-648f7d9d85-w2xjs\" (UID: \"117e471f-f53d-4df8-912c-e799e00fef4a\") " pod="openshift-console/console-648f7d9d85-w2xjs" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.529517 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/117e471f-f53d-4df8-912c-e799e00fef4a-console-config\") pod \"console-648f7d9d85-w2xjs\" (UID: \"117e471f-f53d-4df8-912c-e799e00fef4a\") " pod="openshift-console/console-648f7d9d85-w2xjs" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.529771 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/117e471f-f53d-4df8-912c-e799e00fef4a-trusted-ca-bundle\") pod \"console-648f7d9d85-w2xjs\" (UID: \"117e471f-f53d-4df8-912c-e799e00fef4a\") " pod="openshift-console/console-648f7d9d85-w2xjs" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.533060 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/117e471f-f53d-4df8-912c-e799e00fef4a-console-oauth-config\") pod \"console-648f7d9d85-w2xjs\" (UID: \"117e471f-f53d-4df8-912c-e799e00fef4a\") " pod="openshift-console/console-648f7d9d85-w2xjs" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.535398 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/117e471f-f53d-4df8-912c-e799e00fef4a-console-serving-cert\") pod \"console-648f7d9d85-w2xjs\" (UID: \"117e471f-f53d-4df8-912c-e799e00fef4a\") " pod="openshift-console/console-648f7d9d85-w2xjs" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.547404 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r9p4\" (UniqueName: \"kubernetes.io/projected/117e471f-f53d-4df8-912c-e799e00fef4a-kube-api-access-9r9p4\") pod \"console-648f7d9d85-w2xjs\" (UID: \"117e471f-f53d-4df8-912c-e799e00fef4a\") " pod="openshift-console/console-648f7d9d85-w2xjs" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.609093 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-648f7d9d85-w2xjs" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.731989 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9e52025d-b566-4965-8664-9d14f8f05dc6-tls-key-pair\") pod \"nmstate-webhook-78fdd78d8b-fxzg8\" (UID: \"9e52025d-b566-4965-8664-9d14f8f05dc6\") " pod="openshift-nmstate/nmstate-webhook-78fdd78d8b-fxzg8" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.736867 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9e52025d-b566-4965-8664-9d14f8f05dc6-tls-key-pair\") pod \"nmstate-webhook-78fdd78d8b-fxzg8\" (UID: \"9e52025d-b566-4965-8664-9d14f8f05dc6\") " pod="openshift-nmstate/nmstate-webhook-78fdd78d8b-fxzg8" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.785266 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f888697b-qtghx"] Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.832997 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b8fc9b5b-64a1-410e-aa92-aec4333f8965-plugin-serving-cert\") pod \"nmstate-console-plugin-74686bb6b4-ljbdg\" (UID: \"b8fc9b5b-64a1-410e-aa92-aec4333f8965\") " pod="openshift-nmstate/nmstate-console-plugin-74686bb6b4-ljbdg" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.836607 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b8fc9b5b-64a1-410e-aa92-aec4333f8965-plugin-serving-cert\") pod \"nmstate-console-plugin-74686bb6b4-ljbdg\" (UID: \"b8fc9b5b-64a1-410e-aa92-aec4333f8965\") " pod="openshift-nmstate/nmstate-console-plugin-74686bb6b4-ljbdg" Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.846424 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-console/console-648f7d9d85-w2xjs"] Mar 12 17:02:47 crc kubenswrapper[5184]: W0312 17:02:47.853250 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod117e471f_f53d_4df8_912c_e799e00fef4a.slice/crio-560d5bac55ad55d479d5c263ef0c439aa6f460f7dcf6aa5102686cbdc7d892a6 WatchSource:0}: Error finding container 560d5bac55ad55d479d5c263ef0c439aa6f460f7dcf6aa5102686cbdc7d892a6: Status 404 returned error can't find the container with id 560d5bac55ad55d479d5c263ef0c439aa6f460f7dcf6aa5102686cbdc7d892a6 Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.908576 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f888697b-qtghx" event={"ID":"0aa67d09-261d-4bd2-8341-b81d6d2a3caa","Type":"ContainerStarted","Data":"64cb34bc6f9db0832107a9bbed7f7a1604e685313b5af04cbeaaf57104c5da96"} Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.910106 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-648f7d9d85-w2xjs" event={"ID":"117e471f-f53d-4df8-912c-e799e00fef4a","Type":"ContainerStarted","Data":"560d5bac55ad55d479d5c263ef0c439aa6f460f7dcf6aa5102686cbdc7d892a6"} Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.911013 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-288w4" event={"ID":"89ad2c51-212e-4a3a-882d-f7c2aeb04a94","Type":"ContainerStarted","Data":"202a4456c501aef97bea18484fac1f4008d72576cded5e1a173443f9ac82d41c"} Mar 12 17:02:47 crc kubenswrapper[5184]: I0312 17:02:47.931840 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-78fdd78d8b-fxzg8" Mar 12 17:02:48 crc kubenswrapper[5184]: I0312 17:02:48.022614 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-74686bb6b4-ljbdg" Mar 12 17:02:48 crc kubenswrapper[5184]: I0312 17:02:48.120290 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-78fdd78d8b-fxzg8"] Mar 12 17:02:48 crc kubenswrapper[5184]: I0312 17:02:48.215748 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-74686bb6b4-ljbdg"] Mar 12 17:02:48 crc kubenswrapper[5184]: W0312 17:02:48.225151 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8fc9b5b_64a1_410e_aa92_aec4333f8965.slice/crio-ae42671642304bfc656b94b5da0508a1e3da0532cd76c4b2a95f2d1ba002aec8 WatchSource:0}: Error finding container ae42671642304bfc656b94b5da0508a1e3da0532cd76c4b2a95f2d1ba002aec8: Status 404 returned error can't find the container with id ae42671642304bfc656b94b5da0508a1e3da0532cd76c4b2a95f2d1ba002aec8 Mar 12 17:02:48 crc kubenswrapper[5184]: I0312 17:02:48.924854 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-78fdd78d8b-fxzg8" event={"ID":"9e52025d-b566-4965-8664-9d14f8f05dc6","Type":"ContainerStarted","Data":"b2e733f6a6be1191dd9763a14e5eb19cbacd837571a1e5171994100aaae53bb1"} Mar 12 17:02:48 crc kubenswrapper[5184]: I0312 17:02:48.925804 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-74686bb6b4-ljbdg" event={"ID":"b8fc9b5b-64a1-410e-aa92-aec4333f8965","Type":"ContainerStarted","Data":"ae42671642304bfc656b94b5da0508a1e3da0532cd76c4b2a95f2d1ba002aec8"} Mar 12 17:02:48 crc kubenswrapper[5184]: I0312 17:02:48.926929 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-648f7d9d85-w2xjs" event={"ID":"117e471f-f53d-4df8-912c-e799e00fef4a","Type":"ContainerStarted","Data":"7edf5fba9b439047a587b561989b0bb351a659fe0cd70780081eed044a88945a"} Mar 12 17:02:48 crc 
kubenswrapper[5184]: I0312 17:02:48.943829 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-648f7d9d85-w2xjs" podStartSLOduration=1.9437902120000001 podStartE2EDuration="1.943790212s" podCreationTimestamp="2026-03-12 17:02:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:02:48.943181833 +0000 UTC m=+711.484493192" watchObservedRunningTime="2026-03-12 17:02:48.943790212 +0000 UTC m=+711.485101561" Mar 12 17:02:50 crc kubenswrapper[5184]: I0312 17:02:50.742691 5184 patch_prober.go:28] interesting pod/machine-config-daemon-cp7pt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 17:02:50 crc kubenswrapper[5184]: I0312 17:02:50.742774 5184 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 17:02:50 crc kubenswrapper[5184]: I0312 17:02:50.742836 5184 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" Mar 12 17:02:50 crc kubenswrapper[5184]: I0312 17:02:50.743538 5184 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6dcddf4c82a491a243d037b62a542200cd43f90af290d25abaab07cac5e2a61e"} pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 17:02:50 crc kubenswrapper[5184]: I0312 
17:02:50.743610 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerName="machine-config-daemon" containerID="cri-o://6dcddf4c82a491a243d037b62a542200cd43f90af290d25abaab07cac5e2a61e" gracePeriod=600 Mar 12 17:02:50 crc kubenswrapper[5184]: I0312 17:02:50.941325 5184 generic.go:358] "Generic (PLEG): container finished" podID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerID="6dcddf4c82a491a243d037b62a542200cd43f90af290d25abaab07cac5e2a61e" exitCode=0 Mar 12 17:02:50 crc kubenswrapper[5184]: I0312 17:02:50.941413 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" event={"ID":"7b45c859-3d05-4214-9bd3-2952546f5dea","Type":"ContainerDied","Data":"6dcddf4c82a491a243d037b62a542200cd43f90af290d25abaab07cac5e2a61e"} Mar 12 17:02:50 crc kubenswrapper[5184]: I0312 17:02:50.941462 5184 scope.go:117] "RemoveContainer" containerID="7ce931eac957036c6c965318bd6ebe196835262d045725f0735bb9f9799bfd42" Mar 12 17:02:51 crc kubenswrapper[5184]: I0312 17:02:51.948274 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f888697b-qtghx" event={"ID":"0aa67d09-261d-4bd2-8341-b81d6d2a3caa","Type":"ContainerStarted","Data":"e9a5ce61c2b0f9545b5364b07e4a6f1df36dbc40c9d4b7b062ef085ed5a21f05"} Mar 12 17:02:51 crc kubenswrapper[5184]: I0312 17:02:51.949778 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-288w4" event={"ID":"89ad2c51-212e-4a3a-882d-f7c2aeb04a94","Type":"ContainerStarted","Data":"32ebc14d82432b0b726bd4c3022f386aab49a75c022b783a7fbb91d2a2d1a4b7"} Mar 12 17:02:51 crc kubenswrapper[5184]: I0312 17:02:51.950612 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-nmstate/nmstate-handler-288w4" Mar 12 17:02:51 crc kubenswrapper[5184]: I0312 
17:02:51.953117 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-78fdd78d8b-fxzg8" event={"ID":"9e52025d-b566-4965-8664-9d14f8f05dc6","Type":"ContainerStarted","Data":"1320e42876ab58e9d599c80da2c394089dc92c62a3292f5188a06f26b5e55ee9"} Mar 12 17:02:51 crc kubenswrapper[5184]: I0312 17:02:51.953197 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-nmstate/nmstate-webhook-78fdd78d8b-fxzg8" Mar 12 17:02:51 crc kubenswrapper[5184]: I0312 17:02:51.954410 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-74686bb6b4-ljbdg" event={"ID":"b8fc9b5b-64a1-410e-aa92-aec4333f8965","Type":"ContainerStarted","Data":"19ec00b2940ee1fddf41c7a8b2665327225cf915b763540b9b0a89b819c11d9f"} Mar 12 17:02:51 crc kubenswrapper[5184]: I0312 17:02:51.954528 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-nmstate/nmstate-console-plugin-74686bb6b4-ljbdg" Mar 12 17:02:51 crc kubenswrapper[5184]: I0312 17:02:51.956795 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" event={"ID":"7b45c859-3d05-4214-9bd3-2952546f5dea","Type":"ContainerStarted","Data":"3dd884d50ac06fbc873c6bc95140222a52ea0e09ed17b766f377daf94c2607fe"} Mar 12 17:02:51 crc kubenswrapper[5184]: I0312 17:02:51.974574 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-288w4" podStartSLOduration=2.207296425 podStartE2EDuration="5.974554703s" podCreationTimestamp="2026-03-12 17:02:46 +0000 UTC" firstStartedPulling="2026-03-12 17:02:47.379962842 +0000 UTC m=+709.921274181" lastFinishedPulling="2026-03-12 17:02:51.14722112 +0000 UTC m=+713.688532459" observedRunningTime="2026-03-12 17:02:51.969052198 +0000 UTC m=+714.510363547" watchObservedRunningTime="2026-03-12 17:02:51.974554703 +0000 UTC m=+714.515866052" Mar 12 17:02:51 crc 
kubenswrapper[5184]: I0312 17:02:51.987698 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-78fdd78d8b-fxzg8" podStartSLOduration=2.985566594 podStartE2EDuration="5.987675368s" podCreationTimestamp="2026-03-12 17:02:46 +0000 UTC" firstStartedPulling="2026-03-12 17:02:48.13830521 +0000 UTC m=+710.679616559" lastFinishedPulling="2026-03-12 17:02:51.140413964 +0000 UTC m=+713.681725333" observedRunningTime="2026-03-12 17:02:51.983160725 +0000 UTC m=+714.524472074" watchObservedRunningTime="2026-03-12 17:02:51.987675368 +0000 UTC m=+714.528986707"
Mar 12 17:02:52 crc kubenswrapper[5184]: I0312 17:02:52.018230 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-74686bb6b4-ljbdg" podStartSLOduration=2.1058455130000002 podStartE2EDuration="5.018186784s" podCreationTimestamp="2026-03-12 17:02:47 +0000 UTC" firstStartedPulling="2026-03-12 17:02:48.22767344 +0000 UTC m=+710.768984779" lastFinishedPulling="2026-03-12 17:02:51.140014711 +0000 UTC m=+713.681326050" observedRunningTime="2026-03-12 17:02:52.01741486 +0000 UTC m=+714.558726199" watchObservedRunningTime="2026-03-12 17:02:52.018186784 +0000 UTC m=+714.559498113"
Mar 12 17:02:53 crc kubenswrapper[5184]: I0312 17:02:53.975194 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f888697b-qtghx" event={"ID":"0aa67d09-261d-4bd2-8341-b81d6d2a3caa","Type":"ContainerStarted","Data":"7c332d961985afcff0505643fb898dc1a8e4eddf60a7919612f0a85d3dfc79c2"}
Mar 12 17:02:53 crc kubenswrapper[5184]: I0312 17:02:53.999815 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f888697b-qtghx" podStartSLOduration=1.9710821539999999 podStartE2EDuration="7.999782167s" podCreationTimestamp="2026-03-12 17:02:46 +0000 UTC" firstStartedPulling="2026-03-12 17:02:47.800028876 +0000 UTC m=+710.341340215" lastFinishedPulling="2026-03-12 17:02:53.828728889 +0000 UTC m=+716.370040228" observedRunningTime="2026-03-12 17:02:53.999516028 +0000 UTC m=+716.540827427" watchObservedRunningTime="2026-03-12 17:02:53.999782167 +0000 UTC m=+716.541093506"
Mar 12 17:02:54 crc kubenswrapper[5184]: I0312 17:02:54.995765 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-nmstate/nmstate-metrics-7f888697b-qtghx"
Mar 12 17:02:57 crc kubenswrapper[5184]: I0312 17:02:57.609763 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-648f7d9d85-w2xjs"
Mar 12 17:02:57 crc kubenswrapper[5184]: I0312 17:02:57.610587 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-648f7d9d85-w2xjs"
Mar 12 17:02:57 crc kubenswrapper[5184]: I0312 17:02:57.617826 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-648f7d9d85-w2xjs"
Mar 12 17:02:58 crc kubenswrapper[5184]: I0312 17:02:58.018072 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-648f7d9d85-w2xjs"
Mar 12 17:02:58 crc kubenswrapper[5184]: I0312 17:02:58.074007 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-64d44f6ddf-qxthf"]
Mar 12 17:02:58 crc kubenswrapper[5184]: I0312 17:02:58.859161 5184 scope.go:117] "RemoveContainer" containerID="092c63754a0d67a0781c9e74e1b6fbe0c12d4a215facb90bd7a50cd1b778b0e4"
Mar 12 17:02:58 crc kubenswrapper[5184]: I0312 17:02:58.961364 5184 scope.go:117] "RemoveContainer" containerID="e32c6befe877e79950bc18dd6b9d090f73d6591d5932b62afd6e308cc060228f"
Mar 12 17:02:59 crc kubenswrapper[5184]: I0312 17:02:59.005509 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-288w4"
Mar 12 17:03:02 crc kubenswrapper[5184]: I0312 17:03:02.286091 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cpc7r"]
Mar 12 17:03:02 crc kubenswrapper[5184]: I0312 17:03:02.295702 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cpc7r"
Mar 12 17:03:02 crc kubenswrapper[5184]: I0312 17:03:02.306963 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cpc7r"]
Mar 12 17:03:02 crc kubenswrapper[5184]: I0312 17:03:02.436625 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2afa2256-21d3-4cba-881a-9a8414c36ea6-utilities\") pod \"redhat-operators-cpc7r\" (UID: \"2afa2256-21d3-4cba-881a-9a8414c36ea6\") " pod="openshift-marketplace/redhat-operators-cpc7r"
Mar 12 17:03:02 crc kubenswrapper[5184]: I0312 17:03:02.437181 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2afa2256-21d3-4cba-881a-9a8414c36ea6-catalog-content\") pod \"redhat-operators-cpc7r\" (UID: \"2afa2256-21d3-4cba-881a-9a8414c36ea6\") " pod="openshift-marketplace/redhat-operators-cpc7r"
Mar 12 17:03:02 crc kubenswrapper[5184]: I0312 17:03:02.437236 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn687\" (UniqueName: \"kubernetes.io/projected/2afa2256-21d3-4cba-881a-9a8414c36ea6-kube-api-access-wn687\") pod \"redhat-operators-cpc7r\" (UID: \"2afa2256-21d3-4cba-881a-9a8414c36ea6\") " pod="openshift-marketplace/redhat-operators-cpc7r"
Mar 12 17:03:02 crc kubenswrapper[5184]: I0312 17:03:02.538904 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2afa2256-21d3-4cba-881a-9a8414c36ea6-utilities\") pod \"redhat-operators-cpc7r\" (UID: \"2afa2256-21d3-4cba-881a-9a8414c36ea6\") " pod="openshift-marketplace/redhat-operators-cpc7r"
Mar 12 17:03:02 crc kubenswrapper[5184]: I0312 17:03:02.538962 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2afa2256-21d3-4cba-881a-9a8414c36ea6-catalog-content\") pod \"redhat-operators-cpc7r\" (UID: \"2afa2256-21d3-4cba-881a-9a8414c36ea6\") " pod="openshift-marketplace/redhat-operators-cpc7r"
Mar 12 17:03:02 crc kubenswrapper[5184]: I0312 17:03:02.538993 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wn687\" (UniqueName: \"kubernetes.io/projected/2afa2256-21d3-4cba-881a-9a8414c36ea6-kube-api-access-wn687\") pod \"redhat-operators-cpc7r\" (UID: \"2afa2256-21d3-4cba-881a-9a8414c36ea6\") " pod="openshift-marketplace/redhat-operators-cpc7r"
Mar 12 17:03:02 crc kubenswrapper[5184]: I0312 17:03:02.539627 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2afa2256-21d3-4cba-881a-9a8414c36ea6-utilities\") pod \"redhat-operators-cpc7r\" (UID: \"2afa2256-21d3-4cba-881a-9a8414c36ea6\") " pod="openshift-marketplace/redhat-operators-cpc7r"
Mar 12 17:03:02 crc kubenswrapper[5184]: I0312 17:03:02.539654 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2afa2256-21d3-4cba-881a-9a8414c36ea6-catalog-content\") pod \"redhat-operators-cpc7r\" (UID: \"2afa2256-21d3-4cba-881a-9a8414c36ea6\") " pod="openshift-marketplace/redhat-operators-cpc7r"
Mar 12 17:03:02 crc kubenswrapper[5184]: I0312 17:03:02.566840 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn687\" (UniqueName: \"kubernetes.io/projected/2afa2256-21d3-4cba-881a-9a8414c36ea6-kube-api-access-wn687\") pod \"redhat-operators-cpc7r\" (UID: \"2afa2256-21d3-4cba-881a-9a8414c36ea6\") " pod="openshift-marketplace/redhat-operators-cpc7r"
Mar 12 17:03:02 crc kubenswrapper[5184]: I0312 17:03:02.625707 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cpc7r"
Mar 12 17:03:02 crc kubenswrapper[5184]: I0312 17:03:02.842726 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cpc7r"]
Mar 12 17:03:02 crc kubenswrapper[5184]: I0312 17:03:02.969706 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-console-plugin-74686bb6b4-ljbdg"
Mar 12 17:03:02 crc kubenswrapper[5184]: I0312 17:03:02.973198 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-78fdd78d8b-fxzg8"
Mar 12 17:03:03 crc kubenswrapper[5184]: I0312 17:03:03.049302 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cpc7r" event={"ID":"2afa2256-21d3-4cba-881a-9a8414c36ea6","Type":"ContainerStarted","Data":"53cf7e548db204ea230473ecc2904114c8a0253719a4df853329b51296ec73e6"}
Mar 12 17:03:03 crc kubenswrapper[5184]: I0312 17:03:03.049358 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cpc7r" event={"ID":"2afa2256-21d3-4cba-881a-9a8414c36ea6","Type":"ContainerStarted","Data":"c8ee88a66ecf895f84e1276f08c2845ae00930a07cdbf8e6e634402619747671"}
Mar 12 17:03:04 crc kubenswrapper[5184]: I0312 17:03:04.064573 5184 generic.go:358] "Generic (PLEG): container finished" podID="2afa2256-21d3-4cba-881a-9a8414c36ea6" containerID="53cf7e548db204ea230473ecc2904114c8a0253719a4df853329b51296ec73e6" exitCode=0
Mar 12 17:03:04 crc kubenswrapper[5184]: I0312 17:03:04.064726 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cpc7r" event={"ID":"2afa2256-21d3-4cba-881a-9a8414c36ea6","Type":"ContainerDied","Data":"53cf7e548db204ea230473ecc2904114c8a0253719a4df853329b51296ec73e6"}
Mar 12 17:03:06 crc kubenswrapper[5184]: I0312 17:03:06.003251 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-metrics-7f888697b-qtghx"
Mar 12 17:03:06 crc kubenswrapper[5184]: I0312 17:03:06.103132 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cpc7r" event={"ID":"2afa2256-21d3-4cba-881a-9a8414c36ea6","Type":"ContainerStarted","Data":"8737a280288beafd7e14ad7a6d25a4244d7edb05c04ec49b22c6f360ac96412e"}
Mar 12 17:03:07 crc kubenswrapper[5184]: I0312 17:03:07.112034 5184 generic.go:358] "Generic (PLEG): container finished" podID="2afa2256-21d3-4cba-881a-9a8414c36ea6" containerID="8737a280288beafd7e14ad7a6d25a4244d7edb05c04ec49b22c6f360ac96412e" exitCode=0
Mar 12 17:03:07 crc kubenswrapper[5184]: I0312 17:03:07.112099 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cpc7r" event={"ID":"2afa2256-21d3-4cba-881a-9a8414c36ea6","Type":"ContainerDied","Data":"8737a280288beafd7e14ad7a6d25a4244d7edb05c04ec49b22c6f360ac96412e"}
Mar 12 17:03:08 crc kubenswrapper[5184]: I0312 17:03:08.125716 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cpc7r" event={"ID":"2afa2256-21d3-4cba-881a-9a8414c36ea6","Type":"ContainerStarted","Data":"2151494986f0ddfbdc8799cf8c2b2861d56a81ef68ba10b7bee1a92c328abe8c"}
Mar 12 17:03:08 crc kubenswrapper[5184]: I0312 17:03:08.146111 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cpc7r" podStartSLOduration=4.530315375 podStartE2EDuration="6.14608523s" podCreationTimestamp="2026-03-12 17:03:02 +0000 UTC" firstStartedPulling="2026-03-12 17:03:04.066239772 +0000 UTC m=+726.607551151" lastFinishedPulling="2026-03-12 17:03:05.682009667 +0000 UTC m=+728.223321006" observedRunningTime="2026-03-12 17:03:08.145123829 +0000 UTC m=+730.686435188" watchObservedRunningTime="2026-03-12 17:03:08.14608523 +0000 UTC m=+730.687396569"
Mar 12 17:03:12 crc kubenswrapper[5184]: I0312 17:03:12.626847 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cpc7r"
Mar 12 17:03:12 crc kubenswrapper[5184]: I0312 17:03:12.628169 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-operators-cpc7r"
Mar 12 17:03:13 crc kubenswrapper[5184]: I0312 17:03:13.694499 5184 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cpc7r" podUID="2afa2256-21d3-4cba-881a-9a8414c36ea6" containerName="registry-server" probeResult="failure" output=<
Mar 12 17:03:13 crc kubenswrapper[5184]: timeout: failed to connect service ":50051" within 1s
Mar 12 17:03:13 crc kubenswrapper[5184]: >
Mar 12 17:03:21 crc kubenswrapper[5184]: I0312 17:03:21.932103 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5f0d783fea323979717ec4113d968dc42dcfaeaad7ccdc94e6a93c9e91mcl8r"]
Mar 12 17:03:21 crc kubenswrapper[5184]: I0312 17:03:21.954586 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5f0d783fea323979717ec4113d968dc42dcfaeaad7ccdc94e6a93c9e91mcl8r"]
Mar 12 17:03:21 crc kubenswrapper[5184]: I0312 17:03:21.954756 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5f0d783fea323979717ec4113d968dc42dcfaeaad7ccdc94e6a93c9e91mcl8r"
Mar 12 17:03:21 crc kubenswrapper[5184]: I0312 17:03:21.959061 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-b2ccr\""
Mar 12 17:03:22 crc kubenswrapper[5184]: I0312 17:03:22.027523 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6rwg\" (UniqueName: \"kubernetes.io/projected/8cabbcda-f15b-4907-a210-ec5722d93f79-kube-api-access-m6rwg\") pod \"5f0d783fea323979717ec4113d968dc42dcfaeaad7ccdc94e6a93c9e91mcl8r\" (UID: \"8cabbcda-f15b-4907-a210-ec5722d93f79\") " pod="openshift-marketplace/5f0d783fea323979717ec4113d968dc42dcfaeaad7ccdc94e6a93c9e91mcl8r"
Mar 12 17:03:22 crc kubenswrapper[5184]: I0312 17:03:22.027668 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8cabbcda-f15b-4907-a210-ec5722d93f79-bundle\") pod \"5f0d783fea323979717ec4113d968dc42dcfaeaad7ccdc94e6a93c9e91mcl8r\" (UID: \"8cabbcda-f15b-4907-a210-ec5722d93f79\") " pod="openshift-marketplace/5f0d783fea323979717ec4113d968dc42dcfaeaad7ccdc94e6a93c9e91mcl8r"
Mar 12 17:03:22 crc kubenswrapper[5184]: I0312 17:03:22.027725 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8cabbcda-f15b-4907-a210-ec5722d93f79-util\") pod \"5f0d783fea323979717ec4113d968dc42dcfaeaad7ccdc94e6a93c9e91mcl8r\" (UID: \"8cabbcda-f15b-4907-a210-ec5722d93f79\") " pod="openshift-marketplace/5f0d783fea323979717ec4113d968dc42dcfaeaad7ccdc94e6a93c9e91mcl8r"
Mar 12 17:03:22 crc kubenswrapper[5184]: I0312 17:03:22.129663 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m6rwg\" (UniqueName: \"kubernetes.io/projected/8cabbcda-f15b-4907-a210-ec5722d93f79-kube-api-access-m6rwg\") pod \"5f0d783fea323979717ec4113d968dc42dcfaeaad7ccdc94e6a93c9e91mcl8r\" (UID: \"8cabbcda-f15b-4907-a210-ec5722d93f79\") " pod="openshift-marketplace/5f0d783fea323979717ec4113d968dc42dcfaeaad7ccdc94e6a93c9e91mcl8r"
Mar 12 17:03:22 crc kubenswrapper[5184]: I0312 17:03:22.129732 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8cabbcda-f15b-4907-a210-ec5722d93f79-bundle\") pod \"5f0d783fea323979717ec4113d968dc42dcfaeaad7ccdc94e6a93c9e91mcl8r\" (UID: \"8cabbcda-f15b-4907-a210-ec5722d93f79\") " pod="openshift-marketplace/5f0d783fea323979717ec4113d968dc42dcfaeaad7ccdc94e6a93c9e91mcl8r"
Mar 12 17:03:22 crc kubenswrapper[5184]: I0312 17:03:22.129770 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8cabbcda-f15b-4907-a210-ec5722d93f79-util\") pod \"5f0d783fea323979717ec4113d968dc42dcfaeaad7ccdc94e6a93c9e91mcl8r\" (UID: \"8cabbcda-f15b-4907-a210-ec5722d93f79\") " pod="openshift-marketplace/5f0d783fea323979717ec4113d968dc42dcfaeaad7ccdc94e6a93c9e91mcl8r"
Mar 12 17:03:22 crc kubenswrapper[5184]: I0312 17:03:22.130175 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8cabbcda-f15b-4907-a210-ec5722d93f79-bundle\") pod \"5f0d783fea323979717ec4113d968dc42dcfaeaad7ccdc94e6a93c9e91mcl8r\" (UID: \"8cabbcda-f15b-4907-a210-ec5722d93f79\") " pod="openshift-marketplace/5f0d783fea323979717ec4113d968dc42dcfaeaad7ccdc94e6a93c9e91mcl8r"
Mar 12 17:03:22 crc kubenswrapper[5184]: I0312 17:03:22.130251 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8cabbcda-f15b-4907-a210-ec5722d93f79-util\") pod \"5f0d783fea323979717ec4113d968dc42dcfaeaad7ccdc94e6a93c9e91mcl8r\" (UID: \"8cabbcda-f15b-4907-a210-ec5722d93f79\") " pod="openshift-marketplace/5f0d783fea323979717ec4113d968dc42dcfaeaad7ccdc94e6a93c9e91mcl8r"
Mar 12 17:03:22 crc kubenswrapper[5184]: I0312 17:03:22.157889 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6rwg\" (UniqueName: \"kubernetes.io/projected/8cabbcda-f15b-4907-a210-ec5722d93f79-kube-api-access-m6rwg\") pod \"5f0d783fea323979717ec4113d968dc42dcfaeaad7ccdc94e6a93c9e91mcl8r\" (UID: \"8cabbcda-f15b-4907-a210-ec5722d93f79\") " pod="openshift-marketplace/5f0d783fea323979717ec4113d968dc42dcfaeaad7ccdc94e6a93c9e91mcl8r"
Mar 12 17:03:22 crc kubenswrapper[5184]: I0312 17:03:22.273630 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5f0d783fea323979717ec4113d968dc42dcfaeaad7ccdc94e6a93c9e91mcl8r"
Mar 12 17:03:22 crc kubenswrapper[5184]: I0312 17:03:22.546883 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5f0d783fea323979717ec4113d968dc42dcfaeaad7ccdc94e6a93c9e91mcl8r"]
Mar 12 17:03:22 crc kubenswrapper[5184]: W0312 17:03:22.552054 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cabbcda_f15b_4907_a210_ec5722d93f79.slice/crio-0417c9bb69ce04437b596378320d42c0119d746277c781ffeb6005b15a38df00 WatchSource:0}: Error finding container 0417c9bb69ce04437b596378320d42c0119d746277c781ffeb6005b15a38df00: Status 404 returned error can't find the container with id 0417c9bb69ce04437b596378320d42c0119d746277c781ffeb6005b15a38df00
Mar 12 17:03:22 crc kubenswrapper[5184]: I0312 17:03:22.690983 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cpc7r"
Mar 12 17:03:22 crc kubenswrapper[5184]: I0312 17:03:22.739195 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cpc7r"
Mar 12 17:03:23 crc kubenswrapper[5184]: I0312 17:03:23.138803 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-console/console-64d44f6ddf-qxthf" podUID="c42a2703-d32e-41a7-accf-68b6e5d8c000" containerName="console" containerID="cri-o://86b97c4075df4c157b75f64f46962f38af79bd66cd6d9bd3ec8b76413cb4c932" gracePeriod=15
Mar 12 17:03:23 crc kubenswrapper[5184]: I0312 17:03:23.252351 5184 generic.go:358] "Generic (PLEG): container finished" podID="8cabbcda-f15b-4907-a210-ec5722d93f79" containerID="147bf52c50a2927bd154667e5de08c1686285032bc73deb3c62403de2e4fb0b4" exitCode=0
Mar 12 17:03:23 crc kubenswrapper[5184]: I0312 17:03:23.252756 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5f0d783fea323979717ec4113d968dc42dcfaeaad7ccdc94e6a93c9e91mcl8r" event={"ID":"8cabbcda-f15b-4907-a210-ec5722d93f79","Type":"ContainerDied","Data":"147bf52c50a2927bd154667e5de08c1686285032bc73deb3c62403de2e4fb0b4"}
Mar 12 17:03:23 crc kubenswrapper[5184]: I0312 17:03:23.252841 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5f0d783fea323979717ec4113d968dc42dcfaeaad7ccdc94e6a93c9e91mcl8r" event={"ID":"8cabbcda-f15b-4907-a210-ec5722d93f79","Type":"ContainerStarted","Data":"0417c9bb69ce04437b596378320d42c0119d746277c781ffeb6005b15a38df00"}
Mar 12 17:03:23 crc kubenswrapper[5184]: I0312 17:03:23.608835 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-64d44f6ddf-qxthf_c42a2703-d32e-41a7-accf-68b6e5d8c000/console/0.log"
Mar 12 17:03:23 crc kubenswrapper[5184]: I0312 17:03:23.608930 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64d44f6ddf-qxthf"
Mar 12 17:03:23 crc kubenswrapper[5184]: I0312 17:03:23.755871 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c42a2703-d32e-41a7-accf-68b6e5d8c000-service-ca\") pod \"c42a2703-d32e-41a7-accf-68b6e5d8c000\" (UID: \"c42a2703-d32e-41a7-accf-68b6e5d8c000\") "
Mar 12 17:03:23 crc kubenswrapper[5184]: I0312 17:03:23.755950 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xwwk\" (UniqueName: \"kubernetes.io/projected/c42a2703-d32e-41a7-accf-68b6e5d8c000-kube-api-access-8xwwk\") pod \"c42a2703-d32e-41a7-accf-68b6e5d8c000\" (UID: \"c42a2703-d32e-41a7-accf-68b6e5d8c000\") "
Mar 12 17:03:23 crc kubenswrapper[5184]: I0312 17:03:23.756026 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c42a2703-d32e-41a7-accf-68b6e5d8c000-console-config\") pod \"c42a2703-d32e-41a7-accf-68b6e5d8c000\" (UID: \"c42a2703-d32e-41a7-accf-68b6e5d8c000\") "
Mar 12 17:03:23 crc kubenswrapper[5184]: I0312 17:03:23.756084 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c42a2703-d32e-41a7-accf-68b6e5d8c000-oauth-serving-cert\") pod \"c42a2703-d32e-41a7-accf-68b6e5d8c000\" (UID: \"c42a2703-d32e-41a7-accf-68b6e5d8c000\") "
Mar 12 17:03:23 crc kubenswrapper[5184]: I0312 17:03:23.756261 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c42a2703-d32e-41a7-accf-68b6e5d8c000-trusted-ca-bundle\") pod \"c42a2703-d32e-41a7-accf-68b6e5d8c000\" (UID: \"c42a2703-d32e-41a7-accf-68b6e5d8c000\") "
Mar 12 17:03:23 crc kubenswrapper[5184]: I0312 17:03:23.756324 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c42a2703-d32e-41a7-accf-68b6e5d8c000-console-oauth-config\") pod \"c42a2703-d32e-41a7-accf-68b6e5d8c000\" (UID: \"c42a2703-d32e-41a7-accf-68b6e5d8c000\") "
Mar 12 17:03:23 crc kubenswrapper[5184]: I0312 17:03:23.756410 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c42a2703-d32e-41a7-accf-68b6e5d8c000-console-serving-cert\") pod \"c42a2703-d32e-41a7-accf-68b6e5d8c000\" (UID: \"c42a2703-d32e-41a7-accf-68b6e5d8c000\") "
Mar 12 17:03:23 crc kubenswrapper[5184]: I0312 17:03:23.757246 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c42a2703-d32e-41a7-accf-68b6e5d8c000-service-ca" (OuterVolumeSpecName: "service-ca") pod "c42a2703-d32e-41a7-accf-68b6e5d8c000" (UID: "c42a2703-d32e-41a7-accf-68b6e5d8c000"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 17:03:23 crc kubenswrapper[5184]: I0312 17:03:23.757314 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c42a2703-d32e-41a7-accf-68b6e5d8c000-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c42a2703-d32e-41a7-accf-68b6e5d8c000" (UID: "c42a2703-d32e-41a7-accf-68b6e5d8c000"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 17:03:23 crc kubenswrapper[5184]: I0312 17:03:23.757447 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c42a2703-d32e-41a7-accf-68b6e5d8c000-console-config" (OuterVolumeSpecName: "console-config") pod "c42a2703-d32e-41a7-accf-68b6e5d8c000" (UID: "c42a2703-d32e-41a7-accf-68b6e5d8c000"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 17:03:23 crc kubenswrapper[5184]: I0312 17:03:23.757557 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c42a2703-d32e-41a7-accf-68b6e5d8c000-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c42a2703-d32e-41a7-accf-68b6e5d8c000" (UID: "c42a2703-d32e-41a7-accf-68b6e5d8c000"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 17:03:23 crc kubenswrapper[5184]: I0312 17:03:23.766309 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c42a2703-d32e-41a7-accf-68b6e5d8c000-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c42a2703-d32e-41a7-accf-68b6e5d8c000" (UID: "c42a2703-d32e-41a7-accf-68b6e5d8c000"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 17:03:23 crc kubenswrapper[5184]: I0312 17:03:23.766897 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c42a2703-d32e-41a7-accf-68b6e5d8c000-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c42a2703-d32e-41a7-accf-68b6e5d8c000" (UID: "c42a2703-d32e-41a7-accf-68b6e5d8c000"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 17:03:23 crc kubenswrapper[5184]: I0312 17:03:23.767341 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c42a2703-d32e-41a7-accf-68b6e5d8c000-kube-api-access-8xwwk" (OuterVolumeSpecName: "kube-api-access-8xwwk") pod "c42a2703-d32e-41a7-accf-68b6e5d8c000" (UID: "c42a2703-d32e-41a7-accf-68b6e5d8c000"). InnerVolumeSpecName "kube-api-access-8xwwk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:03:23 crc kubenswrapper[5184]: I0312 17:03:23.857794 5184 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c42a2703-d32e-41a7-accf-68b6e5d8c000-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 17:03:23 crc kubenswrapper[5184]: I0312 17:03:23.857853 5184 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c42a2703-d32e-41a7-accf-68b6e5d8c000-console-oauth-config\") on node \"crc\" DevicePath \"\""
Mar 12 17:03:23 crc kubenswrapper[5184]: I0312 17:03:23.857901 5184 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c42a2703-d32e-41a7-accf-68b6e5d8c000-console-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 12 17:03:23 crc kubenswrapper[5184]: I0312 17:03:23.857918 5184 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c42a2703-d32e-41a7-accf-68b6e5d8c000-service-ca\") on node \"crc\" DevicePath \"\""
Mar 12 17:03:23 crc kubenswrapper[5184]: I0312 17:03:23.857936 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8xwwk\" (UniqueName: \"kubernetes.io/projected/c42a2703-d32e-41a7-accf-68b6e5d8c000-kube-api-access-8xwwk\") on node \"crc\" DevicePath \"\""
Mar 12 17:03:23 crc kubenswrapper[5184]: I0312 17:03:23.857953 5184 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c42a2703-d32e-41a7-accf-68b6e5d8c000-console-config\") on node \"crc\" DevicePath \"\""
Mar 12 17:03:23 crc kubenswrapper[5184]: I0312 17:03:23.857970 5184 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c42a2703-d32e-41a7-accf-68b6e5d8c000-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 12 17:03:24 crc kubenswrapper[5184]: I0312 17:03:24.252540 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cpc7r"]
Mar 12 17:03:24 crc kubenswrapper[5184]: I0312 17:03:24.263842 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-64d44f6ddf-qxthf_c42a2703-d32e-41a7-accf-68b6e5d8c000/console/0.log"
Mar 12 17:03:24 crc kubenswrapper[5184]: I0312 17:03:24.263920 5184 generic.go:358] "Generic (PLEG): container finished" podID="c42a2703-d32e-41a7-accf-68b6e5d8c000" containerID="86b97c4075df4c157b75f64f46962f38af79bd66cd6d9bd3ec8b76413cb4c932" exitCode=2
Mar 12 17:03:24 crc kubenswrapper[5184]: I0312 17:03:24.264062 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64d44f6ddf-qxthf"
Mar 12 17:03:24 crc kubenswrapper[5184]: I0312 17:03:24.264124 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64d44f6ddf-qxthf" event={"ID":"c42a2703-d32e-41a7-accf-68b6e5d8c000","Type":"ContainerDied","Data":"86b97c4075df4c157b75f64f46962f38af79bd66cd6d9bd3ec8b76413cb4c932"}
Mar 12 17:03:24 crc kubenswrapper[5184]: I0312 17:03:24.264216 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64d44f6ddf-qxthf" event={"ID":"c42a2703-d32e-41a7-accf-68b6e5d8c000","Type":"ContainerDied","Data":"6a5e6e0a6ae46b3e434b251459a3375973490cec39b8820189bfa74a95465a90"}
Mar 12 17:03:24 crc kubenswrapper[5184]: I0312 17:03:24.264256 5184 scope.go:117] "RemoveContainer" containerID="86b97c4075df4c157b75f64f46962f38af79bd66cd6d9bd3ec8b76413cb4c932"
Mar 12 17:03:24 crc kubenswrapper[5184]: I0312 17:03:24.264833 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cpc7r" podUID="2afa2256-21d3-4cba-881a-9a8414c36ea6" containerName="registry-server" containerID="cri-o://2151494986f0ddfbdc8799cf8c2b2861d56a81ef68ba10b7bee1a92c328abe8c" gracePeriod=2
Mar 12 17:03:24 crc kubenswrapper[5184]: I0312 17:03:24.287033 5184 scope.go:117] "RemoveContainer" containerID="86b97c4075df4c157b75f64f46962f38af79bd66cd6d9bd3ec8b76413cb4c932"
Mar 12 17:03:24 crc kubenswrapper[5184]: E0312 17:03:24.287510 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86b97c4075df4c157b75f64f46962f38af79bd66cd6d9bd3ec8b76413cb4c932\": container with ID starting with 86b97c4075df4c157b75f64f46962f38af79bd66cd6d9bd3ec8b76413cb4c932 not found: ID does not exist" containerID="86b97c4075df4c157b75f64f46962f38af79bd66cd6d9bd3ec8b76413cb4c932"
Mar 12 17:03:24 crc kubenswrapper[5184]: I0312 17:03:24.287614 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86b97c4075df4c157b75f64f46962f38af79bd66cd6d9bd3ec8b76413cb4c932"} err="failed to get container status \"86b97c4075df4c157b75f64f46962f38af79bd66cd6d9bd3ec8b76413cb4c932\": rpc error: code = NotFound desc = could not find container \"86b97c4075df4c157b75f64f46962f38af79bd66cd6d9bd3ec8b76413cb4c932\": container with ID starting with 86b97c4075df4c157b75f64f46962f38af79bd66cd6d9bd3ec8b76413cb4c932 not found: ID does not exist"
Mar 12 17:03:24 crc kubenswrapper[5184]: I0312 17:03:24.317105 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-64d44f6ddf-qxthf"]
Mar 12 17:03:24 crc kubenswrapper[5184]: I0312 17:03:24.325160 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-64d44f6ddf-qxthf"]
Mar 12 17:03:24 crc kubenswrapper[5184]: I0312 17:03:24.407439 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c42a2703-d32e-41a7-accf-68b6e5d8c000" path="/var/lib/kubelet/pods/c42a2703-d32e-41a7-accf-68b6e5d8c000/volumes"
Mar 12 17:03:24 crc kubenswrapper[5184]: I0312 17:03:24.701696 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cpc7r"
Mar 12 17:03:24 crc kubenswrapper[5184]: I0312 17:03:24.770060 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2afa2256-21d3-4cba-881a-9a8414c36ea6-catalog-content\") pod \"2afa2256-21d3-4cba-881a-9a8414c36ea6\" (UID: \"2afa2256-21d3-4cba-881a-9a8414c36ea6\") "
Mar 12 17:03:24 crc kubenswrapper[5184]: I0312 17:03:24.770152 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2afa2256-21d3-4cba-881a-9a8414c36ea6-utilities\") pod \"2afa2256-21d3-4cba-881a-9a8414c36ea6\" (UID: \"2afa2256-21d3-4cba-881a-9a8414c36ea6\") "
Mar 12 17:03:24 crc kubenswrapper[5184]: I0312 17:03:24.771523 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wn687\" (UniqueName: \"kubernetes.io/projected/2afa2256-21d3-4cba-881a-9a8414c36ea6-kube-api-access-wn687\") pod \"2afa2256-21d3-4cba-881a-9a8414c36ea6\" (UID: \"2afa2256-21d3-4cba-881a-9a8414c36ea6\") "
Mar 12 17:03:24 crc kubenswrapper[5184]: I0312 17:03:24.771832 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2afa2256-21d3-4cba-881a-9a8414c36ea6-utilities" (OuterVolumeSpecName: "utilities") pod "2afa2256-21d3-4cba-881a-9a8414c36ea6" (UID: "2afa2256-21d3-4cba-881a-9a8414c36ea6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 12 17:03:24 crc kubenswrapper[5184]: I0312 17:03:24.778174 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2afa2256-21d3-4cba-881a-9a8414c36ea6-kube-api-access-wn687" (OuterVolumeSpecName: "kube-api-access-wn687") pod "2afa2256-21d3-4cba-881a-9a8414c36ea6" (UID: "2afa2256-21d3-4cba-881a-9a8414c36ea6"). InnerVolumeSpecName "kube-api-access-wn687". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:03:24 crc kubenswrapper[5184]: I0312 17:03:24.873878 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wn687\" (UniqueName: \"kubernetes.io/projected/2afa2256-21d3-4cba-881a-9a8414c36ea6-kube-api-access-wn687\") on node \"crc\" DevicePath \"\""
Mar 12 17:03:24 crc kubenswrapper[5184]: I0312 17:03:24.873943 5184 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2afa2256-21d3-4cba-881a-9a8414c36ea6-utilities\") on node \"crc\" DevicePath \"\""
Mar 12 17:03:24 crc kubenswrapper[5184]: I0312 17:03:24.949397 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2afa2256-21d3-4cba-881a-9a8414c36ea6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2afa2256-21d3-4cba-881a-9a8414c36ea6" (UID: "2afa2256-21d3-4cba-881a-9a8414c36ea6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 12 17:03:24 crc kubenswrapper[5184]: I0312 17:03:24.977190 5184 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2afa2256-21d3-4cba-881a-9a8414c36ea6-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 12 17:03:25 crc kubenswrapper[5184]: I0312 17:03:25.272009 5184 generic.go:358] "Generic (PLEG): container finished" podID="2afa2256-21d3-4cba-881a-9a8414c36ea6" containerID="2151494986f0ddfbdc8799cf8c2b2861d56a81ef68ba10b7bee1a92c328abe8c" exitCode=0
Mar 12 17:03:25 crc kubenswrapper[5184]: I0312 17:03:25.272057 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cpc7r" event={"ID":"2afa2256-21d3-4cba-881a-9a8414c36ea6","Type":"ContainerDied","Data":"2151494986f0ddfbdc8799cf8c2b2861d56a81ef68ba10b7bee1a92c328abe8c"}
Mar 12 17:03:25 crc kubenswrapper[5184]: I0312 17:03:25.272110 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cpc7r" event={"ID":"2afa2256-21d3-4cba-881a-9a8414c36ea6","Type":"ContainerDied","Data":"c8ee88a66ecf895f84e1276f08c2845ae00930a07cdbf8e6e634402619747671"}
Mar 12 17:03:25 crc kubenswrapper[5184]: I0312 17:03:25.272115 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cpc7r"
Mar 12 17:03:25 crc kubenswrapper[5184]: I0312 17:03:25.272133 5184 scope.go:117] "RemoveContainer" containerID="2151494986f0ddfbdc8799cf8c2b2861d56a81ef68ba10b7bee1a92c328abe8c"
Mar 12 17:03:25 crc kubenswrapper[5184]: I0312 17:03:25.292684 5184 scope.go:117] "RemoveContainer" containerID="8737a280288beafd7e14ad7a6d25a4244d7edb05c04ec49b22c6f360ac96412e"
Mar 12 17:03:25 crc kubenswrapper[5184]: I0312 17:03:25.319747 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cpc7r"]
Mar 12 17:03:25 crc kubenswrapper[5184]: I0312 17:03:25.320082 5184 scope.go:117] "RemoveContainer" containerID="53cf7e548db204ea230473ecc2904114c8a0253719a4df853329b51296ec73e6"
Mar 12 17:03:25 crc kubenswrapper[5184]: I0312 17:03:25.325172 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cpc7r"]
Mar 12 17:03:25 crc kubenswrapper[5184]: I0312 17:03:25.343409 5184 scope.go:117] "RemoveContainer" containerID="2151494986f0ddfbdc8799cf8c2b2861d56a81ef68ba10b7bee1a92c328abe8c"
Mar 12 17:03:25 crc kubenswrapper[5184]: E0312 17:03:25.344034 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2151494986f0ddfbdc8799cf8c2b2861d56a81ef68ba10b7bee1a92c328abe8c\": container with ID starting with 2151494986f0ddfbdc8799cf8c2b2861d56a81ef68ba10b7bee1a92c328abe8c not found: ID does not exist" containerID="2151494986f0ddfbdc8799cf8c2b2861d56a81ef68ba10b7bee1a92c328abe8c"
Mar 12 17:03:25 crc kubenswrapper[5184]: I0312 17:03:25.344071 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2151494986f0ddfbdc8799cf8c2b2861d56a81ef68ba10b7bee1a92c328abe8c"} err="failed to get container status \"2151494986f0ddfbdc8799cf8c2b2861d56a81ef68ba10b7bee1a92c328abe8c\": rpc error: code = NotFound desc = could not find container \"2151494986f0ddfbdc8799cf8c2b2861d56a81ef68ba10b7bee1a92c328abe8c\": container with ID starting with 2151494986f0ddfbdc8799cf8c2b2861d56a81ef68ba10b7bee1a92c328abe8c not found: ID does not exist"
Mar 12 17:03:25 crc kubenswrapper[5184]: I0312 17:03:25.344096 5184 scope.go:117] "RemoveContainer" containerID="8737a280288beafd7e14ad7a6d25a4244d7edb05c04ec49b22c6f360ac96412e"
Mar 12 17:03:25 crc kubenswrapper[5184]: E0312 17:03:25.344662 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8737a280288beafd7e14ad7a6d25a4244d7edb05c04ec49b22c6f360ac96412e\": container with ID starting with 8737a280288beafd7e14ad7a6d25a4244d7edb05c04ec49b22c6f360ac96412e not found: ID does not exist" containerID="8737a280288beafd7e14ad7a6d25a4244d7edb05c04ec49b22c6f360ac96412e"
Mar 12 17:03:25 crc kubenswrapper[5184]: I0312 17:03:25.345085 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8737a280288beafd7e14ad7a6d25a4244d7edb05c04ec49b22c6f360ac96412e"} err="failed to get container status \"8737a280288beafd7e14ad7a6d25a4244d7edb05c04ec49b22c6f360ac96412e\": rpc error: code = NotFound desc = could not find container \"8737a280288beafd7e14ad7a6d25a4244d7edb05c04ec49b22c6f360ac96412e\": container with ID starting with 8737a280288beafd7e14ad7a6d25a4244d7edb05c04ec49b22c6f360ac96412e not found: ID does not exist"
Mar 12 17:03:25 crc kubenswrapper[5184]: I0312 17:03:25.345265 5184 scope.go:117] "RemoveContainer" containerID="53cf7e548db204ea230473ecc2904114c8a0253719a4df853329b51296ec73e6"
Mar 12 17:03:25 crc kubenswrapper[5184]: E0312
17:03:25.346029 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53cf7e548db204ea230473ecc2904114c8a0253719a4df853329b51296ec73e6\": container with ID starting with 53cf7e548db204ea230473ecc2904114c8a0253719a4df853329b51296ec73e6 not found: ID does not exist" containerID="53cf7e548db204ea230473ecc2904114c8a0253719a4df853329b51296ec73e6" Mar 12 17:03:25 crc kubenswrapper[5184]: I0312 17:03:25.346053 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53cf7e548db204ea230473ecc2904114c8a0253719a4df853329b51296ec73e6"} err="failed to get container status \"53cf7e548db204ea230473ecc2904114c8a0253719a4df853329b51296ec73e6\": rpc error: code = NotFound desc = could not find container \"53cf7e548db204ea230473ecc2904114c8a0253719a4df853329b51296ec73e6\": container with ID starting with 53cf7e548db204ea230473ecc2904114c8a0253719a4df853329b51296ec73e6 not found: ID does not exist" Mar 12 17:03:26 crc kubenswrapper[5184]: I0312 17:03:26.410795 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2afa2256-21d3-4cba-881a-9a8414c36ea6" path="/var/lib/kubelet/pods/2afa2256-21d3-4cba-881a-9a8414c36ea6/volumes" Mar 12 17:03:28 crc kubenswrapper[5184]: I0312 17:03:28.300185 5184 generic.go:358] "Generic (PLEG): container finished" podID="8cabbcda-f15b-4907-a210-ec5722d93f79" containerID="430b28d3762d96338e9025faa0aa8bfea2410850207e5d5f7f43915e0eb2d8a2" exitCode=0 Mar 12 17:03:28 crc kubenswrapper[5184]: I0312 17:03:28.300267 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5f0d783fea323979717ec4113d968dc42dcfaeaad7ccdc94e6a93c9e91mcl8r" event={"ID":"8cabbcda-f15b-4907-a210-ec5722d93f79","Type":"ContainerDied","Data":"430b28d3762d96338e9025faa0aa8bfea2410850207e5d5f7f43915e0eb2d8a2"} Mar 12 17:03:29 crc kubenswrapper[5184]: I0312 17:03:29.311428 5184 generic.go:358] "Generic (PLEG): container finished" 
podID="8cabbcda-f15b-4907-a210-ec5722d93f79" containerID="01ffcbac271fc2d309c295cd05fd923b4da9d0b1839f554a9e14448207acf40b" exitCode=0 Mar 12 17:03:29 crc kubenswrapper[5184]: I0312 17:03:29.311585 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5f0d783fea323979717ec4113d968dc42dcfaeaad7ccdc94e6a93c9e91mcl8r" event={"ID":"8cabbcda-f15b-4907-a210-ec5722d93f79","Type":"ContainerDied","Data":"01ffcbac271fc2d309c295cd05fd923b4da9d0b1839f554a9e14448207acf40b"} Mar 12 17:03:30 crc kubenswrapper[5184]: I0312 17:03:30.631547 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5f0d783fea323979717ec4113d968dc42dcfaeaad7ccdc94e6a93c9e91mcl8r" Mar 12 17:03:30 crc kubenswrapper[5184]: I0312 17:03:30.754611 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8cabbcda-f15b-4907-a210-ec5722d93f79-util\") pod \"8cabbcda-f15b-4907-a210-ec5722d93f79\" (UID: \"8cabbcda-f15b-4907-a210-ec5722d93f79\") " Mar 12 17:03:30 crc kubenswrapper[5184]: I0312 17:03:30.754732 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6rwg\" (UniqueName: \"kubernetes.io/projected/8cabbcda-f15b-4907-a210-ec5722d93f79-kube-api-access-m6rwg\") pod \"8cabbcda-f15b-4907-a210-ec5722d93f79\" (UID: \"8cabbcda-f15b-4907-a210-ec5722d93f79\") " Mar 12 17:03:30 crc kubenswrapper[5184]: I0312 17:03:30.754884 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8cabbcda-f15b-4907-a210-ec5722d93f79-bundle\") pod \"8cabbcda-f15b-4907-a210-ec5722d93f79\" (UID: \"8cabbcda-f15b-4907-a210-ec5722d93f79\") " Mar 12 17:03:30 crc kubenswrapper[5184]: I0312 17:03:30.756356 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cabbcda-f15b-4907-a210-ec5722d93f79-bundle" 
(OuterVolumeSpecName: "bundle") pod "8cabbcda-f15b-4907-a210-ec5722d93f79" (UID: "8cabbcda-f15b-4907-a210-ec5722d93f79"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:03:30 crc kubenswrapper[5184]: I0312 17:03:30.763257 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cabbcda-f15b-4907-a210-ec5722d93f79-kube-api-access-m6rwg" (OuterVolumeSpecName: "kube-api-access-m6rwg") pod "8cabbcda-f15b-4907-a210-ec5722d93f79" (UID: "8cabbcda-f15b-4907-a210-ec5722d93f79"). InnerVolumeSpecName "kube-api-access-m6rwg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:03:30 crc kubenswrapper[5184]: I0312 17:03:30.780240 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cabbcda-f15b-4907-a210-ec5722d93f79-util" (OuterVolumeSpecName: "util") pod "8cabbcda-f15b-4907-a210-ec5722d93f79" (UID: "8cabbcda-f15b-4907-a210-ec5722d93f79"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:03:30 crc kubenswrapper[5184]: I0312 17:03:30.856311 5184 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8cabbcda-f15b-4907-a210-ec5722d93f79-util\") on node \"crc\" DevicePath \"\"" Mar 12 17:03:30 crc kubenswrapper[5184]: I0312 17:03:30.856427 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m6rwg\" (UniqueName: \"kubernetes.io/projected/8cabbcda-f15b-4907-a210-ec5722d93f79-kube-api-access-m6rwg\") on node \"crc\" DevicePath \"\"" Mar 12 17:03:30 crc kubenswrapper[5184]: I0312 17:03:30.856442 5184 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8cabbcda-f15b-4907-a210-ec5722d93f79-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 17:03:31 crc kubenswrapper[5184]: I0312 17:03:31.329814 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5f0d783fea323979717ec4113d968dc42dcfaeaad7ccdc94e6a93c9e91mcl8r" event={"ID":"8cabbcda-f15b-4907-a210-ec5722d93f79","Type":"ContainerDied","Data":"0417c9bb69ce04437b596378320d42c0119d746277c781ffeb6005b15a38df00"} Mar 12 17:03:31 crc kubenswrapper[5184]: I0312 17:03:31.329886 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0417c9bb69ce04437b596378320d42c0119d746277c781ffeb6005b15a38df00" Mar 12 17:03:31 crc kubenswrapper[5184]: I0312 17:03:31.329852 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5f0d783fea323979717ec4113d968dc42dcfaeaad7ccdc94e6a93c9e91mcl8r" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.174883 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-684cb6d9ff-8wmmx"] Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.176182 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8cabbcda-f15b-4907-a210-ec5722d93f79" containerName="extract" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.176199 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cabbcda-f15b-4907-a210-ec5722d93f79" containerName="extract" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.176218 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2afa2256-21d3-4cba-881a-9a8414c36ea6" containerName="registry-server" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.176225 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="2afa2256-21d3-4cba-881a-9a8414c36ea6" containerName="registry-server" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.176237 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2afa2256-21d3-4cba-881a-9a8414c36ea6" containerName="extract-content" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.176244 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="2afa2256-21d3-4cba-881a-9a8414c36ea6" containerName="extract-content" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.176258 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2afa2256-21d3-4cba-881a-9a8414c36ea6" containerName="extract-utilities" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.176265 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="2afa2256-21d3-4cba-881a-9a8414c36ea6" containerName="extract-utilities" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 
17:03:39.176281 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8cabbcda-f15b-4907-a210-ec5722d93f79" containerName="pull" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.176288 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cabbcda-f15b-4907-a210-ec5722d93f79" containerName="pull" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.176300 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8cabbcda-f15b-4907-a210-ec5722d93f79" containerName="util" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.176307 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cabbcda-f15b-4907-a210-ec5722d93f79" containerName="util" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.176364 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c42a2703-d32e-41a7-accf-68b6e5d8c000" containerName="console" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.176391 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="c42a2703-d32e-41a7-accf-68b6e5d8c000" containerName="console" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.176511 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="8cabbcda-f15b-4907-a210-ec5722d93f79" containerName="extract" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.176528 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="2afa2256-21d3-4cba-881a-9a8414c36ea6" containerName="registry-server" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.176539 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="c42a2703-d32e-41a7-accf-68b6e5d8c000" containerName="console" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.184054 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-684cb6d9ff-8wmmx" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.186250 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"metallb-system\"/\"kube-root-ca.crt\"" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.186458 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"metallb-system\"/\"metallb-operator-controller-manager-service-cert\"" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.186816 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"metallb-system\"/\"manager-account-dockercfg-jk4xs\"" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.187288 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"metallb-system\"/\"openshift-service-ca.crt\"" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.187498 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"metallb-system\"/\"metallb-operator-webhook-server-cert\"" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.197109 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-684cb6d9ff-8wmmx"] Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.266958 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6n88\" (UniqueName: \"kubernetes.io/projected/919cbf6b-6f23-47cb-897e-759c6ad20510-kube-api-access-t6n88\") pod \"metallb-operator-controller-manager-684cb6d9ff-8wmmx\" (UID: \"919cbf6b-6f23-47cb-897e-759c6ad20510\") " pod="metallb-system/metallb-operator-controller-manager-684cb6d9ff-8wmmx" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.267027 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/919cbf6b-6f23-47cb-897e-759c6ad20510-apiservice-cert\") pod \"metallb-operator-controller-manager-684cb6d9ff-8wmmx\" (UID: \"919cbf6b-6f23-47cb-897e-759c6ad20510\") " pod="metallb-system/metallb-operator-controller-manager-684cb6d9ff-8wmmx" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.267249 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/919cbf6b-6f23-47cb-897e-759c6ad20510-webhook-cert\") pod \"metallb-operator-controller-manager-684cb6d9ff-8wmmx\" (UID: \"919cbf6b-6f23-47cb-897e-759c6ad20510\") " pod="metallb-system/metallb-operator-controller-manager-684cb6d9ff-8wmmx" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.368662 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/919cbf6b-6f23-47cb-897e-759c6ad20510-webhook-cert\") pod \"metallb-operator-controller-manager-684cb6d9ff-8wmmx\" (UID: \"919cbf6b-6f23-47cb-897e-759c6ad20510\") " pod="metallb-system/metallb-operator-controller-manager-684cb6d9ff-8wmmx" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.368747 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t6n88\" (UniqueName: \"kubernetes.io/projected/919cbf6b-6f23-47cb-897e-759c6ad20510-kube-api-access-t6n88\") pod \"metallb-operator-controller-manager-684cb6d9ff-8wmmx\" (UID: \"919cbf6b-6f23-47cb-897e-759c6ad20510\") " pod="metallb-system/metallb-operator-controller-manager-684cb6d9ff-8wmmx" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.368811 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/919cbf6b-6f23-47cb-897e-759c6ad20510-apiservice-cert\") pod \"metallb-operator-controller-manager-684cb6d9ff-8wmmx\" (UID: \"919cbf6b-6f23-47cb-897e-759c6ad20510\") " 
pod="metallb-system/metallb-operator-controller-manager-684cb6d9ff-8wmmx" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.389024 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/919cbf6b-6f23-47cb-897e-759c6ad20510-webhook-cert\") pod \"metallb-operator-controller-manager-684cb6d9ff-8wmmx\" (UID: \"919cbf6b-6f23-47cb-897e-759c6ad20510\") " pod="metallb-system/metallb-operator-controller-manager-684cb6d9ff-8wmmx" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.389035 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/919cbf6b-6f23-47cb-897e-759c6ad20510-apiservice-cert\") pod \"metallb-operator-controller-manager-684cb6d9ff-8wmmx\" (UID: \"919cbf6b-6f23-47cb-897e-759c6ad20510\") " pod="metallb-system/metallb-operator-controller-manager-684cb6d9ff-8wmmx" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.390854 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6n88\" (UniqueName: \"kubernetes.io/projected/919cbf6b-6f23-47cb-897e-759c6ad20510-kube-api-access-t6n88\") pod \"metallb-operator-controller-manager-684cb6d9ff-8wmmx\" (UID: \"919cbf6b-6f23-47cb-897e-759c6ad20510\") " pod="metallb-system/metallb-operator-controller-manager-684cb6d9ff-8wmmx" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.498714 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-684cb6d9ff-8wmmx" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.510099 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-6ccc8dfc97-z8s4v"] Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.515787 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6ccc8dfc97-z8s4v" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.518552 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"metallb-system\"/\"metallb-webhook-cert\"" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.519019 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"metallb-system\"/\"metallb-operator-webhook-server-service-cert\"" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.520965 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"metallb-system\"/\"controller-dockercfg-5rl9t\"" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.541711 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6ccc8dfc97-z8s4v"] Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.672888 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q6nt\" (UniqueName: \"kubernetes.io/projected/693b4de7-38e7-4b99-8e08-85943be754aa-kube-api-access-4q6nt\") pod \"metallb-operator-webhook-server-6ccc8dfc97-z8s4v\" (UID: \"693b4de7-38e7-4b99-8e08-85943be754aa\") " pod="metallb-system/metallb-operator-webhook-server-6ccc8dfc97-z8s4v" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.677746 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/693b4de7-38e7-4b99-8e08-85943be754aa-webhook-cert\") pod \"metallb-operator-webhook-server-6ccc8dfc97-z8s4v\" (UID: \"693b4de7-38e7-4b99-8e08-85943be754aa\") " pod="metallb-system/metallb-operator-webhook-server-6ccc8dfc97-z8s4v" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.677782 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/693b4de7-38e7-4b99-8e08-85943be754aa-apiservice-cert\") pod \"metallb-operator-webhook-server-6ccc8dfc97-z8s4v\" (UID: \"693b4de7-38e7-4b99-8e08-85943be754aa\") " pod="metallb-system/metallb-operator-webhook-server-6ccc8dfc97-z8s4v" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.778921 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4q6nt\" (UniqueName: \"kubernetes.io/projected/693b4de7-38e7-4b99-8e08-85943be754aa-kube-api-access-4q6nt\") pod \"metallb-operator-webhook-server-6ccc8dfc97-z8s4v\" (UID: \"693b4de7-38e7-4b99-8e08-85943be754aa\") " pod="metallb-system/metallb-operator-webhook-server-6ccc8dfc97-z8s4v" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.779041 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/693b4de7-38e7-4b99-8e08-85943be754aa-webhook-cert\") pod \"metallb-operator-webhook-server-6ccc8dfc97-z8s4v\" (UID: \"693b4de7-38e7-4b99-8e08-85943be754aa\") " pod="metallb-system/metallb-operator-webhook-server-6ccc8dfc97-z8s4v" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.779073 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/693b4de7-38e7-4b99-8e08-85943be754aa-apiservice-cert\") pod \"metallb-operator-webhook-server-6ccc8dfc97-z8s4v\" (UID: \"693b4de7-38e7-4b99-8e08-85943be754aa\") " pod="metallb-system/metallb-operator-webhook-server-6ccc8dfc97-z8s4v" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.785995 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/693b4de7-38e7-4b99-8e08-85943be754aa-apiservice-cert\") pod \"metallb-operator-webhook-server-6ccc8dfc97-z8s4v\" (UID: \"693b4de7-38e7-4b99-8e08-85943be754aa\") " pod="metallb-system/metallb-operator-webhook-server-6ccc8dfc97-z8s4v" Mar 12 
17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.786000 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/693b4de7-38e7-4b99-8e08-85943be754aa-webhook-cert\") pod \"metallb-operator-webhook-server-6ccc8dfc97-z8s4v\" (UID: \"693b4de7-38e7-4b99-8e08-85943be754aa\") " pod="metallb-system/metallb-operator-webhook-server-6ccc8dfc97-z8s4v" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.794718 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q6nt\" (UniqueName: \"kubernetes.io/projected/693b4de7-38e7-4b99-8e08-85943be754aa-kube-api-access-4q6nt\") pod \"metallb-operator-webhook-server-6ccc8dfc97-z8s4v\" (UID: \"693b4de7-38e7-4b99-8e08-85943be754aa\") " pod="metallb-system/metallb-operator-webhook-server-6ccc8dfc97-z8s4v" Mar 12 17:03:39 crc kubenswrapper[5184]: I0312 17:03:39.910815 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6ccc8dfc97-z8s4v" Mar 12 17:03:40 crc kubenswrapper[5184]: I0312 17:03:40.012127 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-684cb6d9ff-8wmmx"] Mar 12 17:03:40 crc kubenswrapper[5184]: W0312 17:03:40.161669 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod693b4de7_38e7_4b99_8e08_85943be754aa.slice/crio-d721687ea2e7dd60ed5185f19d1a53b721f73b89ff1a7cae0ef1a6c466f55f80 WatchSource:0}: Error finding container d721687ea2e7dd60ed5185f19d1a53b721f73b89ff1a7cae0ef1a6c466f55f80: Status 404 returned error can't find the container with id d721687ea2e7dd60ed5185f19d1a53b721f73b89ff1a7cae0ef1a6c466f55f80 Mar 12 17:03:40 crc kubenswrapper[5184]: I0312 17:03:40.162360 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6ccc8dfc97-z8s4v"] Mar 12 17:03:40 
crc kubenswrapper[5184]: I0312 17:03:40.382656 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-684cb6d9ff-8wmmx" event={"ID":"919cbf6b-6f23-47cb-897e-759c6ad20510","Type":"ContainerStarted","Data":"2824a42c1306b18a78c04a168e9aba152fd22e92e5d745170a6f3738b5a47dd9"} Mar 12 17:03:40 crc kubenswrapper[5184]: I0312 17:03:40.383638 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6ccc8dfc97-z8s4v" event={"ID":"693b4de7-38e7-4b99-8e08-85943be754aa","Type":"ContainerStarted","Data":"d721687ea2e7dd60ed5185f19d1a53b721f73b89ff1a7cae0ef1a6c466f55f80"} Mar 12 17:03:46 crc kubenswrapper[5184]: I0312 17:03:46.423666 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-684cb6d9ff-8wmmx" event={"ID":"919cbf6b-6f23-47cb-897e-759c6ad20510","Type":"ContainerStarted","Data":"26a8a1dd5f890b552904bfb27a3c21454e0acaacbd802bdc58408b4883d90b21"} Mar 12 17:03:46 crc kubenswrapper[5184]: I0312 17:03:46.424509 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="metallb-system/metallb-operator-controller-manager-684cb6d9ff-8wmmx" Mar 12 17:03:46 crc kubenswrapper[5184]: I0312 17:03:46.427168 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6ccc8dfc97-z8s4v" event={"ID":"693b4de7-38e7-4b99-8e08-85943be754aa","Type":"ContainerStarted","Data":"332f4436d566b8ad49bb2c5617e85b8c7332f2cd501551a155edcd3aeb9dae1f"} Mar 12 17:03:46 crc kubenswrapper[5184]: I0312 17:03:46.427367 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="metallb-system/metallb-operator-webhook-server-6ccc8dfc97-z8s4v" Mar 12 17:03:46 crc kubenswrapper[5184]: I0312 17:03:46.451712 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-684cb6d9ff-8wmmx" 
podStartSLOduration=1.973011225 podStartE2EDuration="7.451682777s" podCreationTimestamp="2026-03-12 17:03:39 +0000 UTC" firstStartedPulling="2026-03-12 17:03:40.027766888 +0000 UTC m=+762.569078227" lastFinishedPulling="2026-03-12 17:03:45.50643844 +0000 UTC m=+768.047749779" observedRunningTime="2026-03-12 17:03:46.445731698 +0000 UTC m=+768.987043097" watchObservedRunningTime="2026-03-12 17:03:46.451682777 +0000 UTC m=+768.992994156" Mar 12 17:03:46 crc kubenswrapper[5184]: I0312 17:03:46.477339 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-6ccc8dfc97-z8s4v" podStartSLOduration=2.128139739 podStartE2EDuration="7.477317859s" podCreationTimestamp="2026-03-12 17:03:39 +0000 UTC" firstStartedPulling="2026-03-12 17:03:40.164791458 +0000 UTC m=+762.706102807" lastFinishedPulling="2026-03-12 17:03:45.513969588 +0000 UTC m=+768.055280927" observedRunningTime="2026-03-12 17:03:46.474253062 +0000 UTC m=+769.015564491" watchObservedRunningTime="2026-03-12 17:03:46.477317859 +0000 UTC m=+769.018629198" Mar 12 17:03:57 crc kubenswrapper[5184]: I0312 17:03:57.445730 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-6ccc8dfc97-z8s4v" Mar 12 17:03:58 crc kubenswrapper[5184]: I0312 17:03:58.549940 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-v46p7"] Mar 12 17:03:58 crc kubenswrapper[5184]: I0312 17:03:58.742318 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v46p7"] Mar 12 17:03:58 crc kubenswrapper[5184]: I0312 17:03:58.742550 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v46p7" Mar 12 17:03:58 crc kubenswrapper[5184]: I0312 17:03:58.813716 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c312210c-6026-4b62-ab00-83386f1f1fa7-catalog-content\") pod \"redhat-marketplace-v46p7\" (UID: \"c312210c-6026-4b62-ab00-83386f1f1fa7\") " pod="openshift-marketplace/redhat-marketplace-v46p7" Mar 12 17:03:58 crc kubenswrapper[5184]: I0312 17:03:58.814111 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c312210c-6026-4b62-ab00-83386f1f1fa7-utilities\") pod \"redhat-marketplace-v46p7\" (UID: \"c312210c-6026-4b62-ab00-83386f1f1fa7\") " pod="openshift-marketplace/redhat-marketplace-v46p7" Mar 12 17:03:58 crc kubenswrapper[5184]: I0312 17:03:58.814163 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2pcq\" (UniqueName: \"kubernetes.io/projected/c312210c-6026-4b62-ab00-83386f1f1fa7-kube-api-access-n2pcq\") pod \"redhat-marketplace-v46p7\" (UID: \"c312210c-6026-4b62-ab00-83386f1f1fa7\") " pod="openshift-marketplace/redhat-marketplace-v46p7" Mar 12 17:03:58 crc kubenswrapper[5184]: I0312 17:03:58.915471 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c312210c-6026-4b62-ab00-83386f1f1fa7-utilities\") pod \"redhat-marketplace-v46p7\" (UID: \"c312210c-6026-4b62-ab00-83386f1f1fa7\") " pod="openshift-marketplace/redhat-marketplace-v46p7" Mar 12 17:03:58 crc kubenswrapper[5184]: I0312 17:03:58.915528 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n2pcq\" (UniqueName: \"kubernetes.io/projected/c312210c-6026-4b62-ab00-83386f1f1fa7-kube-api-access-n2pcq\") pod 
\"redhat-marketplace-v46p7\" (UID: \"c312210c-6026-4b62-ab00-83386f1f1fa7\") " pod="openshift-marketplace/redhat-marketplace-v46p7" Mar 12 17:03:58 crc kubenswrapper[5184]: I0312 17:03:58.915583 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c312210c-6026-4b62-ab00-83386f1f1fa7-catalog-content\") pod \"redhat-marketplace-v46p7\" (UID: \"c312210c-6026-4b62-ab00-83386f1f1fa7\") " pod="openshift-marketplace/redhat-marketplace-v46p7" Mar 12 17:03:58 crc kubenswrapper[5184]: I0312 17:03:58.915981 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c312210c-6026-4b62-ab00-83386f1f1fa7-catalog-content\") pod \"redhat-marketplace-v46p7\" (UID: \"c312210c-6026-4b62-ab00-83386f1f1fa7\") " pod="openshift-marketplace/redhat-marketplace-v46p7" Mar 12 17:03:58 crc kubenswrapper[5184]: I0312 17:03:58.916183 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c312210c-6026-4b62-ab00-83386f1f1fa7-utilities\") pod \"redhat-marketplace-v46p7\" (UID: \"c312210c-6026-4b62-ab00-83386f1f1fa7\") " pod="openshift-marketplace/redhat-marketplace-v46p7" Mar 12 17:03:58 crc kubenswrapper[5184]: I0312 17:03:58.938785 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2pcq\" (UniqueName: \"kubernetes.io/projected/c312210c-6026-4b62-ab00-83386f1f1fa7-kube-api-access-n2pcq\") pod \"redhat-marketplace-v46p7\" (UID: \"c312210c-6026-4b62-ab00-83386f1f1fa7\") " pod="openshift-marketplace/redhat-marketplace-v46p7" Mar 12 17:03:59 crc kubenswrapper[5184]: I0312 17:03:59.060505 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v46p7" Mar 12 17:03:59 crc kubenswrapper[5184]: I0312 17:03:59.505218 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v46p7"] Mar 12 17:03:59 crc kubenswrapper[5184]: I0312 17:03:59.526993 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v46p7" event={"ID":"c312210c-6026-4b62-ab00-83386f1f1fa7","Type":"ContainerStarted","Data":"c8ba4a187da28a3ba0eb161800f7259a325f54ec0af947ae8fb9b91662109125"} Mar 12 17:04:00 crc kubenswrapper[5184]: I0312 17:04:00.128143 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555584-82kk8"] Mar 12 17:04:00 crc kubenswrapper[5184]: I0312 17:04:00.133137 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555584-82kk8" Mar 12 17:04:00 crc kubenswrapper[5184]: I0312 17:04:00.135198 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\"" Mar 12 17:04:00 crc kubenswrapper[5184]: I0312 17:04:00.135251 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-f4gpz\"" Mar 12 17:04:00 crc kubenswrapper[5184]: I0312 17:04:00.136059 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\"" Mar 12 17:04:00 crc kubenswrapper[5184]: I0312 17:04:00.143478 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555584-82kk8"] Mar 12 17:04:00 crc kubenswrapper[5184]: I0312 17:04:00.231026 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrrxh\" (UniqueName: \"kubernetes.io/projected/75d0157e-15d0-42fc-a50e-1b5688578404-kube-api-access-qrrxh\") pod 
\"auto-csr-approver-29555584-82kk8\" (UID: \"75d0157e-15d0-42fc-a50e-1b5688578404\") " pod="openshift-infra/auto-csr-approver-29555584-82kk8" Mar 12 17:04:00 crc kubenswrapper[5184]: I0312 17:04:00.332586 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qrrxh\" (UniqueName: \"kubernetes.io/projected/75d0157e-15d0-42fc-a50e-1b5688578404-kube-api-access-qrrxh\") pod \"auto-csr-approver-29555584-82kk8\" (UID: \"75d0157e-15d0-42fc-a50e-1b5688578404\") " pod="openshift-infra/auto-csr-approver-29555584-82kk8" Mar 12 17:04:00 crc kubenswrapper[5184]: I0312 17:04:00.366368 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrrxh\" (UniqueName: \"kubernetes.io/projected/75d0157e-15d0-42fc-a50e-1b5688578404-kube-api-access-qrrxh\") pod \"auto-csr-approver-29555584-82kk8\" (UID: \"75d0157e-15d0-42fc-a50e-1b5688578404\") " pod="openshift-infra/auto-csr-approver-29555584-82kk8" Mar 12 17:04:00 crc kubenswrapper[5184]: I0312 17:04:00.457388 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555584-82kk8" Mar 12 17:04:00 crc kubenswrapper[5184]: I0312 17:04:00.535620 5184 generic.go:358] "Generic (PLEG): container finished" podID="c312210c-6026-4b62-ab00-83386f1f1fa7" containerID="e87ca5ec7af3ded8051915cd5932dee536b7b7afa5cac3ec9b0167ba84860024" exitCode=0 Mar 12 17:04:00 crc kubenswrapper[5184]: I0312 17:04:00.535665 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v46p7" event={"ID":"c312210c-6026-4b62-ab00-83386f1f1fa7","Type":"ContainerDied","Data":"e87ca5ec7af3ded8051915cd5932dee536b7b7afa5cac3ec9b0167ba84860024"} Mar 12 17:04:00 crc kubenswrapper[5184]: W0312 17:04:00.686098 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75d0157e_15d0_42fc_a50e_1b5688578404.slice/crio-9a42c9b2c8ebc5af8a99c9db51df4d442485b7ae88afd9f43dbc050e98b96c9b WatchSource:0}: Error finding container 9a42c9b2c8ebc5af8a99c9db51df4d442485b7ae88afd9f43dbc050e98b96c9b: Status 404 returned error can't find the container with id 9a42c9b2c8ebc5af8a99c9db51df4d442485b7ae88afd9f43dbc050e98b96c9b Mar 12 17:04:00 crc kubenswrapper[5184]: I0312 17:04:00.688403 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555584-82kk8"] Mar 12 17:04:01 crc kubenswrapper[5184]: I0312 17:04:01.542289 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555584-82kk8" event={"ID":"75d0157e-15d0-42fc-a50e-1b5688578404","Type":"ContainerStarted","Data":"9a42c9b2c8ebc5af8a99c9db51df4d442485b7ae88afd9f43dbc050e98b96c9b"} Mar 12 17:04:01 crc kubenswrapper[5184]: I0312 17:04:01.543917 5184 generic.go:358] "Generic (PLEG): container finished" podID="c312210c-6026-4b62-ab00-83386f1f1fa7" containerID="d0cb9807f3c9af5f916b6ae7159616cba9bd7a5616c9c7d1c95530d6756d3cf0" exitCode=0 Mar 12 17:04:01 crc kubenswrapper[5184]: I0312 
17:04:01.544028 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v46p7" event={"ID":"c312210c-6026-4b62-ab00-83386f1f1fa7","Type":"ContainerDied","Data":"d0cb9807f3c9af5f916b6ae7159616cba9bd7a5616c9c7d1c95530d6756d3cf0"} Mar 12 17:04:02 crc kubenswrapper[5184]: I0312 17:04:02.552050 5184 generic.go:358] "Generic (PLEG): container finished" podID="75d0157e-15d0-42fc-a50e-1b5688578404" containerID="bf586c7654695d0f359c042740b6e675c9b3800d76d81f9cc06c09a27af9c298" exitCode=0 Mar 12 17:04:02 crc kubenswrapper[5184]: I0312 17:04:02.552158 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555584-82kk8" event={"ID":"75d0157e-15d0-42fc-a50e-1b5688578404","Type":"ContainerDied","Data":"bf586c7654695d0f359c042740b6e675c9b3800d76d81f9cc06c09a27af9c298"} Mar 12 17:04:02 crc kubenswrapper[5184]: I0312 17:04:02.554908 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v46p7" event={"ID":"c312210c-6026-4b62-ab00-83386f1f1fa7","Type":"ContainerStarted","Data":"3f9961611753d62f2906dea46bda595c6a5f1af24746c49b511f3e809f6f5c8f"} Mar 12 17:04:02 crc kubenswrapper[5184]: I0312 17:04:02.579230 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-v46p7" podStartSLOduration=4.031437429 podStartE2EDuration="4.579205337s" podCreationTimestamp="2026-03-12 17:03:58 +0000 UTC" firstStartedPulling="2026-03-12 17:04:00.536412698 +0000 UTC m=+783.077724037" lastFinishedPulling="2026-03-12 17:04:01.084180606 +0000 UTC m=+783.625491945" observedRunningTime="2026-03-12 17:04:02.576624515 +0000 UTC m=+785.117935854" watchObservedRunningTime="2026-03-12 17:04:02.579205337 +0000 UTC m=+785.120516706" Mar 12 17:04:03 crc kubenswrapper[5184]: I0312 17:04:03.901701 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555584-82kk8" Mar 12 17:04:03 crc kubenswrapper[5184]: I0312 17:04:03.983806 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrrxh\" (UniqueName: \"kubernetes.io/projected/75d0157e-15d0-42fc-a50e-1b5688578404-kube-api-access-qrrxh\") pod \"75d0157e-15d0-42fc-a50e-1b5688578404\" (UID: \"75d0157e-15d0-42fc-a50e-1b5688578404\") " Mar 12 17:04:03 crc kubenswrapper[5184]: I0312 17:04:03.989349 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75d0157e-15d0-42fc-a50e-1b5688578404-kube-api-access-qrrxh" (OuterVolumeSpecName: "kube-api-access-qrrxh") pod "75d0157e-15d0-42fc-a50e-1b5688578404" (UID: "75d0157e-15d0-42fc-a50e-1b5688578404"). InnerVolumeSpecName "kube-api-access-qrrxh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:04:04 crc kubenswrapper[5184]: I0312 17:04:04.085957 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qrrxh\" (UniqueName: \"kubernetes.io/projected/75d0157e-15d0-42fc-a50e-1b5688578404-kube-api-access-qrrxh\") on node \"crc\" DevicePath \"\"" Mar 12 17:04:04 crc kubenswrapper[5184]: I0312 17:04:04.567196 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555584-82kk8" event={"ID":"75d0157e-15d0-42fc-a50e-1b5688578404","Type":"ContainerDied","Data":"9a42c9b2c8ebc5af8a99c9db51df4d442485b7ae88afd9f43dbc050e98b96c9b"} Mar 12 17:04:04 crc kubenswrapper[5184]: I0312 17:04:04.567244 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a42c9b2c8ebc5af8a99c9db51df4d442485b7ae88afd9f43dbc050e98b96c9b" Mar 12 17:04:04 crc kubenswrapper[5184]: I0312 17:04:04.567207 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555584-82kk8" Mar 12 17:04:04 crc kubenswrapper[5184]: I0312 17:04:04.963190 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555578-7hxkg"] Mar 12 17:04:04 crc kubenswrapper[5184]: I0312 17:04:04.973449 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555578-7hxkg"] Mar 12 17:04:06 crc kubenswrapper[5184]: I0312 17:04:06.411858 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4eba40a8-87c3-43ee-88bf-502b11e90d37" path="/var/lib/kubelet/pods/4eba40a8-87c3-43ee-88bf-502b11e90d37/volumes" Mar 12 17:04:09 crc kubenswrapper[5184]: I0312 17:04:09.060972 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-marketplace-v46p7" Mar 12 17:04:09 crc kubenswrapper[5184]: I0312 17:04:09.061404 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-v46p7" Mar 12 17:04:09 crc kubenswrapper[5184]: I0312 17:04:09.112878 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-v46p7" Mar 12 17:04:09 crc kubenswrapper[5184]: I0312 17:04:09.653853 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-v46p7" Mar 12 17:04:11 crc kubenswrapper[5184]: I0312 17:04:11.539435 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v46p7"] Mar 12 17:04:11 crc kubenswrapper[5184]: I0312 17:04:11.616009 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-v46p7" podUID="c312210c-6026-4b62-ab00-83386f1f1fa7" containerName="registry-server" containerID="cri-o://3f9961611753d62f2906dea46bda595c6a5f1af24746c49b511f3e809f6f5c8f" gracePeriod=2 Mar 12 
17:04:11 crc kubenswrapper[5184]: I0312 17:04:11.983604 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v46p7" Mar 12 17:04:12 crc kubenswrapper[5184]: I0312 17:04:12.089731 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c312210c-6026-4b62-ab00-83386f1f1fa7-catalog-content\") pod \"c312210c-6026-4b62-ab00-83386f1f1fa7\" (UID: \"c312210c-6026-4b62-ab00-83386f1f1fa7\") " Mar 12 17:04:12 crc kubenswrapper[5184]: I0312 17:04:12.095412 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2pcq\" (UniqueName: \"kubernetes.io/projected/c312210c-6026-4b62-ab00-83386f1f1fa7-kube-api-access-n2pcq\") pod \"c312210c-6026-4b62-ab00-83386f1f1fa7\" (UID: \"c312210c-6026-4b62-ab00-83386f1f1fa7\") " Mar 12 17:04:12 crc kubenswrapper[5184]: I0312 17:04:12.095511 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c312210c-6026-4b62-ab00-83386f1f1fa7-utilities\") pod \"c312210c-6026-4b62-ab00-83386f1f1fa7\" (UID: \"c312210c-6026-4b62-ab00-83386f1f1fa7\") " Mar 12 17:04:12 crc kubenswrapper[5184]: I0312 17:04:12.098678 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c312210c-6026-4b62-ab00-83386f1f1fa7-utilities" (OuterVolumeSpecName: "utilities") pod "c312210c-6026-4b62-ab00-83386f1f1fa7" (UID: "c312210c-6026-4b62-ab00-83386f1f1fa7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:04:12 crc kubenswrapper[5184]: I0312 17:04:12.106511 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c312210c-6026-4b62-ab00-83386f1f1fa7-kube-api-access-n2pcq" (OuterVolumeSpecName: "kube-api-access-n2pcq") pod "c312210c-6026-4b62-ab00-83386f1f1fa7" (UID: "c312210c-6026-4b62-ab00-83386f1f1fa7"). InnerVolumeSpecName "kube-api-access-n2pcq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:04:12 crc kubenswrapper[5184]: I0312 17:04:12.126980 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c312210c-6026-4b62-ab00-83386f1f1fa7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c312210c-6026-4b62-ab00-83386f1f1fa7" (UID: "c312210c-6026-4b62-ab00-83386f1f1fa7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:04:12 crc kubenswrapper[5184]: I0312 17:04:12.197803 5184 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c312210c-6026-4b62-ab00-83386f1f1fa7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 17:04:12 crc kubenswrapper[5184]: I0312 17:04:12.197849 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n2pcq\" (UniqueName: \"kubernetes.io/projected/c312210c-6026-4b62-ab00-83386f1f1fa7-kube-api-access-n2pcq\") on node \"crc\" DevicePath \"\"" Mar 12 17:04:12 crc kubenswrapper[5184]: I0312 17:04:12.197870 5184 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c312210c-6026-4b62-ab00-83386f1f1fa7-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 17:04:12 crc kubenswrapper[5184]: I0312 17:04:12.627851 5184 generic.go:358] "Generic (PLEG): container finished" podID="c312210c-6026-4b62-ab00-83386f1f1fa7" 
containerID="3f9961611753d62f2906dea46bda595c6a5f1af24746c49b511f3e809f6f5c8f" exitCode=0 Mar 12 17:04:12 crc kubenswrapper[5184]: I0312 17:04:12.627932 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v46p7" event={"ID":"c312210c-6026-4b62-ab00-83386f1f1fa7","Type":"ContainerDied","Data":"3f9961611753d62f2906dea46bda595c6a5f1af24746c49b511f3e809f6f5c8f"} Mar 12 17:04:12 crc kubenswrapper[5184]: I0312 17:04:12.628031 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v46p7" event={"ID":"c312210c-6026-4b62-ab00-83386f1f1fa7","Type":"ContainerDied","Data":"c8ba4a187da28a3ba0eb161800f7259a325f54ec0af947ae8fb9b91662109125"} Mar 12 17:04:12 crc kubenswrapper[5184]: I0312 17:04:12.628065 5184 scope.go:117] "RemoveContainer" containerID="3f9961611753d62f2906dea46bda595c6a5f1af24746c49b511f3e809f6f5c8f" Mar 12 17:04:12 crc kubenswrapper[5184]: I0312 17:04:12.627956 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v46p7" Mar 12 17:04:12 crc kubenswrapper[5184]: I0312 17:04:12.653880 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v46p7"] Mar 12 17:04:12 crc kubenswrapper[5184]: I0312 17:04:12.658745 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-v46p7"] Mar 12 17:04:12 crc kubenswrapper[5184]: I0312 17:04:12.659773 5184 scope.go:117] "RemoveContainer" containerID="d0cb9807f3c9af5f916b6ae7159616cba9bd7a5616c9c7d1c95530d6756d3cf0" Mar 12 17:04:12 crc kubenswrapper[5184]: I0312 17:04:12.686036 5184 scope.go:117] "RemoveContainer" containerID="e87ca5ec7af3ded8051915cd5932dee536b7b7afa5cac3ec9b0167ba84860024" Mar 12 17:04:12 crc kubenswrapper[5184]: I0312 17:04:12.712239 5184 scope.go:117] "RemoveContainer" containerID="3f9961611753d62f2906dea46bda595c6a5f1af24746c49b511f3e809f6f5c8f" Mar 12 17:04:12 crc kubenswrapper[5184]: E0312 17:04:12.712768 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f9961611753d62f2906dea46bda595c6a5f1af24746c49b511f3e809f6f5c8f\": container with ID starting with 3f9961611753d62f2906dea46bda595c6a5f1af24746c49b511f3e809f6f5c8f not found: ID does not exist" containerID="3f9961611753d62f2906dea46bda595c6a5f1af24746c49b511f3e809f6f5c8f" Mar 12 17:04:12 crc kubenswrapper[5184]: I0312 17:04:12.712875 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f9961611753d62f2906dea46bda595c6a5f1af24746c49b511f3e809f6f5c8f"} err="failed to get container status \"3f9961611753d62f2906dea46bda595c6a5f1af24746c49b511f3e809f6f5c8f\": rpc error: code = NotFound desc = could not find container \"3f9961611753d62f2906dea46bda595c6a5f1af24746c49b511f3e809f6f5c8f\": container with ID starting with 3f9961611753d62f2906dea46bda595c6a5f1af24746c49b511f3e809f6f5c8f not found: 
ID does not exist" Mar 12 17:04:12 crc kubenswrapper[5184]: I0312 17:04:12.713005 5184 scope.go:117] "RemoveContainer" containerID="d0cb9807f3c9af5f916b6ae7159616cba9bd7a5616c9c7d1c95530d6756d3cf0" Mar 12 17:04:12 crc kubenswrapper[5184]: E0312 17:04:12.713761 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0cb9807f3c9af5f916b6ae7159616cba9bd7a5616c9c7d1c95530d6756d3cf0\": container with ID starting with d0cb9807f3c9af5f916b6ae7159616cba9bd7a5616c9c7d1c95530d6756d3cf0 not found: ID does not exist" containerID="d0cb9807f3c9af5f916b6ae7159616cba9bd7a5616c9c7d1c95530d6756d3cf0" Mar 12 17:04:12 crc kubenswrapper[5184]: I0312 17:04:12.713794 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0cb9807f3c9af5f916b6ae7159616cba9bd7a5616c9c7d1c95530d6756d3cf0"} err="failed to get container status \"d0cb9807f3c9af5f916b6ae7159616cba9bd7a5616c9c7d1c95530d6756d3cf0\": rpc error: code = NotFound desc = could not find container \"d0cb9807f3c9af5f916b6ae7159616cba9bd7a5616c9c7d1c95530d6756d3cf0\": container with ID starting with d0cb9807f3c9af5f916b6ae7159616cba9bd7a5616c9c7d1c95530d6756d3cf0 not found: ID does not exist" Mar 12 17:04:12 crc kubenswrapper[5184]: I0312 17:04:12.713816 5184 scope.go:117] "RemoveContainer" containerID="e87ca5ec7af3ded8051915cd5932dee536b7b7afa5cac3ec9b0167ba84860024" Mar 12 17:04:12 crc kubenswrapper[5184]: E0312 17:04:12.714078 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e87ca5ec7af3ded8051915cd5932dee536b7b7afa5cac3ec9b0167ba84860024\": container with ID starting with e87ca5ec7af3ded8051915cd5932dee536b7b7afa5cac3ec9b0167ba84860024 not found: ID does not exist" containerID="e87ca5ec7af3ded8051915cd5932dee536b7b7afa5cac3ec9b0167ba84860024" Mar 12 17:04:12 crc kubenswrapper[5184]: I0312 17:04:12.714103 5184 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e87ca5ec7af3ded8051915cd5932dee536b7b7afa5cac3ec9b0167ba84860024"} err="failed to get container status \"e87ca5ec7af3ded8051915cd5932dee536b7b7afa5cac3ec9b0167ba84860024\": rpc error: code = NotFound desc = could not find container \"e87ca5ec7af3ded8051915cd5932dee536b7b7afa5cac3ec9b0167ba84860024\": container with ID starting with e87ca5ec7af3ded8051915cd5932dee536b7b7afa5cac3ec9b0167ba84860024 not found: ID does not exist" Mar 12 17:04:14 crc kubenswrapper[5184]: I0312 17:04:14.408090 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c312210c-6026-4b62-ab00-83386f1f1fa7" path="/var/lib/kubelet/pods/c312210c-6026-4b62-ab00-83386f1f1fa7/volumes" Mar 12 17:04:17 crc kubenswrapper[5184]: I0312 17:04:17.443351 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-684cb6d9ff-8wmmx" Mar 12 17:04:18 crc kubenswrapper[5184]: I0312 17:04:18.492708 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-5ktvq"] Mar 12 17:04:18 crc kubenswrapper[5184]: I0312 17:04:18.493860 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c312210c-6026-4b62-ab00-83386f1f1fa7" containerName="extract-content" Mar 12 17:04:18 crc kubenswrapper[5184]: I0312 17:04:18.493879 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="c312210c-6026-4b62-ab00-83386f1f1fa7" containerName="extract-content" Mar 12 17:04:18 crc kubenswrapper[5184]: I0312 17:04:18.493897 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75d0157e-15d0-42fc-a50e-1b5688578404" containerName="oc" Mar 12 17:04:18 crc kubenswrapper[5184]: I0312 17:04:18.493906 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="75d0157e-15d0-42fc-a50e-1b5688578404" containerName="oc" Mar 12 17:04:18 crc kubenswrapper[5184]: I0312 17:04:18.493920 5184 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="c312210c-6026-4b62-ab00-83386f1f1fa7" containerName="registry-server" Mar 12 17:04:18 crc kubenswrapper[5184]: I0312 17:04:18.493928 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="c312210c-6026-4b62-ab00-83386f1f1fa7" containerName="registry-server" Mar 12 17:04:18 crc kubenswrapper[5184]: I0312 17:04:18.493958 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c312210c-6026-4b62-ab00-83386f1f1fa7" containerName="extract-utilities" Mar 12 17:04:18 crc kubenswrapper[5184]: I0312 17:04:18.493966 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="c312210c-6026-4b62-ab00-83386f1f1fa7" containerName="extract-utilities" Mar 12 17:04:18 crc kubenswrapper[5184]: I0312 17:04:18.494082 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="c312210c-6026-4b62-ab00-83386f1f1fa7" containerName="registry-server" Mar 12 17:04:18 crc kubenswrapper[5184]: I0312 17:04:18.494095 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="75d0157e-15d0-42fc-a50e-1b5688578404" containerName="oc" Mar 12 17:04:18 crc kubenswrapper[5184]: I0312 17:04:18.504418 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["metallb-system/controller-774d88f846-k2qfn"] Mar 12 17:04:18 crc kubenswrapper[5184]: I0312 17:04:18.504670 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-5ktvq" Mar 12 17:04:18 crc kubenswrapper[5184]: I0312 17:04:18.507905 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"metallb-system\"/\"speaker-certs-secret\"" Mar 12 17:04:18 crc kubenswrapper[5184]: I0312 17:04:18.508899 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"metallb-system\"/\"metallb-memberlist\"" Mar 12 17:04:18 crc kubenswrapper[5184]: I0312 17:04:18.509089 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-774d88f846-k2qfn" Mar 12 17:04:18 crc kubenswrapper[5184]: I0312 17:04:18.509122 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"metallb-system\"/\"speaker-dockercfg-2ftwg\"" Mar 12 17:04:18 crc kubenswrapper[5184]: I0312 17:04:18.510355 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"metallb-system\"/\"metallb-excludel2\"" Mar 12 17:04:18 crc kubenswrapper[5184]: I0312 17:04:18.511150 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"metallb-system\"/\"controller-certs-secret\"" Mar 12 17:04:18 crc kubenswrapper[5184]: I0312 17:04:18.519192 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-774d88f846-k2qfn"] Mar 12 17:04:18 crc kubenswrapper[5184]: I0312 17:04:18.578607 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/adae6bbe-80fb-4692-b73f-402356ce10c4-memberlist\") pod \"speaker-5ktvq\" (UID: \"adae6bbe-80fb-4692-b73f-402356ce10c4\") " pod="metallb-system/speaker-5ktvq" Mar 12 17:04:18 crc kubenswrapper[5184]: I0312 17:04:18.578649 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/adae6bbe-80fb-4692-b73f-402356ce10c4-metallb-excludel2\") pod \"speaker-5ktvq\" (UID: \"adae6bbe-80fb-4692-b73f-402356ce10c4\") " pod="metallb-system/speaker-5ktvq" Mar 12 17:04:18 crc kubenswrapper[5184]: I0312 17:04:18.578673 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5sl2\" (UniqueName: \"kubernetes.io/projected/adae6bbe-80fb-4692-b73f-402356ce10c4-kube-api-access-l5sl2\") pod \"speaker-5ktvq\" (UID: \"adae6bbe-80fb-4692-b73f-402356ce10c4\") " pod="metallb-system/speaker-5ktvq" Mar 12 17:04:18 crc 
kubenswrapper[5184]: I0312 17:04:18.578709 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrmcz\" (UniqueName: \"kubernetes.io/projected/93ee07df-1490-4be4-97d0-f3f5b20ceb90-kube-api-access-rrmcz\") pod \"controller-774d88f846-k2qfn\" (UID: \"93ee07df-1490-4be4-97d0-f3f5b20ceb90\") " pod="metallb-system/controller-774d88f846-k2qfn" Mar 12 17:04:18 crc kubenswrapper[5184]: I0312 17:04:18.578811 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/adae6bbe-80fb-4692-b73f-402356ce10c4-metrics-certs\") pod \"speaker-5ktvq\" (UID: \"adae6bbe-80fb-4692-b73f-402356ce10c4\") " pod="metallb-system/speaker-5ktvq" Mar 12 17:04:18 crc kubenswrapper[5184]: I0312 17:04:18.578934 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93ee07df-1490-4be4-97d0-f3f5b20ceb90-metrics-certs\") pod \"controller-774d88f846-k2qfn\" (UID: \"93ee07df-1490-4be4-97d0-f3f5b20ceb90\") " pod="metallb-system/controller-774d88f846-k2qfn" Mar 12 17:04:18 crc kubenswrapper[5184]: I0312 17:04:18.579000 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/93ee07df-1490-4be4-97d0-f3f5b20ceb90-cert\") pod \"controller-774d88f846-k2qfn\" (UID: \"93ee07df-1490-4be4-97d0-f3f5b20ceb90\") " pod="metallb-system/controller-774d88f846-k2qfn" Mar 12 17:04:18 crc kubenswrapper[5184]: I0312 17:04:18.680222 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/adae6bbe-80fb-4692-b73f-402356ce10c4-memberlist\") pod \"speaker-5ktvq\" (UID: \"adae6bbe-80fb-4692-b73f-402356ce10c4\") " pod="metallb-system/speaker-5ktvq" Mar 12 17:04:18 crc kubenswrapper[5184]: I0312 17:04:18.680261 5184 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/adae6bbe-80fb-4692-b73f-402356ce10c4-metallb-excludel2\") pod \"speaker-5ktvq\" (UID: \"adae6bbe-80fb-4692-b73f-402356ce10c4\") " pod="metallb-system/speaker-5ktvq" Mar 12 17:04:18 crc kubenswrapper[5184]: I0312 17:04:18.680285 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l5sl2\" (UniqueName: \"kubernetes.io/projected/adae6bbe-80fb-4692-b73f-402356ce10c4-kube-api-access-l5sl2\") pod \"speaker-5ktvq\" (UID: \"adae6bbe-80fb-4692-b73f-402356ce10c4\") " pod="metallb-system/speaker-5ktvq" Mar 12 17:04:18 crc kubenswrapper[5184]: I0312 17:04:18.680300 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rrmcz\" (UniqueName: \"kubernetes.io/projected/93ee07df-1490-4be4-97d0-f3f5b20ceb90-kube-api-access-rrmcz\") pod \"controller-774d88f846-k2qfn\" (UID: \"93ee07df-1490-4be4-97d0-f3f5b20ceb90\") " pod="metallb-system/controller-774d88f846-k2qfn" Mar 12 17:04:18 crc kubenswrapper[5184]: E0312 17:04:18.680447 5184 secret.go:189] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 12 17:04:18 crc kubenswrapper[5184]: I0312 17:04:18.680455 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/adae6bbe-80fb-4692-b73f-402356ce10c4-metrics-certs\") pod \"speaker-5ktvq\" (UID: \"adae6bbe-80fb-4692-b73f-402356ce10c4\") " pod="metallb-system/speaker-5ktvq" Mar 12 17:04:18 crc kubenswrapper[5184]: E0312 17:04:18.680545 5184 secret.go:189] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Mar 12 17:04:18 crc kubenswrapper[5184]: E0312 17:04:18.680547 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/adae6bbe-80fb-4692-b73f-402356ce10c4-memberlist 
podName:adae6bbe-80fb-4692-b73f-402356ce10c4 nodeName:}" failed. No retries permitted until 2026-03-12 17:04:19.180521801 +0000 UTC m=+801.721833150 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/adae6bbe-80fb-4692-b73f-402356ce10c4-memberlist") pod "speaker-5ktvq" (UID: "adae6bbe-80fb-4692-b73f-402356ce10c4") : secret "metallb-memberlist" not found Mar 12 17:04:18 crc kubenswrapper[5184]: E0312 17:04:18.680718 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/adae6bbe-80fb-4692-b73f-402356ce10c4-metrics-certs podName:adae6bbe-80fb-4692-b73f-402356ce10c4 nodeName:}" failed. No retries permitted until 2026-03-12 17:04:19.180697396 +0000 UTC m=+801.722008735 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/adae6bbe-80fb-4692-b73f-402356ce10c4-metrics-certs") pod "speaker-5ktvq" (UID: "adae6bbe-80fb-4692-b73f-402356ce10c4") : secret "speaker-certs-secret" not found Mar 12 17:04:18 crc kubenswrapper[5184]: I0312 17:04:18.680776 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93ee07df-1490-4be4-97d0-f3f5b20ceb90-metrics-certs\") pod \"controller-774d88f846-k2qfn\" (UID: \"93ee07df-1490-4be4-97d0-f3f5b20ceb90\") " pod="metallb-system/controller-774d88f846-k2qfn" Mar 12 17:04:18 crc kubenswrapper[5184]: I0312 17:04:18.680823 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/93ee07df-1490-4be4-97d0-f3f5b20ceb90-cert\") pod \"controller-774d88f846-k2qfn\" (UID: \"93ee07df-1490-4be4-97d0-f3f5b20ceb90\") " pod="metallb-system/controller-774d88f846-k2qfn" Mar 12 17:04:18 crc kubenswrapper[5184]: E0312 17:04:18.680871 5184 secret.go:189] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found 
Mar 12 17:04:18 crc kubenswrapper[5184]: E0312 17:04:18.680923 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93ee07df-1490-4be4-97d0-f3f5b20ceb90-metrics-certs podName:93ee07df-1490-4be4-97d0-f3f5b20ceb90 nodeName:}" failed. No retries permitted until 2026-03-12 17:04:19.180906933 +0000 UTC m=+801.722218272 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93ee07df-1490-4be4-97d0-f3f5b20ceb90-metrics-certs") pod "controller-774d88f846-k2qfn" (UID: "93ee07df-1490-4be4-97d0-f3f5b20ceb90") : secret "controller-certs-secret" not found Mar 12 17:04:18 crc kubenswrapper[5184]: I0312 17:04:18.681247 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/adae6bbe-80fb-4692-b73f-402356ce10c4-metallb-excludel2\") pod \"speaker-5ktvq\" (UID: \"adae6bbe-80fb-4692-b73f-402356ce10c4\") " pod="metallb-system/speaker-5ktvq" Mar 12 17:04:18 crc kubenswrapper[5184]: I0312 17:04:18.684330 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"metallb-system\"/\"metallb-webhook-cert\"" Mar 12 17:04:18 crc kubenswrapper[5184]: I0312 17:04:18.700613 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5sl2\" (UniqueName: \"kubernetes.io/projected/adae6bbe-80fb-4692-b73f-402356ce10c4-kube-api-access-l5sl2\") pod \"speaker-5ktvq\" (UID: \"adae6bbe-80fb-4692-b73f-402356ce10c4\") " pod="metallb-system/speaker-5ktvq" Mar 12 17:04:18 crc kubenswrapper[5184]: I0312 17:04:18.700699 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/93ee07df-1490-4be4-97d0-f3f5b20ceb90-cert\") pod \"controller-774d88f846-k2qfn\" (UID: \"93ee07df-1490-4be4-97d0-f3f5b20ceb90\") " pod="metallb-system/controller-774d88f846-k2qfn" Mar 12 17:04:18 crc kubenswrapper[5184]: I0312 17:04:18.712176 5184 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrmcz\" (UniqueName: \"kubernetes.io/projected/93ee07df-1490-4be4-97d0-f3f5b20ceb90-kube-api-access-rrmcz\") pod \"controller-774d88f846-k2qfn\" (UID: \"93ee07df-1490-4be4-97d0-f3f5b20ceb90\") " pod="metallb-system/controller-774d88f846-k2qfn" Mar 12 17:04:19 crc kubenswrapper[5184]: I0312 17:04:19.187506 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/adae6bbe-80fb-4692-b73f-402356ce10c4-memberlist\") pod \"speaker-5ktvq\" (UID: \"adae6bbe-80fb-4692-b73f-402356ce10c4\") " pod="metallb-system/speaker-5ktvq" Mar 12 17:04:19 crc kubenswrapper[5184]: I0312 17:04:19.187791 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/adae6bbe-80fb-4692-b73f-402356ce10c4-metrics-certs\") pod \"speaker-5ktvq\" (UID: \"adae6bbe-80fb-4692-b73f-402356ce10c4\") " pod="metallb-system/speaker-5ktvq" Mar 12 17:04:19 crc kubenswrapper[5184]: E0312 17:04:19.187653 5184 secret.go:189] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 12 17:04:19 crc kubenswrapper[5184]: E0312 17:04:19.187914 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/adae6bbe-80fb-4692-b73f-402356ce10c4-memberlist podName:adae6bbe-80fb-4692-b73f-402356ce10c4 nodeName:}" failed. No retries permitted until 2026-03-12 17:04:20.187893223 +0000 UTC m=+802.729204562 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/adae6bbe-80fb-4692-b73f-402356ce10c4-memberlist") pod "speaker-5ktvq" (UID: "adae6bbe-80fb-4692-b73f-402356ce10c4") : secret "metallb-memberlist" not found Mar 12 17:04:19 crc kubenswrapper[5184]: I0312 17:04:19.187849 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93ee07df-1490-4be4-97d0-f3f5b20ceb90-metrics-certs\") pod \"controller-774d88f846-k2qfn\" (UID: \"93ee07df-1490-4be4-97d0-f3f5b20ceb90\") " pod="metallb-system/controller-774d88f846-k2qfn" Mar 12 17:04:19 crc kubenswrapper[5184]: I0312 17:04:19.194593 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93ee07df-1490-4be4-97d0-f3f5b20ceb90-metrics-certs\") pod \"controller-774d88f846-k2qfn\" (UID: \"93ee07df-1490-4be4-97d0-f3f5b20ceb90\") " pod="metallb-system/controller-774d88f846-k2qfn" Mar 12 17:04:19 crc kubenswrapper[5184]: I0312 17:04:19.196218 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/adae6bbe-80fb-4692-b73f-402356ce10c4-metrics-certs\") pod \"speaker-5ktvq\" (UID: \"adae6bbe-80fb-4692-b73f-402356ce10c4\") " pod="metallb-system/speaker-5ktvq" Mar 12 17:04:19 crc kubenswrapper[5184]: I0312 17:04:19.429698 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-774d88f846-k2qfn" Mar 12 17:04:19 crc kubenswrapper[5184]: I0312 17:04:19.669071 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-774d88f846-k2qfn"] Mar 12 17:04:20 crc kubenswrapper[5184]: I0312 17:04:20.202470 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/adae6bbe-80fb-4692-b73f-402356ce10c4-memberlist\") pod \"speaker-5ktvq\" (UID: \"adae6bbe-80fb-4692-b73f-402356ce10c4\") " pod="metallb-system/speaker-5ktvq" Mar 12 17:04:20 crc kubenswrapper[5184]: I0312 17:04:20.208583 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/adae6bbe-80fb-4692-b73f-402356ce10c4-memberlist\") pod \"speaker-5ktvq\" (UID: \"adae6bbe-80fb-4692-b73f-402356ce10c4\") " pod="metallb-system/speaker-5ktvq" Mar 12 17:04:20 crc kubenswrapper[5184]: I0312 17:04:20.322582 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-5ktvq" Mar 12 17:04:20 crc kubenswrapper[5184]: W0312 17:04:20.341495 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadae6bbe_80fb_4692_b73f_402356ce10c4.slice/crio-6a75c5f31cf1c6652b485f6fb19bf12f6ab7897af3e88fc2e971afe03e2ca155 WatchSource:0}: Error finding container 6a75c5f31cf1c6652b485f6fb19bf12f6ab7897af3e88fc2e971afe03e2ca155: Status 404 returned error can't find the container with id 6a75c5f31cf1c6652b485f6fb19bf12f6ab7897af3e88fc2e971afe03e2ca155 Mar 12 17:04:20 crc kubenswrapper[5184]: I0312 17:04:20.686075 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-774d88f846-k2qfn" event={"ID":"93ee07df-1490-4be4-97d0-f3f5b20ceb90","Type":"ContainerStarted","Data":"24e1b1f255f54bfa005cda807258b587a8ca38adbaa7580439de1eac7230d8b8"} Mar 12 17:04:20 crc kubenswrapper[5184]: I0312 17:04:20.686143 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-774d88f846-k2qfn" event={"ID":"93ee07df-1490-4be4-97d0-f3f5b20ceb90","Type":"ContainerStarted","Data":"512868b4a86b98dceaf00dc5199751711778c0a16ab01039f1f918a85943ad57"} Mar 12 17:04:20 crc kubenswrapper[5184]: I0312 17:04:20.686165 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-774d88f846-k2qfn" event={"ID":"93ee07df-1490-4be4-97d0-f3f5b20ceb90","Type":"ContainerStarted","Data":"e4927eb8beee298af9449c63b36a3dc27bf0a99d58259bf1bc5c3450d5053ba7"} Mar 12 17:04:20 crc kubenswrapper[5184]: I0312 17:04:20.686192 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="metallb-system/controller-774d88f846-k2qfn" Mar 12 17:04:20 crc kubenswrapper[5184]: I0312 17:04:20.687745 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-5ktvq" 
event={"ID":"adae6bbe-80fb-4692-b73f-402356ce10c4","Type":"ContainerStarted","Data":"b75a7c35a63fedc5820874be22caa9476e614844af1767c5fff31e907572ddc9"} Mar 12 17:04:20 crc kubenswrapper[5184]: I0312 17:04:20.687782 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-5ktvq" event={"ID":"adae6bbe-80fb-4692-b73f-402356ce10c4","Type":"ContainerStarted","Data":"6a75c5f31cf1c6652b485f6fb19bf12f6ab7897af3e88fc2e971afe03e2ca155"} Mar 12 17:04:20 crc kubenswrapper[5184]: I0312 17:04:20.722338 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-774d88f846-k2qfn" podStartSLOduration=2.722319958 podStartE2EDuration="2.722319958s" podCreationTimestamp="2026-03-12 17:04:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:04:20.71987801 +0000 UTC m=+803.261189379" watchObservedRunningTime="2026-03-12 17:04:20.722319958 +0000 UTC m=+803.263631297" Mar 12 17:04:21 crc kubenswrapper[5184]: I0312 17:04:21.699811 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-5ktvq" event={"ID":"adae6bbe-80fb-4692-b73f-402356ce10c4","Type":"ContainerStarted","Data":"798013de7645cf3bcb46934aa64e69c3dea3ef01a3bdce9f33c7ef643f487cfa"} Mar 12 17:04:21 crc kubenswrapper[5184]: I0312 17:04:21.700056 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="metallb-system/speaker-5ktvq" Mar 12 17:04:21 crc kubenswrapper[5184]: I0312 17:04:21.743611 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-5ktvq" podStartSLOduration=3.743584768 podStartE2EDuration="3.743584768s" podCreationTimestamp="2026-03-12 17:04:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:04:21.741849443 +0000 UTC m=+804.283160812" 
watchObservedRunningTime="2026-03-12 17:04:21.743584768 +0000 UTC m=+804.284896127" Mar 12 17:04:31 crc kubenswrapper[5184]: I0312 17:04:31.706077 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-774d88f846-k2qfn" Mar 12 17:04:32 crc kubenswrapper[5184]: I0312 17:04:32.709701 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-5ktvq" Mar 12 17:04:35 crc kubenswrapper[5184]: I0312 17:04:35.581346 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-9w9vg"] Mar 12 17:04:35 crc kubenswrapper[5184]: I0312 17:04:35.591360 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-9w9vg"] Mar 12 17:04:35 crc kubenswrapper[5184]: I0312 17:04:35.591566 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-9w9vg" Mar 12 17:04:35 crc kubenswrapper[5184]: I0312 17:04:35.594397 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack-operators\"/\"kube-root-ca.crt\"" Mar 12 17:04:35 crc kubenswrapper[5184]: I0312 17:04:35.594791 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"openstack-operator-index-dockercfg-fcfg6\"" Mar 12 17:04:35 crc kubenswrapper[5184]: I0312 17:04:35.598765 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack-operators\"/\"openshift-service-ca.crt\"" Mar 12 17:04:35 crc kubenswrapper[5184]: I0312 17:04:35.608027 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnqtb\" (UniqueName: \"kubernetes.io/projected/15766085-9061-4536-9d58-02c6a69db272-kube-api-access-rnqtb\") pod \"openstack-operator-index-9w9vg\" (UID: \"15766085-9061-4536-9d58-02c6a69db272\") " 
pod="openstack-operators/openstack-operator-index-9w9vg" Mar 12 17:04:35 crc kubenswrapper[5184]: I0312 17:04:35.709255 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rnqtb\" (UniqueName: \"kubernetes.io/projected/15766085-9061-4536-9d58-02c6a69db272-kube-api-access-rnqtb\") pod \"openstack-operator-index-9w9vg\" (UID: \"15766085-9061-4536-9d58-02c6a69db272\") " pod="openstack-operators/openstack-operator-index-9w9vg" Mar 12 17:04:35 crc kubenswrapper[5184]: I0312 17:04:35.730969 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnqtb\" (UniqueName: \"kubernetes.io/projected/15766085-9061-4536-9d58-02c6a69db272-kube-api-access-rnqtb\") pod \"openstack-operator-index-9w9vg\" (UID: \"15766085-9061-4536-9d58-02c6a69db272\") " pod="openstack-operators/openstack-operator-index-9w9vg" Mar 12 17:04:35 crc kubenswrapper[5184]: I0312 17:04:35.909350 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-9w9vg" Mar 12 17:04:36 crc kubenswrapper[5184]: I0312 17:04:36.332635 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-9w9vg"] Mar 12 17:04:36 crc kubenswrapper[5184]: I0312 17:04:36.816217 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9w9vg" event={"ID":"15766085-9061-4536-9d58-02c6a69db272","Type":"ContainerStarted","Data":"1f0b51c46221cd21afcafdb7a4b4b3e24366dd6c29f60dd2dc83fd99bedf22a5"} Mar 12 17:04:39 crc kubenswrapper[5184]: I0312 17:04:39.848873 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9w9vg" event={"ID":"15766085-9061-4536-9d58-02c6a69db272","Type":"ContainerStarted","Data":"4f5ebe4746c5611e2afe5fbd76266771fb3be13f4997661dd5879c5d8d26ca06"} Mar 12 17:04:39 crc kubenswrapper[5184]: I0312 17:04:39.879989 5184 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-9w9vg" podStartSLOduration=2.545557784 podStartE2EDuration="4.879924487s" podCreationTimestamp="2026-03-12 17:04:35 +0000 UTC" firstStartedPulling="2026-03-12 17:04:36.34156447 +0000 UTC m=+818.882875809" lastFinishedPulling="2026-03-12 17:04:38.675931133 +0000 UTC m=+821.217242512" observedRunningTime="2026-03-12 17:04:39.87277054 +0000 UTC m=+822.414081939" watchObservedRunningTime="2026-03-12 17:04:39.879924487 +0000 UTC m=+822.421235866" Mar 12 17:04:39 crc kubenswrapper[5184]: I0312 17:04:39.951956 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-9w9vg"] Mar 12 17:04:40 crc kubenswrapper[5184]: I0312 17:04:40.567100 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-wwk4r"] Mar 12 17:04:40 crc kubenswrapper[5184]: I0312 17:04:40.571917 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-wwk4r" Mar 12 17:04:40 crc kubenswrapper[5184]: I0312 17:04:40.572334 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wwk4r"] Mar 12 17:04:40 crc kubenswrapper[5184]: I0312 17:04:40.684673 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdbf7\" (UniqueName: \"kubernetes.io/projected/75472062-da1b-4ccc-9522-e4560eb84997-kube-api-access-hdbf7\") pod \"openstack-operator-index-wwk4r\" (UID: \"75472062-da1b-4ccc-9522-e4560eb84997\") " pod="openstack-operators/openstack-operator-index-wwk4r" Mar 12 17:04:40 crc kubenswrapper[5184]: I0312 17:04:40.786740 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hdbf7\" (UniqueName: \"kubernetes.io/projected/75472062-da1b-4ccc-9522-e4560eb84997-kube-api-access-hdbf7\") pod \"openstack-operator-index-wwk4r\" (UID: \"75472062-da1b-4ccc-9522-e4560eb84997\") " pod="openstack-operators/openstack-operator-index-wwk4r" Mar 12 17:04:40 crc kubenswrapper[5184]: I0312 17:04:40.811453 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdbf7\" (UniqueName: \"kubernetes.io/projected/75472062-da1b-4ccc-9522-e4560eb84997-kube-api-access-hdbf7\") pod \"openstack-operator-index-wwk4r\" (UID: \"75472062-da1b-4ccc-9522-e4560eb84997\") " pod="openstack-operators/openstack-operator-index-wwk4r" Mar 12 17:04:40 crc kubenswrapper[5184]: I0312 17:04:40.900236 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-wwk4r" Mar 12 17:04:41 crc kubenswrapper[5184]: I0312 17:04:41.396333 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wwk4r"] Mar 12 17:04:41 crc kubenswrapper[5184]: I0312 17:04:41.863032 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wwk4r" event={"ID":"75472062-da1b-4ccc-9522-e4560eb84997","Type":"ContainerStarted","Data":"16894299d9f7eec137ca80a4166c431ed483d90c7e8812bb04a2a3fdb4821c88"} Mar 12 17:04:41 crc kubenswrapper[5184]: I0312 17:04:41.863323 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wwk4r" event={"ID":"75472062-da1b-4ccc-9522-e4560eb84997","Type":"ContainerStarted","Data":"8c4041f4af153573d65e366130b768e729dff865b0ecc995bcfc155a16890924"} Mar 12 17:04:41 crc kubenswrapper[5184]: I0312 17:04:41.863263 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-9w9vg" podUID="15766085-9061-4536-9d58-02c6a69db272" containerName="registry-server" containerID="cri-o://4f5ebe4746c5611e2afe5fbd76266771fb3be13f4997661dd5879c5d8d26ca06" gracePeriod=2 Mar 12 17:04:41 crc kubenswrapper[5184]: I0312 17:04:41.885428 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-wwk4r" podStartSLOduration=1.838297447 podStartE2EDuration="1.885409076s" podCreationTimestamp="2026-03-12 17:04:40 +0000 UTC" firstStartedPulling="2026-03-12 17:04:41.400302157 +0000 UTC m=+823.941613496" lastFinishedPulling="2026-03-12 17:04:41.447413776 +0000 UTC m=+823.988725125" observedRunningTime="2026-03-12 17:04:41.879743477 +0000 UTC m=+824.421054816" watchObservedRunningTime="2026-03-12 17:04:41.885409076 +0000 UTC m=+824.426720435" Mar 12 17:04:42 crc kubenswrapper[5184]: I0312 17:04:42.262674 5184 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-9w9vg" Mar 12 17:04:42 crc kubenswrapper[5184]: I0312 17:04:42.419844 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnqtb\" (UniqueName: \"kubernetes.io/projected/15766085-9061-4536-9d58-02c6a69db272-kube-api-access-rnqtb\") pod \"15766085-9061-4536-9d58-02c6a69db272\" (UID: \"15766085-9061-4536-9d58-02c6a69db272\") " Mar 12 17:04:42 crc kubenswrapper[5184]: I0312 17:04:42.426214 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15766085-9061-4536-9d58-02c6a69db272-kube-api-access-rnqtb" (OuterVolumeSpecName: "kube-api-access-rnqtb") pod "15766085-9061-4536-9d58-02c6a69db272" (UID: "15766085-9061-4536-9d58-02c6a69db272"). InnerVolumeSpecName "kube-api-access-rnqtb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:04:42 crc kubenswrapper[5184]: I0312 17:04:42.522448 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rnqtb\" (UniqueName: \"kubernetes.io/projected/15766085-9061-4536-9d58-02c6a69db272-kube-api-access-rnqtb\") on node \"crc\" DevicePath \"\"" Mar 12 17:04:42 crc kubenswrapper[5184]: I0312 17:04:42.875906 5184 generic.go:358] "Generic (PLEG): container finished" podID="15766085-9061-4536-9d58-02c6a69db272" containerID="4f5ebe4746c5611e2afe5fbd76266771fb3be13f4997661dd5879c5d8d26ca06" exitCode=0 Mar 12 17:04:42 crc kubenswrapper[5184]: I0312 17:04:42.876020 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-9w9vg" Mar 12 17:04:42 crc kubenswrapper[5184]: I0312 17:04:42.876034 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9w9vg" event={"ID":"15766085-9061-4536-9d58-02c6a69db272","Type":"ContainerDied","Data":"4f5ebe4746c5611e2afe5fbd76266771fb3be13f4997661dd5879c5d8d26ca06"} Mar 12 17:04:42 crc kubenswrapper[5184]: I0312 17:04:42.876091 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9w9vg" event={"ID":"15766085-9061-4536-9d58-02c6a69db272","Type":"ContainerDied","Data":"1f0b51c46221cd21afcafdb7a4b4b3e24366dd6c29f60dd2dc83fd99bedf22a5"} Mar 12 17:04:42 crc kubenswrapper[5184]: I0312 17:04:42.876113 5184 scope.go:117] "RemoveContainer" containerID="4f5ebe4746c5611e2afe5fbd76266771fb3be13f4997661dd5879c5d8d26ca06" Mar 12 17:04:42 crc kubenswrapper[5184]: I0312 17:04:42.900920 5184 scope.go:117] "RemoveContainer" containerID="4f5ebe4746c5611e2afe5fbd76266771fb3be13f4997661dd5879c5d8d26ca06" Mar 12 17:04:42 crc kubenswrapper[5184]: E0312 17:04:42.901715 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f5ebe4746c5611e2afe5fbd76266771fb3be13f4997661dd5879c5d8d26ca06\": container with ID starting with 4f5ebe4746c5611e2afe5fbd76266771fb3be13f4997661dd5879c5d8d26ca06 not found: ID does not exist" containerID="4f5ebe4746c5611e2afe5fbd76266771fb3be13f4997661dd5879c5d8d26ca06" Mar 12 17:04:42 crc kubenswrapper[5184]: I0312 17:04:42.901757 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f5ebe4746c5611e2afe5fbd76266771fb3be13f4997661dd5879c5d8d26ca06"} err="failed to get container status \"4f5ebe4746c5611e2afe5fbd76266771fb3be13f4997661dd5879c5d8d26ca06\": rpc error: code = NotFound desc = could not find container 
\"4f5ebe4746c5611e2afe5fbd76266771fb3be13f4997661dd5879c5d8d26ca06\": container with ID starting with 4f5ebe4746c5611e2afe5fbd76266771fb3be13f4997661dd5879c5d8d26ca06 not found: ID does not exist" Mar 12 17:04:42 crc kubenswrapper[5184]: I0312 17:04:42.920606 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-9w9vg"] Mar 12 17:04:42 crc kubenswrapper[5184]: I0312 17:04:42.928729 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-9w9vg"] Mar 12 17:04:43 crc kubenswrapper[5184]: I0312 17:04:43.885438 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-frr-k8s/frr-k8s-gfrgb"] Mar 12 17:04:43 crc kubenswrapper[5184]: I0312 17:04:43.886650 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15766085-9061-4536-9d58-02c6a69db272" containerName="registry-server" Mar 12 17:04:43 crc kubenswrapper[5184]: I0312 17:04:43.886669 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="15766085-9061-4536-9d58-02c6a69db272" containerName="registry-server" Mar 12 17:04:43 crc kubenswrapper[5184]: I0312 17:04:43.886813 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="15766085-9061-4536-9d58-02c6a69db272" containerName="registry-server" Mar 12 17:04:43 crc kubenswrapper[5184]: I0312 17:04:43.898216 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-frr-k8s/frr-k8s-gfrgb" Mar 12 17:04:43 crc kubenswrapper[5184]: I0312 17:04:43.900598 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-frr-k8s\"/\"frr-k8s-certs-secret\"" Mar 12 17:04:43 crc kubenswrapper[5184]: I0312 17:04:43.901075 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-frr-k8s\"/\"frr-k8s-daemon-dockercfg-vgtl7\"" Mar 12 17:04:43 crc kubenswrapper[5184]: I0312 17:04:43.903883 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-frr-k8s\"/\"kube-root-ca.crt\"" Mar 12 17:04:43 crc kubenswrapper[5184]: I0312 17:04:43.904239 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-frr-k8s\"/\"openshift-service-ca.crt\"" Mar 12 17:04:43 crc kubenswrapper[5184]: I0312 17:04:43.905328 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-frr-k8s\"/\"frr-startup\"" Mar 12 17:04:43 crc kubenswrapper[5184]: I0312 17:04:43.908109 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-frr-k8s\"/\"env-overrides\"" Mar 12 17:04:44 crc kubenswrapper[5184]: I0312 17:04:44.051304 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d-frr-sockets\") pod \"frr-k8s-gfrgb\" (UID: \"4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d\") " pod="openshift-frr-k8s/frr-k8s-gfrgb" Mar 12 17:04:44 crc kubenswrapper[5184]: I0312 17:04:44.051478 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d-reloader\") pod \"frr-k8s-gfrgb\" (UID: \"4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d\") " pod="openshift-frr-k8s/frr-k8s-gfrgb" Mar 12 17:04:44 crc kubenswrapper[5184]: 
I0312 17:04:44.051689 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d-metrics\") pod \"frr-k8s-gfrgb\" (UID: \"4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d\") " pod="openshift-frr-k8s/frr-k8s-gfrgb" Mar 12 17:04:44 crc kubenswrapper[5184]: I0312 17:04:44.051779 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-status\" (UniqueName: \"kubernetes.io/empty-dir/4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d-frr-status\") pod \"frr-k8s-gfrgb\" (UID: \"4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d\") " pod="openshift-frr-k8s/frr-k8s-gfrgb" Mar 12 17:04:44 crc kubenswrapper[5184]: I0312 17:04:44.051942 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d-frr-startup\") pod \"frr-k8s-gfrgb\" (UID: \"4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d\") " pod="openshift-frr-k8s/frr-k8s-gfrgb" Mar 12 17:04:44 crc kubenswrapper[5184]: I0312 17:04:44.052046 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbsdz\" (UniqueName: \"kubernetes.io/projected/4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d-kube-api-access-bbsdz\") pod \"frr-k8s-gfrgb\" (UID: \"4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d\") " pod="openshift-frr-k8s/frr-k8s-gfrgb" Mar 12 17:04:44 crc kubenswrapper[5184]: I0312 17:04:44.052165 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d-frr-conf\") pod \"frr-k8s-gfrgb\" (UID: \"4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d\") " pod="openshift-frr-k8s/frr-k8s-gfrgb" Mar 12 17:04:44 crc kubenswrapper[5184]: I0312 17:04:44.052196 5184 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d-metrics-certs\") pod \"frr-k8s-gfrgb\" (UID: \"4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d\") " pod="openshift-frr-k8s/frr-k8s-gfrgb" Mar 12 17:04:44 crc kubenswrapper[5184]: I0312 17:04:44.153036 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d-frr-startup\") pod \"frr-k8s-gfrgb\" (UID: \"4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d\") " pod="openshift-frr-k8s/frr-k8s-gfrgb" Mar 12 17:04:44 crc kubenswrapper[5184]: I0312 17:04:44.153095 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bbsdz\" (UniqueName: \"kubernetes.io/projected/4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d-kube-api-access-bbsdz\") pod \"frr-k8s-gfrgb\" (UID: \"4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d\") " pod="openshift-frr-k8s/frr-k8s-gfrgb" Mar 12 17:04:44 crc kubenswrapper[5184]: I0312 17:04:44.153221 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d-frr-conf\") pod \"frr-k8s-gfrgb\" (UID: \"4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d\") " pod="openshift-frr-k8s/frr-k8s-gfrgb" Mar 12 17:04:44 crc kubenswrapper[5184]: I0312 17:04:44.153249 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d-metrics-certs\") pod \"frr-k8s-gfrgb\" (UID: \"4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d\") " pod="openshift-frr-k8s/frr-k8s-gfrgb" Mar 12 17:04:44 crc kubenswrapper[5184]: I0312 17:04:44.153293 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d-frr-sockets\") 
pod \"frr-k8s-gfrgb\" (UID: \"4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d\") " pod="openshift-frr-k8s/frr-k8s-gfrgb" Mar 12 17:04:44 crc kubenswrapper[5184]: E0312 17:04:44.153399 5184 secret.go:189] Couldn't get secret openshift-frr-k8s/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Mar 12 17:04:44 crc kubenswrapper[5184]: I0312 17:04:44.153422 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d-reloader\") pod \"frr-k8s-gfrgb\" (UID: \"4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d\") " pod="openshift-frr-k8s/frr-k8s-gfrgb" Mar 12 17:04:44 crc kubenswrapper[5184]: I0312 17:04:44.153453 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d-metrics\") pod \"frr-k8s-gfrgb\" (UID: \"4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d\") " pod="openshift-frr-k8s/frr-k8s-gfrgb" Mar 12 17:04:44 crc kubenswrapper[5184]: E0312 17:04:44.153474 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d-metrics-certs podName:4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d nodeName:}" failed. No retries permitted until 2026-03-12 17:04:44.653454032 +0000 UTC m=+827.194765371 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d-metrics-certs") pod "frr-k8s-gfrgb" (UID: "4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d") : secret "frr-k8s-certs-secret" not found
Mar 12 17:04:44 crc kubenswrapper[5184]: I0312 17:04:44.153512 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"frr-status\" (UniqueName: \"kubernetes.io/empty-dir/4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d-frr-status\") pod \"frr-k8s-gfrgb\" (UID: \"4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d\") " pod="openshift-frr-k8s/frr-k8s-gfrgb"
Mar 12 17:04:44 crc kubenswrapper[5184]: I0312 17:04:44.153751 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d-frr-conf\") pod \"frr-k8s-gfrgb\" (UID: \"4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d\") " pod="openshift-frr-k8s/frr-k8s-gfrgb"
Mar 12 17:04:44 crc kubenswrapper[5184]: I0312 17:04:44.153797 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d-metrics\") pod \"frr-k8s-gfrgb\" (UID: \"4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d\") " pod="openshift-frr-k8s/frr-k8s-gfrgb"
Mar 12 17:04:44 crc kubenswrapper[5184]: I0312 17:04:44.153827 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d-reloader\") pod \"frr-k8s-gfrgb\" (UID: \"4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d\") " pod="openshift-frr-k8s/frr-k8s-gfrgb"
Mar 12 17:04:44 crc kubenswrapper[5184]: I0312 17:04:44.154081 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d-frr-startup\") pod \"frr-k8s-gfrgb\" (UID: \"4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d\") " pod="openshift-frr-k8s/frr-k8s-gfrgb"
Mar 12 17:04:44 crc kubenswrapper[5184]: I0312 17:04:44.154568 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d-frr-sockets\") pod \"frr-k8s-gfrgb\" (UID: \"4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d\") " pod="openshift-frr-k8s/frr-k8s-gfrgb"
Mar 12 17:04:44 crc kubenswrapper[5184]: I0312 17:04:44.154613 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"frr-status\" (UniqueName: \"kubernetes.io/empty-dir/4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d-frr-status\") pod \"frr-k8s-gfrgb\" (UID: \"4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d\") " pod="openshift-frr-k8s/frr-k8s-gfrgb"
Mar 12 17:04:44 crc kubenswrapper[5184]: I0312 17:04:44.176244 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbsdz\" (UniqueName: \"kubernetes.io/projected/4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d-kube-api-access-bbsdz\") pod \"frr-k8s-gfrgb\" (UID: \"4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d\") " pod="openshift-frr-k8s/frr-k8s-gfrgb"
Mar 12 17:04:44 crc kubenswrapper[5184]: I0312 17:04:44.411405 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15766085-9061-4536-9d58-02c6a69db272" path="/var/lib/kubelet/pods/15766085-9061-4536-9d58-02c6a69db272/volumes"
Mar 12 17:04:44 crc kubenswrapper[5184]: I0312 17:04:44.660086 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d-metrics-certs\") pod \"frr-k8s-gfrgb\" (UID: \"4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d\") " pod="openshift-frr-k8s/frr-k8s-gfrgb"
Mar 12 17:04:44 crc kubenswrapper[5184]: E0312 17:04:44.660346 5184 secret.go:189] Couldn't get secret openshift-frr-k8s/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found
Mar 12 17:04:44 crc kubenswrapper[5184]: E0312 17:04:44.660549 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d-metrics-certs podName:4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d nodeName:}" failed. No retries permitted until 2026-03-12 17:04:45.660508494 +0000 UTC m=+828.201819883 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d-metrics-certs") pod "frr-k8s-gfrgb" (UID: "4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d") : secret "frr-k8s-certs-secret" not found
Mar 12 17:04:45 crc kubenswrapper[5184]: I0312 17:04:45.253000 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-frr-k8s/frr-k8s-webhook-server-bc5694f79-h5h7r"]
Mar 12 17:04:45 crc kubenswrapper[5184]: I0312 17:04:45.260070 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-frr-k8s/frr-k8s-webhook-server-bc5694f79-h5h7r"
Mar 12 17:04:45 crc kubenswrapper[5184]: I0312 17:04:45.262824 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-frr-k8s\"/\"frr-k8s-webhook-server-cert\""
Mar 12 17:04:45 crc kubenswrapper[5184]: I0312 17:04:45.370993 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1862551d-7a27-401a-97d1-10bc1e447eff-cert\") pod \"frr-k8s-webhook-server-bc5694f79-h5h7r\" (UID: \"1862551d-7a27-401a-97d1-10bc1e447eff\") " pod="openshift-frr-k8s/frr-k8s-webhook-server-bc5694f79-h5h7r"
Mar 12 17:04:45 crc kubenswrapper[5184]: I0312 17:04:45.371155 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kpkr\" (UniqueName: \"kubernetes.io/projected/1862551d-7a27-401a-97d1-10bc1e447eff-kube-api-access-8kpkr\") pod \"frr-k8s-webhook-server-bc5694f79-h5h7r\" (UID: \"1862551d-7a27-401a-97d1-10bc1e447eff\") " pod="openshift-frr-k8s/frr-k8s-webhook-server-bc5694f79-h5h7r"
Mar 12 17:04:45 crc kubenswrapper[5184]: I0312 17:04:45.472180 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8kpkr\" (UniqueName: \"kubernetes.io/projected/1862551d-7a27-401a-97d1-10bc1e447eff-kube-api-access-8kpkr\") pod \"frr-k8s-webhook-server-bc5694f79-h5h7r\" (UID: \"1862551d-7a27-401a-97d1-10bc1e447eff\") " pod="openshift-frr-k8s/frr-k8s-webhook-server-bc5694f79-h5h7r"
Mar 12 17:04:45 crc kubenswrapper[5184]: I0312 17:04:45.472339 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1862551d-7a27-401a-97d1-10bc1e447eff-cert\") pod \"frr-k8s-webhook-server-bc5694f79-h5h7r\" (UID: \"1862551d-7a27-401a-97d1-10bc1e447eff\") " pod="openshift-frr-k8s/frr-k8s-webhook-server-bc5694f79-h5h7r"
Mar 12 17:04:45 crc kubenswrapper[5184]: E0312 17:04:45.472593 5184 secret.go:189] Couldn't get secret openshift-frr-k8s/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found
Mar 12 17:04:45 crc kubenswrapper[5184]: E0312 17:04:45.472714 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1862551d-7a27-401a-97d1-10bc1e447eff-cert podName:1862551d-7a27-401a-97d1-10bc1e447eff nodeName:}" failed. No retries permitted until 2026-03-12 17:04:45.972685428 +0000 UTC m=+828.513996807 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1862551d-7a27-401a-97d1-10bc1e447eff-cert") pod "frr-k8s-webhook-server-bc5694f79-h5h7r" (UID: "1862551d-7a27-401a-97d1-10bc1e447eff") : secret "frr-k8s-webhook-server-cert" not found
Mar 12 17:04:45 crc kubenswrapper[5184]: I0312 17:04:45.513424 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kpkr\" (UniqueName: \"kubernetes.io/projected/1862551d-7a27-401a-97d1-10bc1e447eff-kube-api-access-8kpkr\") pod \"frr-k8s-webhook-server-bc5694f79-h5h7r\" (UID: \"1862551d-7a27-401a-97d1-10bc1e447eff\") " pod="openshift-frr-k8s/frr-k8s-webhook-server-bc5694f79-h5h7r"
Mar 12 17:04:45 crc kubenswrapper[5184]: I0312 17:04:45.675307 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d-metrics-certs\") pod \"frr-k8s-gfrgb\" (UID: \"4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d\") " pod="openshift-frr-k8s/frr-k8s-gfrgb"
Mar 12 17:04:45 crc kubenswrapper[5184]: I0312 17:04:45.679951 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d-metrics-certs\") pod \"frr-k8s-gfrgb\" (UID: \"4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d\") " pod="openshift-frr-k8s/frr-k8s-gfrgb"
Mar 12 17:04:45 crc kubenswrapper[5184]: I0312 17:04:45.733811 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-frr-k8s/frr-k8s-gfrgb"
Mar 12 17:04:45 crc kubenswrapper[5184]: I0312 17:04:45.978934 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1862551d-7a27-401a-97d1-10bc1e447eff-cert\") pod \"frr-k8s-webhook-server-bc5694f79-h5h7r\" (UID: \"1862551d-7a27-401a-97d1-10bc1e447eff\") " pod="openshift-frr-k8s/frr-k8s-webhook-server-bc5694f79-h5h7r"
Mar 12 17:04:45 crc kubenswrapper[5184]: I0312 17:04:45.995049 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1862551d-7a27-401a-97d1-10bc1e447eff-cert\") pod \"frr-k8s-webhook-server-bc5694f79-h5h7r\" (UID: \"1862551d-7a27-401a-97d1-10bc1e447eff\") " pod="openshift-frr-k8s/frr-k8s-webhook-server-bc5694f79-h5h7r"
Mar 12 17:04:46 crc kubenswrapper[5184]: I0312 17:04:46.182776 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-frr-k8s/frr-k8s-webhook-server-bc5694f79-h5h7r"
Mar 12 17:04:46 crc kubenswrapper[5184]: W0312 17:04:46.215790 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1862551d_7a27_401a_97d1_10bc1e447eff.slice/crio-dd9cd54c3cf14c9339a02186ba40215e1b9d3c0461361ea489f62e3a4f129806 WatchSource:0}: Error finding container dd9cd54c3cf14c9339a02186ba40215e1b9d3c0461361ea489f62e3a4f129806: Status 404 returned error can't find the container with id dd9cd54c3cf14c9339a02186ba40215e1b9d3c0461361ea489f62e3a4f129806
Mar 12 17:04:46 crc kubenswrapper[5184]: I0312 17:04:46.910135 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-frr-k8s/frr-k8s-gfrgb" event={"ID":"4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d","Type":"ContainerStarted","Data":"5ee92bc2aca28d7fa2ed935f2c643f6c35ebe8d15bbcb4498e6c0db191ff5716"}
Mar 12 17:04:46 crc kubenswrapper[5184]: I0312 17:04:46.911149 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-frr-k8s/frr-k8s-webhook-server-bc5694f79-h5h7r" event={"ID":"1862551d-7a27-401a-97d1-10bc1e447eff","Type":"ContainerStarted","Data":"dd9cd54c3cf14c9339a02186ba40215e1b9d3c0461361ea489f62e3a4f129806"}
Mar 12 17:04:48 crc kubenswrapper[5184]: E0312 17:04:48.319363 5184 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=763061 actualBytes=10240
Mar 12 17:04:50 crc kubenswrapper[5184]: I0312 17:04:50.900767 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-wwk4r"
Mar 12 17:04:50 crc kubenswrapper[5184]: I0312 17:04:50.901102 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/openstack-operator-index-wwk4r"
Mar 12 17:04:50 crc kubenswrapper[5184]: I0312 17:04:50.935540 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-wwk4r"
Mar 12 17:04:50 crc kubenswrapper[5184]: I0312 17:04:50.972623 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-wwk4r"
Mar 12 17:04:51 crc kubenswrapper[5184]: I0312 17:04:51.955723 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-frr-k8s/frr-k8s-webhook-server-bc5694f79-h5h7r" event={"ID":"1862551d-7a27-401a-97d1-10bc1e447eff","Type":"ContainerStarted","Data":"af5f328ca1f01aa916459a1095c5f7584bd0d2164b34f5886de05dc1c3e6af50"}
Mar 12 17:04:51 crc kubenswrapper[5184]: I0312 17:04:51.956214 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-frr-k8s/frr-k8s-webhook-server-bc5694f79-h5h7r"
Mar 12 17:04:51 crc kubenswrapper[5184]: I0312 17:04:51.960281 5184 generic.go:358] "Generic (PLEG): container finished" podID="4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d" containerID="70268412ebc63136d76dbb840f4baf10f323cf02170619aae95baa6ee9bf1438" exitCode=0
Mar 12 17:04:51 crc kubenswrapper[5184]: I0312 17:04:51.960878 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-frr-k8s/frr-k8s-gfrgb" event={"ID":"4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d","Type":"ContainerDied","Data":"70268412ebc63136d76dbb840f4baf10f323cf02170619aae95baa6ee9bf1438"}
Mar 12 17:04:51 crc kubenswrapper[5184]: I0312 17:04:51.980813 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-frr-k8s/frr-k8s-webhook-server-bc5694f79-h5h7r" podStartSLOduration=1.800342109 podStartE2EDuration="6.980782082s" podCreationTimestamp="2026-03-12 17:04:45 +0000 UTC" firstStartedPulling="2026-03-12 17:04:46.218797033 +0000 UTC m=+828.760108382" lastFinishedPulling="2026-03-12 17:04:51.399237016 +0000 UTC m=+833.940548355" observedRunningTime="2026-03-12 17:04:51.976939141 +0000 UTC m=+834.518250510" watchObservedRunningTime="2026-03-12 17:04:51.980782082 +0000 UTC m=+834.522093461"
Mar 12 17:04:52 crc kubenswrapper[5184]: I0312 17:04:52.973246 5184 generic.go:358] "Generic (PLEG): container finished" podID="4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d" containerID="f824088804d7a72bf9acd6f9d17b30052636b2de6df56dd10e5b2512d5db6a13" exitCode=0
Mar 12 17:04:52 crc kubenswrapper[5184]: I0312 17:04:52.973336 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-frr-k8s/frr-k8s-gfrgb" event={"ID":"4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d","Type":"ContainerDied","Data":"f824088804d7a72bf9acd6f9d17b30052636b2de6df56dd10e5b2512d5db6a13"}
Mar 12 17:04:53 crc kubenswrapper[5184]: I0312 17:04:53.022906 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecebrb5h"]
Mar 12 17:04:53 crc kubenswrapper[5184]: I0312 17:04:53.029926 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecebrb5h"
Mar 12 17:04:53 crc kubenswrapper[5184]: I0312 17:04:53.031293 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecebrb5h"]
Mar 12 17:04:53 crc kubenswrapper[5184]: I0312 17:04:53.035701 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"default-dockercfg-7s9xf\""
Mar 12 17:04:53 crc kubenswrapper[5184]: I0312 17:04:53.082811 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/01a6e05d-ea1c-47f7-a88c-073127e41f25-bundle\") pod \"0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecebrb5h\" (UID: \"01a6e05d-ea1c-47f7-a88c-073127e41f25\") " pod="openstack-operators/0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecebrb5h"
Mar 12 17:04:53 crc kubenswrapper[5184]: I0312 17:04:53.082877 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/01a6e05d-ea1c-47f7-a88c-073127e41f25-util\") pod \"0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecebrb5h\" (UID: \"01a6e05d-ea1c-47f7-a88c-073127e41f25\") " pod="openstack-operators/0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecebrb5h"
Mar 12 17:04:53 crc kubenswrapper[5184]: I0312 17:04:53.082917 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj2c9\" (UniqueName: \"kubernetes.io/projected/01a6e05d-ea1c-47f7-a88c-073127e41f25-kube-api-access-rj2c9\") pod \"0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecebrb5h\" (UID: \"01a6e05d-ea1c-47f7-a88c-073127e41f25\") " pod="openstack-operators/0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecebrb5h"
Mar 12 17:04:53 crc kubenswrapper[5184]: I0312 17:04:53.183553 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/01a6e05d-ea1c-47f7-a88c-073127e41f25-bundle\") pod \"0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecebrb5h\" (UID: \"01a6e05d-ea1c-47f7-a88c-073127e41f25\") " pod="openstack-operators/0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecebrb5h"
Mar 12 17:04:53 crc kubenswrapper[5184]: I0312 17:04:53.183924 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/01a6e05d-ea1c-47f7-a88c-073127e41f25-util\") pod \"0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecebrb5h\" (UID: \"01a6e05d-ea1c-47f7-a88c-073127e41f25\") " pod="openstack-operators/0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecebrb5h"
Mar 12 17:04:53 crc kubenswrapper[5184]: I0312 17:04:53.183980 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/01a6e05d-ea1c-47f7-a88c-073127e41f25-bundle\") pod \"0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecebrb5h\" (UID: \"01a6e05d-ea1c-47f7-a88c-073127e41f25\") " pod="openstack-operators/0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecebrb5h"
Mar 12 17:04:53 crc kubenswrapper[5184]: I0312 17:04:53.183985 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rj2c9\" (UniqueName: \"kubernetes.io/projected/01a6e05d-ea1c-47f7-a88c-073127e41f25-kube-api-access-rj2c9\") pod \"0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecebrb5h\" (UID: \"01a6e05d-ea1c-47f7-a88c-073127e41f25\") " pod="openstack-operators/0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecebrb5h"
Mar 12 17:04:53 crc kubenswrapper[5184]: I0312 17:04:53.184472 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/01a6e05d-ea1c-47f7-a88c-073127e41f25-util\") pod \"0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecebrb5h\" (UID: \"01a6e05d-ea1c-47f7-a88c-073127e41f25\") " pod="openstack-operators/0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecebrb5h"
Mar 12 17:04:53 crc kubenswrapper[5184]: I0312 17:04:53.215503 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj2c9\" (UniqueName: \"kubernetes.io/projected/01a6e05d-ea1c-47f7-a88c-073127e41f25-kube-api-access-rj2c9\") pod \"0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecebrb5h\" (UID: \"01a6e05d-ea1c-47f7-a88c-073127e41f25\") " pod="openstack-operators/0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecebrb5h"
Mar 12 17:04:53 crc kubenswrapper[5184]: I0312 17:04:53.391180 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecebrb5h"
Mar 12 17:04:53 crc kubenswrapper[5184]: I0312 17:04:53.806521 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecebrb5h"]
Mar 12 17:04:53 crc kubenswrapper[5184]: I0312 17:04:53.983040 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecebrb5h" event={"ID":"01a6e05d-ea1c-47f7-a88c-073127e41f25","Type":"ContainerStarted","Data":"088d2afd9f5ad0b5e4cf407378b8bc69e03295ece13708e1388336bdac7a36b7"}
Mar 12 17:04:53 crc kubenswrapper[5184]: I0312 17:04:53.983108 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecebrb5h" event={"ID":"01a6e05d-ea1c-47f7-a88c-073127e41f25","Type":"ContainerStarted","Data":"60d84ba01873423f3a88ff38f8ab0e8c78fa63199a5affde2c457a496bb1c5b6"}
Mar 12 17:04:53 crc kubenswrapper[5184]: I0312 17:04:53.986891 5184 generic.go:358] "Generic (PLEG): container finished" podID="4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d" containerID="ac6963b4144932847131535c912e62523ef5d4529e18dda1de363b2a20d70c3f" exitCode=0
Mar 12 17:04:53 crc kubenswrapper[5184]: I0312 17:04:53.987073 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-frr-k8s/frr-k8s-gfrgb" event={"ID":"4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d","Type":"ContainerDied","Data":"ac6963b4144932847131535c912e62523ef5d4529e18dda1de363b2a20d70c3f"}
Mar 12 17:04:54 crc kubenswrapper[5184]: I0312 17:04:54.997970 5184 generic.go:358] "Generic (PLEG): container finished" podID="01a6e05d-ea1c-47f7-a88c-073127e41f25" containerID="088d2afd9f5ad0b5e4cf407378b8bc69e03295ece13708e1388336bdac7a36b7" exitCode=0
Mar 12 17:04:55 crc kubenswrapper[5184]: I0312 17:04:54.998057 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecebrb5h" event={"ID":"01a6e05d-ea1c-47f7-a88c-073127e41f25","Type":"ContainerDied","Data":"088d2afd9f5ad0b5e4cf407378b8bc69e03295ece13708e1388336bdac7a36b7"}
Mar 12 17:04:55 crc kubenswrapper[5184]: I0312 17:04:55.006142 5184 generic.go:358] "Generic (PLEG): container finished" podID="4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d" containerID="737d586186c08f35bab9f2cd6da0c9226fe27d8074504ebd5a464ffcca449099" exitCode=0
Mar 12 17:04:55 crc kubenswrapper[5184]: I0312 17:04:55.006237 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-frr-k8s/frr-k8s-gfrgb" event={"ID":"4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d","Type":"ContainerDied","Data":"737d586186c08f35bab9f2cd6da0c9226fe27d8074504ebd5a464ffcca449099"}
Mar 12 17:04:56 crc kubenswrapper[5184]: I0312 17:04:56.015495 5184 generic.go:358] "Generic (PLEG): container finished" podID="01a6e05d-ea1c-47f7-a88c-073127e41f25" containerID="f6a40935ba86c7516110235245e83704c50fc6eb00be90d1cc0da23326e0bd7b" exitCode=0
Mar 12 17:04:56 crc kubenswrapper[5184]: I0312 17:04:56.015628 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecebrb5h" event={"ID":"01a6e05d-ea1c-47f7-a88c-073127e41f25","Type":"ContainerDied","Data":"f6a40935ba86c7516110235245e83704c50fc6eb00be90d1cc0da23326e0bd7b"}
Mar 12 17:04:56 crc kubenswrapper[5184]: I0312 17:04:56.022097 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-frr-k8s/frr-k8s-gfrgb" event={"ID":"4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d","Type":"ContainerStarted","Data":"455464a149e7246a69dcfcb84841409e403ea318a74e869e67ee40972dbf51c6"}
Mar 12 17:04:56 crc kubenswrapper[5184]: I0312 17:04:56.022153 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-frr-k8s/frr-k8s-gfrgb" event={"ID":"4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d","Type":"ContainerStarted","Data":"677179f066f17afc57ca67ec34385fecb2c96673d291295dc8dd9ea104745d59"}
Mar 12 17:04:56 crc kubenswrapper[5184]: I0312 17:04:56.022173 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-frr-k8s/frr-k8s-gfrgb" event={"ID":"4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d","Type":"ContainerStarted","Data":"8fd35cb9efad927e5c4f192146bf9c7d6a9f4ef0eb028fe515a4fa1f15639f38"}
Mar 12 17:04:56 crc kubenswrapper[5184]: I0312 17:04:56.022187 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-frr-k8s/frr-k8s-gfrgb" event={"ID":"4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d","Type":"ContainerStarted","Data":"ea79c1ef5304b3de8503b6f6dcdfc7d03911b324020b9feb6fb9ba6625d9bcbd"}
Mar 12 17:04:57 crc kubenswrapper[5184]: I0312 17:04:57.037527 5184 generic.go:358] "Generic (PLEG): container finished" podID="01a6e05d-ea1c-47f7-a88c-073127e41f25" containerID="25a085eb2f51550b69d2615a175bcbc3921cabb9d8964ede45a520de89fd7281" exitCode=0
Mar 12 17:04:57 crc kubenswrapper[5184]: I0312 17:04:57.037725 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecebrb5h" event={"ID":"01a6e05d-ea1c-47f7-a88c-073127e41f25","Type":"ContainerDied","Data":"25a085eb2f51550b69d2615a175bcbc3921cabb9d8964ede45a520de89fd7281"}
Mar 12 17:04:57 crc kubenswrapper[5184]: I0312 17:04:57.046019 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-frr-k8s/frr-k8s-gfrgb" event={"ID":"4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d","Type":"ContainerStarted","Data":"a7de5dafc6e1a7df22000221851368393e18bc42a3471e7eda46961c0f956190"}
Mar 12 17:04:57 crc kubenswrapper[5184]: I0312 17:04:57.046087 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-frr-k8s/frr-k8s-gfrgb" event={"ID":"4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d","Type":"ContainerStarted","Data":"39d08f7284a6d0c8ccd8dbf15ef96a9aab16bf48766360aa5f6bef3030318659"}
Mar 12 17:04:57 crc kubenswrapper[5184]: I0312 17:04:57.046099 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-frr-k8s/frr-k8s-gfrgb" event={"ID":"4a6cd8fb-9c58-4eb4-ab2e-bb51fba7b46d","Type":"ContainerStarted","Data":"5253fe015554fe89a6c6b14ff3667083d81ed7411bd3d3b99c35afcf1d4a49fd"}
Mar 12 17:04:57 crc kubenswrapper[5184]: I0312 17:04:57.046170 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-frr-k8s/frr-k8s-gfrgb"
Mar 12 17:04:57 crc kubenswrapper[5184]: I0312 17:04:57.078984 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-frr-k8s/frr-k8s-gfrgb" podStartSLOduration=8.580228756 podStartE2EDuration="14.078965706s" podCreationTimestamp="2026-03-12 17:04:43 +0000 UTC" firstStartedPulling="2026-03-12 17:04:45.908205159 +0000 UTC m=+828.449516508" lastFinishedPulling="2026-03-12 17:04:51.406942119 +0000 UTC m=+833.948253458" observedRunningTime="2026-03-12 17:04:57.075881818 +0000 UTC m=+839.617193157" watchObservedRunningTime="2026-03-12 17:04:57.078965706 +0000 UTC m=+839.620277045"
Mar 12 17:04:58 crc kubenswrapper[5184]: I0312 17:04:58.424277 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecebrb5h"
Mar 12 17:04:58 crc kubenswrapper[5184]: I0312 17:04:58.460562 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rj2c9\" (UniqueName: \"kubernetes.io/projected/01a6e05d-ea1c-47f7-a88c-073127e41f25-kube-api-access-rj2c9\") pod \"01a6e05d-ea1c-47f7-a88c-073127e41f25\" (UID: \"01a6e05d-ea1c-47f7-a88c-073127e41f25\") "
Mar 12 17:04:58 crc kubenswrapper[5184]: I0312 17:04:58.460964 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/01a6e05d-ea1c-47f7-a88c-073127e41f25-util\") pod \"01a6e05d-ea1c-47f7-a88c-073127e41f25\" (UID: \"01a6e05d-ea1c-47f7-a88c-073127e41f25\") "
Mar 12 17:04:58 crc kubenswrapper[5184]: I0312 17:04:58.461100 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/01a6e05d-ea1c-47f7-a88c-073127e41f25-bundle\") pod \"01a6e05d-ea1c-47f7-a88c-073127e41f25\" (UID: \"01a6e05d-ea1c-47f7-a88c-073127e41f25\") "
Mar 12 17:04:58 crc kubenswrapper[5184]: I0312 17:04:58.462340 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01a6e05d-ea1c-47f7-a88c-073127e41f25-bundle" (OuterVolumeSpecName: "bundle") pod "01a6e05d-ea1c-47f7-a88c-073127e41f25" (UID: "01a6e05d-ea1c-47f7-a88c-073127e41f25"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 12 17:04:58 crc kubenswrapper[5184]: I0312 17:04:58.479184 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01a6e05d-ea1c-47f7-a88c-073127e41f25-kube-api-access-rj2c9" (OuterVolumeSpecName: "kube-api-access-rj2c9") pod "01a6e05d-ea1c-47f7-a88c-073127e41f25" (UID: "01a6e05d-ea1c-47f7-a88c-073127e41f25"). InnerVolumeSpecName "kube-api-access-rj2c9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:04:58 crc kubenswrapper[5184]: I0312 17:04:58.484828 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01a6e05d-ea1c-47f7-a88c-073127e41f25-util" (OuterVolumeSpecName: "util") pod "01a6e05d-ea1c-47f7-a88c-073127e41f25" (UID: "01a6e05d-ea1c-47f7-a88c-073127e41f25"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 12 17:04:58 crc kubenswrapper[5184]: I0312 17:04:58.562556 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rj2c9\" (UniqueName: \"kubernetes.io/projected/01a6e05d-ea1c-47f7-a88c-073127e41f25-kube-api-access-rj2c9\") on node \"crc\" DevicePath \"\""
Mar 12 17:04:58 crc kubenswrapper[5184]: I0312 17:04:58.562586 5184 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/01a6e05d-ea1c-47f7-a88c-073127e41f25-util\") on node \"crc\" DevicePath \"\""
Mar 12 17:04:58 crc kubenswrapper[5184]: I0312 17:04:58.562598 5184 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/01a6e05d-ea1c-47f7-a88c-073127e41f25-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 17:04:59 crc kubenswrapper[5184]: I0312 17:04:59.063460 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecebrb5h" event={"ID":"01a6e05d-ea1c-47f7-a88c-073127e41f25","Type":"ContainerDied","Data":"60d84ba01873423f3a88ff38f8ab0e8c78fa63199a5affde2c457a496bb1c5b6"}
Mar 12 17:04:59 crc kubenswrapper[5184]: I0312 17:04:59.063513 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecebrb5h"
Mar 12 17:04:59 crc kubenswrapper[5184]: I0312 17:04:59.063521 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60d84ba01873423f3a88ff38f8ab0e8c78fa63199a5affde2c457a496bb1c5b6"
Mar 12 17:04:59 crc kubenswrapper[5184]: I0312 17:04:59.063904 5184 scope.go:117] "RemoveContainer" containerID="47e8322adaabbd88b5655e48f068f7736ae022676f8e04bbd22876b7c5c1ce5b"
Mar 12 17:05:00 crc kubenswrapper[5184]: I0312 17:05:00.734997 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-frr-k8s/frr-k8s-gfrgb"
Mar 12 17:05:00 crc kubenswrapper[5184]: I0312 17:05:00.809074 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-frr-k8s/frr-k8s-gfrgb"
Mar 12 17:05:02 crc kubenswrapper[5184]: I0312 17:05:02.981783 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-frr-k8s/frr-k8s-webhook-server-bc5694f79-h5h7r"
Mar 12 17:05:04 crc kubenswrapper[5184]: I0312 17:05:04.934563 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-79c564bd4f-hcmts"]
Mar 12 17:05:04 crc kubenswrapper[5184]: I0312 17:05:04.935292 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01a6e05d-ea1c-47f7-a88c-073127e41f25" containerName="util"
Mar 12 17:05:04 crc kubenswrapper[5184]: I0312 17:05:04.935308 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="01a6e05d-ea1c-47f7-a88c-073127e41f25" containerName="util"
Mar 12 17:05:04 crc kubenswrapper[5184]: I0312 17:05:04.935335 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01a6e05d-ea1c-47f7-a88c-073127e41f25" containerName="extract"
Mar 12 17:05:04 crc kubenswrapper[5184]: I0312 17:05:04.935341 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="01a6e05d-ea1c-47f7-a88c-073127e41f25" containerName="extract"
Mar 12 17:05:04 crc kubenswrapper[5184]: I0312 17:05:04.935359 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01a6e05d-ea1c-47f7-a88c-073127e41f25" containerName="pull"
Mar 12 17:05:04 crc kubenswrapper[5184]: I0312 17:05:04.935365 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="01a6e05d-ea1c-47f7-a88c-073127e41f25" containerName="pull"
Mar 12 17:05:04 crc kubenswrapper[5184]: I0312 17:05:04.935506 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="01a6e05d-ea1c-47f7-a88c-073127e41f25" containerName="extract"
Mar 12 17:05:04 crc kubenswrapper[5184]: I0312 17:05:04.955491 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-79c564bd4f-hcmts"
Mar 12 17:05:05 crc kubenswrapper[5184]: I0312 17:05:04.957620 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"openstack-operator-controller-init-dockercfg-rdg98\""
Mar 12 17:05:05 crc kubenswrapper[5184]: I0312 17:05:04.959451 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-79c564bd4f-hcmts"]
Mar 12 17:05:05 crc kubenswrapper[5184]: I0312 17:05:05.064757 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq9p2\" (UniqueName: \"kubernetes.io/projected/757d5777-73a0-4653-8eba-303a5a8552ec-kube-api-access-kq9p2\") pod \"openstack-operator-controller-init-79c564bd4f-hcmts\" (UID: \"757d5777-73a0-4653-8eba-303a5a8552ec\") " pod="openstack-operators/openstack-operator-controller-init-79c564bd4f-hcmts"
Mar 12 17:05:05 crc kubenswrapper[5184]: I0312 17:05:05.167448 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kq9p2\" (UniqueName: \"kubernetes.io/projected/757d5777-73a0-4653-8eba-303a5a8552ec-kube-api-access-kq9p2\") pod \"openstack-operator-controller-init-79c564bd4f-hcmts\" (UID: \"757d5777-73a0-4653-8eba-303a5a8552ec\") " pod="openstack-operators/openstack-operator-controller-init-79c564bd4f-hcmts"
Mar 12 17:05:05 crc kubenswrapper[5184]: I0312 17:05:05.199672 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq9p2\" (UniqueName: \"kubernetes.io/projected/757d5777-73a0-4653-8eba-303a5a8552ec-kube-api-access-kq9p2\") pod \"openstack-operator-controller-init-79c564bd4f-hcmts\" (UID: \"757d5777-73a0-4653-8eba-303a5a8552ec\") " pod="openstack-operators/openstack-operator-controller-init-79c564bd4f-hcmts"
Mar 12 17:05:05 crc kubenswrapper[5184]: I0312 17:05:05.341348 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-79c564bd4f-hcmts"
Mar 12 17:05:05 crc kubenswrapper[5184]: W0312 17:05:05.761344 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod757d5777_73a0_4653_8eba_303a5a8552ec.slice/crio-952391d396ae19abe0503d6daf017f4270ccdaf43c59b1c71e190cbe5faf251b WatchSource:0}: Error finding container 952391d396ae19abe0503d6daf017f4270ccdaf43c59b1c71e190cbe5faf251b: Status 404 returned error can't find the container with id 952391d396ae19abe0503d6daf017f4270ccdaf43c59b1c71e190cbe5faf251b
Mar 12 17:05:05 crc kubenswrapper[5184]: I0312 17:05:05.781316 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-79c564bd4f-hcmts"]
Mar 12 17:05:06 crc kubenswrapper[5184]: I0312 17:05:06.124351 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-79c564bd4f-hcmts" event={"ID":"757d5777-73a0-4653-8eba-303a5a8552ec","Type":"ContainerStarted","Data":"952391d396ae19abe0503d6daf017f4270ccdaf43c59b1c71e190cbe5faf251b"}
Mar 12 17:05:11 crc kubenswrapper[5184]: I0312 17:05:11.082602 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-frr-k8s/frr-k8s-gfrgb"
Mar 12 17:05:12 crc kubenswrapper[5184]: I0312 17:05:12.182622 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-79c564bd4f-hcmts" event={"ID":"757d5777-73a0-4653-8eba-303a5a8552ec","Type":"ContainerStarted","Data":"fa1c1a13e6479a7486d015eed416a0c0505c8ec6fd658abad586b001c3fb961b"}
Mar 12 17:05:12 crc kubenswrapper[5184]: I0312 17:05:12.182866 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/openstack-operator-controller-init-79c564bd4f-hcmts"
Mar 12 17:05:12 crc kubenswrapper[5184]: I0312 17:05:12.211827 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-79c564bd4f-hcmts" podStartSLOduration=2.936088533 podStartE2EDuration="8.211796947s" podCreationTimestamp="2026-03-12 17:05:04 +0000 UTC" firstStartedPulling="2026-03-12 17:05:05.768170989 +0000 UTC m=+848.309482338" lastFinishedPulling="2026-03-12 17:05:11.043879413 +0000 UTC m=+853.585190752" observedRunningTime="2026-03-12 17:05:12.203227756 +0000 UTC m=+854.744539115" watchObservedRunningTime="2026-03-12 17:05:12.211796947 +0000 UTC m=+854.753108316"
Mar 12 17:05:20 crc kubenswrapper[5184]: I0312 17:05:20.742536 5184 patch_prober.go:28] interesting pod/machine-config-daemon-cp7pt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 12 17:05:20 crc kubenswrapper[5184]: I0312 17:05:20.743077 5184 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 12 17:05:23 crc kubenswrapper[5184]: I0312 17:05:23.190597 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-79c564bd4f-hcmts"
Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.600726 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-68b4f9dfcc-twj9v"]
Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.636489 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-68b4f9dfcc-twj9v"]
Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.636543 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6564988d95-s8gjk"]
Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.638891 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-68b4f9dfcc-twj9v" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.642097 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"barbican-operator-controller-manager-dockercfg-d6crq\"" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.644504 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6564988d95-s8gjk"] Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.644532 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-c845c877d-sh7h4"] Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.644707 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6564988d95-s8gjk" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.646920 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"cinder-operator-controller-manager-dockercfg-g78dz\"" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.647596 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-c845c877d-sh7h4" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.649508 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"designate-operator-controller-manager-dockercfg-2bpjm\"" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.650476 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-c845c877d-sh7h4"] Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.656172 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-6f84c59bb4-9vdqk"] Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.660453 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-6f84c59bb4-9vdqk" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.665225 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"glance-operator-controller-manager-dockercfg-lv2zm\"" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.671614 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-6f84c59bb4-9vdqk"] Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.707894 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d9587945-trclj"] Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.712974 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69d9587945-trclj" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.715042 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"heat-operator-controller-manager-dockercfg-rsfqv\"" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.722975 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d9587945-trclj"] Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.739937 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-776f58c496-nxd4w"] Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.744493 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-776f58c496-nxd4w" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.748132 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"horizon-operator-controller-manager-dockercfg-l55r6\"" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.768824 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-776f58c496-nxd4w"] Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.773484 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-54654cd4c7-x7588"] Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.774369 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wddkv\" (UniqueName: \"kubernetes.io/projected/71cd8922-2260-4302-b49f-8ebb0084bc3a-kube-api-access-wddkv\") pod \"designate-operator-controller-manager-c845c877d-sh7h4\" (UID: \"71cd8922-2260-4302-b49f-8ebb0084bc3a\") " 
pod="openstack-operators/designate-operator-controller-manager-c845c877d-sh7h4" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.774474 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ft9v\" (UniqueName: \"kubernetes.io/projected/9aacc6d0-007b-4eff-95c0-1e6347226980-kube-api-access-2ft9v\") pod \"glance-operator-controller-manager-6f84c59bb4-9vdqk\" (UID: \"9aacc6d0-007b-4eff-95c0-1e6347226980\") " pod="openstack-operators/glance-operator-controller-manager-6f84c59bb4-9vdqk" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.774506 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62x5m\" (UniqueName: \"kubernetes.io/projected/d1faa83d-1fb7-4c0a-8358-6b02b46d6c9f-kube-api-access-62x5m\") pod \"heat-operator-controller-manager-69d9587945-trclj\" (UID: \"d1faa83d-1fb7-4c0a-8358-6b02b46d6c9f\") " pod="openstack-operators/heat-operator-controller-manager-69d9587945-trclj" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.774546 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzlmb\" (UniqueName: \"kubernetes.io/projected/66eb4c90-9578-461e-aaae-6385546ed865-kube-api-access-gzlmb\") pod \"barbican-operator-controller-manager-68b4f9dfcc-twj9v\" (UID: \"66eb4c90-9578-461e-aaae-6385546ed865\") " pod="openstack-operators/barbican-operator-controller-manager-68b4f9dfcc-twj9v" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.774597 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7l8v\" (UniqueName: \"kubernetes.io/projected/ba537fdf-14f9-47e1-a8c6-4732c4d9dfeb-kube-api-access-m7l8v\") pod \"cinder-operator-controller-manager-6564988d95-s8gjk\" (UID: \"ba537fdf-14f9-47e1-a8c6-4732c4d9dfeb\") " pod="openstack-operators/cinder-operator-controller-manager-6564988d95-s8gjk" Mar 12 
17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.777798 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-54654cd4c7-x7588" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.782744 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"infra-operator-controller-manager-dockercfg-6pxpn\"" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.782812 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"infra-operator-webhook-server-cert\"" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.804955 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-579966755f-k6ws5"] Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.808992 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-579966755f-k6ws5" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.808993 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-54654cd4c7-x7588"] Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.823488 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"ironic-operator-controller-manager-dockercfg-g8k7k\"" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.835441 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-849569668d-fm84v"] Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.848265 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-579966755f-k6ws5"] Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.848300 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/keystone-operator-controller-manager-849569668d-fm84v"] Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.848420 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-849569668d-fm84v" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.862435 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-847cdc49c9-7smvt"] Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.864622 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"keystone-operator-controller-manager-dockercfg-ff8vz\"" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.866779 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-847cdc49c9-7smvt" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.871273 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"manila-operator-controller-manager-dockercfg-ctld9\"" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.876464 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wddkv\" (UniqueName: \"kubernetes.io/projected/71cd8922-2260-4302-b49f-8ebb0084bc3a-kube-api-access-wddkv\") pod \"designate-operator-controller-manager-c845c877d-sh7h4\" (UID: \"71cd8922-2260-4302-b49f-8ebb0084bc3a\") " pod="openstack-operators/designate-operator-controller-manager-c845c877d-sh7h4" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.876964 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94wdw\" (UniqueName: \"kubernetes.io/projected/4efa9263-cab7-4221-b570-90c929ebf82b-kube-api-access-94wdw\") pod \"ironic-operator-controller-manager-579966755f-k6ws5\" (UID: 
\"4efa9263-cab7-4221-b570-90c929ebf82b\") " pod="openstack-operators/ironic-operator-controller-manager-579966755f-k6ws5" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.877018 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2ft9v\" (UniqueName: \"kubernetes.io/projected/9aacc6d0-007b-4eff-95c0-1e6347226980-kube-api-access-2ft9v\") pod \"glance-operator-controller-manager-6f84c59bb4-9vdqk\" (UID: \"9aacc6d0-007b-4eff-95c0-1e6347226980\") " pod="openstack-operators/glance-operator-controller-manager-6f84c59bb4-9vdqk" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.877048 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-62x5m\" (UniqueName: \"kubernetes.io/projected/d1faa83d-1fb7-4c0a-8358-6b02b46d6c9f-kube-api-access-62x5m\") pod \"heat-operator-controller-manager-69d9587945-trclj\" (UID: \"d1faa83d-1fb7-4c0a-8358-6b02b46d6c9f\") " pod="openstack-operators/heat-operator-controller-manager-69d9587945-trclj" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.877107 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gzlmb\" (UniqueName: \"kubernetes.io/projected/66eb4c90-9578-461e-aaae-6385546ed865-kube-api-access-gzlmb\") pod \"barbican-operator-controller-manager-68b4f9dfcc-twj9v\" (UID: \"66eb4c90-9578-461e-aaae-6385546ed865\") " pod="openstack-operators/barbican-operator-controller-manager-68b4f9dfcc-twj9v" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.877145 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m7l8v\" (UniqueName: \"kubernetes.io/projected/ba537fdf-14f9-47e1-a8c6-4732c4d9dfeb-kube-api-access-m7l8v\") pod \"cinder-operator-controller-manager-6564988d95-s8gjk\" (UID: \"ba537fdf-14f9-47e1-a8c6-4732c4d9dfeb\") " pod="openstack-operators/cinder-operator-controller-manager-6564988d95-s8gjk" Mar 12 17:05:43 crc kubenswrapper[5184]: 
I0312 17:05:43.877254 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsw7h\" (UniqueName: \"kubernetes.io/projected/a9fa8671-f968-45fd-a5bc-fe439e771792-kube-api-access-wsw7h\") pod \"infra-operator-controller-manager-54654cd4c7-x7588\" (UID: \"a9fa8671-f968-45fd-a5bc-fe439e771792\") " pod="openstack-operators/infra-operator-controller-manager-54654cd4c7-x7588" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.877300 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a9fa8671-f968-45fd-a5bc-fe439e771792-cert\") pod \"infra-operator-controller-manager-54654cd4c7-x7588\" (UID: \"a9fa8671-f968-45fd-a5bc-fe439e771792\") " pod="openstack-operators/infra-operator-controller-manager-54654cd4c7-x7588" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.877367 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vv7q\" (UniqueName: \"kubernetes.io/projected/b264a369-29b1-4524-b1ef-ea0d61042e1b-kube-api-access-4vv7q\") pod \"horizon-operator-controller-manager-776f58c496-nxd4w\" (UID: \"b264a369-29b1-4524-b1ef-ea0d61042e1b\") " pod="openstack-operators/horizon-operator-controller-manager-776f58c496-nxd4w" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.888358 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-847cdc49c9-7smvt"] Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.901485 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-698d4c86bf-qhf48"] Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.910763 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-698d4c86bf-qhf48" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.919568 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"mariadb-operator-controller-manager-dockercfg-2pk4q\"" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.919782 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-698d4c86bf-qhf48"] Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.970948 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7l8v\" (UniqueName: \"kubernetes.io/projected/ba537fdf-14f9-47e1-a8c6-4732c4d9dfeb-kube-api-access-m7l8v\") pod \"cinder-operator-controller-manager-6564988d95-s8gjk\" (UID: \"ba537fdf-14f9-47e1-a8c6-4732c4d9dfeb\") " pod="openstack-operators/cinder-operator-controller-manager-6564988d95-s8gjk" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.971111 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6564988d95-s8gjk" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.971598 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-62x5m\" (UniqueName: \"kubernetes.io/projected/d1faa83d-1fb7-4c0a-8358-6b02b46d6c9f-kube-api-access-62x5m\") pod \"heat-operator-controller-manager-69d9587945-trclj\" (UID: \"d1faa83d-1fb7-4c0a-8358-6b02b46d6c9f\") " pod="openstack-operators/heat-operator-controller-manager-69d9587945-trclj" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.972277 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-785ff4d9b5-jvgbj"] Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.980058 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wddkv\" (UniqueName: \"kubernetes.io/projected/71cd8922-2260-4302-b49f-8ebb0084bc3a-kube-api-access-wddkv\") pod \"designate-operator-controller-manager-c845c877d-sh7h4\" (UID: \"71cd8922-2260-4302-b49f-8ebb0084bc3a\") " pod="openstack-operators/designate-operator-controller-manager-c845c877d-sh7h4" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.984574 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4vv7q\" (UniqueName: \"kubernetes.io/projected/b264a369-29b1-4524-b1ef-ea0d61042e1b-kube-api-access-4vv7q\") pod \"horizon-operator-controller-manager-776f58c496-nxd4w\" (UID: \"b264a369-29b1-4524-b1ef-ea0d61042e1b\") " pod="openstack-operators/horizon-operator-controller-manager-776f58c496-nxd4w" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.984633 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7h57\" (UniqueName: \"kubernetes.io/projected/f6a852ac-a01c-467a-96c7-d65549b557ad-kube-api-access-f7h57\") pod 
\"manila-operator-controller-manager-847cdc49c9-7smvt\" (UID: \"f6a852ac-a01c-467a-96c7-d65549b557ad\") " pod="openstack-operators/manila-operator-controller-manager-847cdc49c9-7smvt" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.984676 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-94wdw\" (UniqueName: \"kubernetes.io/projected/4efa9263-cab7-4221-b570-90c929ebf82b-kube-api-access-94wdw\") pod \"ironic-operator-controller-manager-579966755f-k6ws5\" (UID: \"4efa9263-cab7-4221-b570-90c929ebf82b\") " pod="openstack-operators/ironic-operator-controller-manager-579966755f-k6ws5" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.984729 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tljkt\" (UniqueName: \"kubernetes.io/projected/325b0a39-7766-4c7a-a5b7-c29551f18550-kube-api-access-tljkt\") pod \"keystone-operator-controller-manager-849569668d-fm84v\" (UID: \"325b0a39-7766-4c7a-a5b7-c29551f18550\") " pod="openstack-operators/keystone-operator-controller-manager-849569668d-fm84v" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.984789 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2qwv\" (UniqueName: \"kubernetes.io/projected/cab9970b-99b6-4c36-a816-4cbe9ca206f8-kube-api-access-z2qwv\") pod \"mariadb-operator-controller-manager-698d4c86bf-qhf48\" (UID: \"cab9970b-99b6-4c36-a816-4cbe9ca206f8\") " pod="openstack-operators/mariadb-operator-controller-manager-698d4c86bf-qhf48" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.984831 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wsw7h\" (UniqueName: \"kubernetes.io/projected/a9fa8671-f968-45fd-a5bc-fe439e771792-kube-api-access-wsw7h\") pod \"infra-operator-controller-manager-54654cd4c7-x7588\" (UID: \"a9fa8671-f968-45fd-a5bc-fe439e771792\") " 
pod="openstack-operators/infra-operator-controller-manager-54654cd4c7-x7588" Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.984862 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a9fa8671-f968-45fd-a5bc-fe439e771792-cert\") pod \"infra-operator-controller-manager-54654cd4c7-x7588\" (UID: \"a9fa8671-f968-45fd-a5bc-fe439e771792\") " pod="openstack-operators/infra-operator-controller-manager-54654cd4c7-x7588" Mar 12 17:05:43 crc kubenswrapper[5184]: E0312 17:05:43.985002 5184 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 12 17:05:43 crc kubenswrapper[5184]: E0312 17:05:43.985064 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9fa8671-f968-45fd-a5bc-fe439e771792-cert podName:a9fa8671-f968-45fd-a5bc-fe439e771792 nodeName:}" failed. No retries permitted until 2026-03-12 17:05:44.485044685 +0000 UTC m=+887.026356024 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a9fa8671-f968-45fd-a5bc-fe439e771792-cert") pod "infra-operator-controller-manager-54654cd4c7-x7588" (UID: "a9fa8671-f968-45fd-a5bc-fe439e771792") : secret "infra-operator-webhook-server-cert" not found Mar 12 17:05:43 crc kubenswrapper[5184]: I0312 17:05:43.986676 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-c845c877d-sh7h4" Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.038166 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69d9587945-trclj" Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.043918 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsw7h\" (UniqueName: \"kubernetes.io/projected/a9fa8671-f968-45fd-a5bc-fe439e771792-kube-api-access-wsw7h\") pod \"infra-operator-controller-manager-54654cd4c7-x7588\" (UID: \"a9fa8671-f968-45fd-a5bc-fe439e771792\") " pod="openstack-operators/infra-operator-controller-manager-54654cd4c7-x7588" Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.047567 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ft9v\" (UniqueName: \"kubernetes.io/projected/9aacc6d0-007b-4eff-95c0-1e6347226980-kube-api-access-2ft9v\") pod \"glance-operator-controller-manager-6f84c59bb4-9vdqk\" (UID: \"9aacc6d0-007b-4eff-95c0-1e6347226980\") " pod="openstack-operators/glance-operator-controller-manager-6f84c59bb4-9vdqk" Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.050022 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5f84d557f9-hvp27"] Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.055156 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzlmb\" (UniqueName: \"kubernetes.io/projected/66eb4c90-9578-461e-aaae-6385546ed865-kube-api-access-gzlmb\") pod \"barbican-operator-controller-manager-68b4f9dfcc-twj9v\" (UID: \"66eb4c90-9578-461e-aaae-6385546ed865\") " pod="openstack-operators/barbican-operator-controller-manager-68b4f9dfcc-twj9v" Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.057871 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-94wdw\" (UniqueName: \"kubernetes.io/projected/4efa9263-cab7-4221-b570-90c929ebf82b-kube-api-access-94wdw\") pod \"ironic-operator-controller-manager-579966755f-k6ws5\" (UID: 
\"4efa9263-cab7-4221-b570-90c929ebf82b\") " pod="openstack-operators/ironic-operator-controller-manager-579966755f-k6ws5" Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.059796 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-785ff4d9b5-jvgbj" Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.061453 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vv7q\" (UniqueName: \"kubernetes.io/projected/b264a369-29b1-4524-b1ef-ea0d61042e1b-kube-api-access-4vv7q\") pod \"horizon-operator-controller-manager-776f58c496-nxd4w\" (UID: \"b264a369-29b1-4524-b1ef-ea0d61042e1b\") " pod="openstack-operators/horizon-operator-controller-manager-776f58c496-nxd4w" Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.066274 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"neutron-operator-controller-manager-dockercfg-w4fmm\"" Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.066786 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-776f58c496-nxd4w" Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.086743 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tljkt\" (UniqueName: \"kubernetes.io/projected/325b0a39-7766-4c7a-a5b7-c29551f18550-kube-api-access-tljkt\") pod \"keystone-operator-controller-manager-849569668d-fm84v\" (UID: \"325b0a39-7766-4c7a-a5b7-c29551f18550\") " pod="openstack-operators/keystone-operator-controller-manager-849569668d-fm84v" Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.086804 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2qwv\" (UniqueName: \"kubernetes.io/projected/cab9970b-99b6-4c36-a816-4cbe9ca206f8-kube-api-access-z2qwv\") pod \"mariadb-operator-controller-manager-698d4c86bf-qhf48\" (UID: \"cab9970b-99b6-4c36-a816-4cbe9ca206f8\") " pod="openstack-operators/mariadb-operator-controller-manager-698d4c86bf-qhf48" Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.087123 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7h57\" (UniqueName: \"kubernetes.io/projected/f6a852ac-a01c-467a-96c7-d65549b557ad-kube-api-access-f7h57\") pod \"manila-operator-controller-manager-847cdc49c9-7smvt\" (UID: \"f6a852ac-a01c-467a-96c7-d65549b557ad\") " pod="openstack-operators/manila-operator-controller-manager-847cdc49c9-7smvt" Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.124434 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2qwv\" (UniqueName: \"kubernetes.io/projected/cab9970b-99b6-4c36-a816-4cbe9ca206f8-kube-api-access-z2qwv\") pod \"mariadb-operator-controller-manager-698d4c86bf-qhf48\" (UID: \"cab9970b-99b6-4c36-a816-4cbe9ca206f8\") " pod="openstack-operators/mariadb-operator-controller-manager-698d4c86bf-qhf48" Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 
17:05:44.125838 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-785ff4d9b5-jvgbj"]
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.125870 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5f84d557f9-hvp27"]
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.125881 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-8664bfd6f-t7d7f"]
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.128762 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tljkt\" (UniqueName: \"kubernetes.io/projected/325b0a39-7766-4c7a-a5b7-c29551f18550-kube-api-access-tljkt\") pod \"keystone-operator-controller-manager-849569668d-fm84v\" (UID: \"325b0a39-7766-4c7a-a5b7-c29551f18550\") " pod="openstack-operators/keystone-operator-controller-manager-849569668d-fm84v"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.130024 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-8664bfd6f-t7d7f"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.130474 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5f84d557f9-hvp27"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.133206 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"nova-operator-controller-manager-dockercfg-phd7z\""
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.133364 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"octavia-operator-controller-manager-dockercfg-xqrtv\""
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.137958 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-579966755f-k6ws5"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.139525 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7h57\" (UniqueName: \"kubernetes.io/projected/f6a852ac-a01c-467a-96c7-d65549b557ad-kube-api-access-f7h57\") pod \"manila-operator-controller-manager-847cdc49c9-7smvt\" (UID: \"f6a852ac-a01c-467a-96c7-d65549b557ad\") " pod="openstack-operators/manila-operator-controller-manager-847cdc49c9-7smvt"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.148116 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-8664bfd6f-t7d7f"]
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.160869 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-dffd87f79-nrd8n"]
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.167949 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dffd87f79-nrd8n"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.171779 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"openstack-baremetal-operator-controller-manager-dockercfg-fs8x5\""
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.171974 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"openstack-baremetal-operator-webhook-server-cert\""
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.172081 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-8558b89bff-8v768"]
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.188323 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjngs\" (UniqueName: \"kubernetes.io/projected/a149730f-e80a-4abf-8efa-29fb5820c9ae-kube-api-access-bjngs\") pod \"octavia-operator-controller-manager-8664bfd6f-t7d7f\" (UID: \"a149730f-e80a-4abf-8efa-29fb5820c9ae\") " pod="openstack-operators/octavia-operator-controller-manager-8664bfd6f-t7d7f"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.188426 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb6sl\" (UniqueName: \"kubernetes.io/projected/ed4f1b6b-e4af-4a56-bb94-5c48640f67ce-kube-api-access-kb6sl\") pod \"neutron-operator-controller-manager-785ff4d9b5-jvgbj\" (UID: \"ed4f1b6b-e4af-4a56-bb94-5c48640f67ce\") " pod="openstack-operators/neutron-operator-controller-manager-785ff4d9b5-jvgbj"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.188450 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vg2q\" (UniqueName: \"kubernetes.io/projected/6f38ee12-d676-43a1-9f44-d347f24dfbda-kube-api-access-6vg2q\") pod \"nova-operator-controller-manager-5f84d557f9-hvp27\" (UID: \"6f38ee12-d676-43a1-9f44-d347f24dfbda\") " pod="openstack-operators/nova-operator-controller-manager-5f84d557f9-hvp27"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.193065 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-8558b89bff-8v768"]
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.193243 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-8558b89bff-8v768"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.195977 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"ovn-operator-controller-manager-dockercfg-45jfd\""
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.201285 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-7dd8b74947-8jj2t"]
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.201621 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-849569668d-fm84v"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.219420 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-7dd8b74947-8jj2t"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.228114 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"placement-operator-controller-manager-dockercfg-ckgv6\""
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.228308 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-dffd87f79-nrd8n"]
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.247505 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-865956cc65-vqfsg"]
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.254072 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-865956cc65-vqfsg"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.256219 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-7dd8b74947-8jj2t"]
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.262039 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-865956cc65-vqfsg"]
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.264090 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"swift-operator-controller-manager-dockercfg-k9tv4\""
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.266130 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-68b4f9dfcc-twj9v"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.271626 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-847cdc49c9-7smvt"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.276674 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd977d774-djcpf"]
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.281799 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd977d774-djcpf"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.284273 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"telemetry-operator-controller-manager-dockercfg-2cnmh\""
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.285483 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd977d774-djcpf"]
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.290605 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kb6sl\" (UniqueName: \"kubernetes.io/projected/ed4f1b6b-e4af-4a56-bb94-5c48640f67ce-kube-api-access-kb6sl\") pod \"neutron-operator-controller-manager-785ff4d9b5-jvgbj\" (UID: \"ed4f1b6b-e4af-4a56-bb94-5c48640f67ce\") " pod="openstack-operators/neutron-operator-controller-manager-785ff4d9b5-jvgbj"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.290639 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6vg2q\" (UniqueName: \"kubernetes.io/projected/6f38ee12-d676-43a1-9f44-d347f24dfbda-kube-api-access-6vg2q\") pod \"nova-operator-controller-manager-5f84d557f9-hvp27\" (UID: \"6f38ee12-d676-43a1-9f44-d347f24dfbda\") " pod="openstack-operators/nova-operator-controller-manager-5f84d557f9-hvp27"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.290687 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9svz\" (UniqueName: \"kubernetes.io/projected/c5bf50f4-0265-46ba-98ad-c8c8245664d4-kube-api-access-f9svz\") pod \"placement-operator-controller-manager-7dd8b74947-8jj2t\" (UID: \"c5bf50f4-0265-46ba-98ad-c8c8245664d4\") " pod="openstack-operators/placement-operator-controller-manager-7dd8b74947-8jj2t"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.290708 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsxsp\" (UniqueName: \"kubernetes.io/projected/97ae1add-72a8-4cfb-8cb4-45b33d39a1b8-kube-api-access-qsxsp\") pod \"openstack-baremetal-operator-controller-manager-dffd87f79-nrd8n\" (UID: \"97ae1add-72a8-4cfb-8cb4-45b33d39a1b8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dffd87f79-nrd8n"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.290743 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nxgk\" (UniqueName: \"kubernetes.io/projected/4fad3169-60c8-49f1-ad8c-ec6fb4282ddc-kube-api-access-2nxgk\") pod \"ovn-operator-controller-manager-8558b89bff-8v768\" (UID: \"4fad3169-60c8-49f1-ad8c-ec6fb4282ddc\") " pod="openstack-operators/ovn-operator-controller-manager-8558b89bff-8v768"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.290784 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bjngs\" (UniqueName: \"kubernetes.io/projected/a149730f-e80a-4abf-8efa-29fb5820c9ae-kube-api-access-bjngs\") pod \"octavia-operator-controller-manager-8664bfd6f-t7d7f\" (UID: \"a149730f-e80a-4abf-8efa-29fb5820c9ae\") " pod="openstack-operators/octavia-operator-controller-manager-8664bfd6f-t7d7f"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.290816 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97ae1add-72a8-4cfb-8cb4-45b33d39a1b8-cert\") pod \"openstack-baremetal-operator-controller-manager-dffd87f79-nrd8n\" (UID: \"97ae1add-72a8-4cfb-8cb4-45b33d39a1b8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dffd87f79-nrd8n"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.298465 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-74d567479f-nrs7s"]
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.300869 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-6f84c59bb4-9vdqk"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.303539 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-74d567479f-nrs7s"]
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.303690 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-74d567479f-nrs7s"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.313936 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjngs\" (UniqueName: \"kubernetes.io/projected/a149730f-e80a-4abf-8efa-29fb5820c9ae-kube-api-access-bjngs\") pod \"octavia-operator-controller-manager-8664bfd6f-t7d7f\" (UID: \"a149730f-e80a-4abf-8efa-29fb5820c9ae\") " pod="openstack-operators/octavia-operator-controller-manager-8664bfd6f-t7d7f"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.314995 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"test-operator-controller-manager-dockercfg-mwnjg\""
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.324332 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb6sl\" (UniqueName: \"kubernetes.io/projected/ed4f1b6b-e4af-4a56-bb94-5c48640f67ce-kube-api-access-kb6sl\") pod \"neutron-operator-controller-manager-785ff4d9b5-jvgbj\" (UID: \"ed4f1b6b-e4af-4a56-bb94-5c48640f67ce\") " pod="openstack-operators/neutron-operator-controller-manager-785ff4d9b5-jvgbj"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.333270 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vg2q\" (UniqueName: \"kubernetes.io/projected/6f38ee12-d676-43a1-9f44-d347f24dfbda-kube-api-access-6vg2q\") pod \"nova-operator-controller-manager-5f84d557f9-hvp27\" (UID: \"6f38ee12-d676-43a1-9f44-d347f24dfbda\") " pod="openstack-operators/nova-operator-controller-manager-5f84d557f9-hvp27"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.338895 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-688f7d67f5-ggjlw"]
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.345024 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-688f7d67f5-ggjlw"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.347745 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"watcher-operator-controller-manager-dockercfg-7fhxl\""
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.348476 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-688f7d67f5-ggjlw"]
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.363602 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-698d4c86bf-qhf48"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.393112 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-58ddd4554c-m4npf"]
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.393558 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dbb5\" (UniqueName: \"kubernetes.io/projected/fcbfcf8d-d998-4e58-8218-04476a811cf1-kube-api-access-4dbb5\") pod \"swift-operator-controller-manager-865956cc65-vqfsg\" (UID: \"fcbfcf8d-d998-4e58-8218-04476a811cf1\") " pod="openstack-operators/swift-operator-controller-manager-865956cc65-vqfsg"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.393599 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2nxgk\" (UniqueName: \"kubernetes.io/projected/4fad3169-60c8-49f1-ad8c-ec6fb4282ddc-kube-api-access-2nxgk\") pod \"ovn-operator-controller-manager-8558b89bff-8v768\" (UID: \"4fad3169-60c8-49f1-ad8c-ec6fb4282ddc\") " pod="openstack-operators/ovn-operator-controller-manager-8558b89bff-8v768"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.393670 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97ae1add-72a8-4cfb-8cb4-45b33d39a1b8-cert\") pod \"openstack-baremetal-operator-controller-manager-dffd87f79-nrd8n\" (UID: \"97ae1add-72a8-4cfb-8cb4-45b33d39a1b8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dffd87f79-nrd8n"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.393694 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjlvd\" (UniqueName: \"kubernetes.io/projected/be48e16f-42da-4910-81d4-1c10498247f7-kube-api-access-pjlvd\") pod \"test-operator-controller-manager-74d567479f-nrs7s\" (UID: \"be48e16f-42da-4910-81d4-1c10498247f7\") " pod="openstack-operators/test-operator-controller-manager-74d567479f-nrs7s"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.393754 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjgmv\" (UniqueName: \"kubernetes.io/projected/94ee0f08-539e-4c3c-a54a-46e35cd20e10-kube-api-access-vjgmv\") pod \"telemetry-operator-controller-manager-6cd977d774-djcpf\" (UID: \"94ee0f08-539e-4c3c-a54a-46e35cd20e10\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd977d774-djcpf"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.393775 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f9svz\" (UniqueName: \"kubernetes.io/projected/c5bf50f4-0265-46ba-98ad-c8c8245664d4-kube-api-access-f9svz\") pod \"placement-operator-controller-manager-7dd8b74947-8jj2t\" (UID: \"c5bf50f4-0265-46ba-98ad-c8c8245664d4\") " pod="openstack-operators/placement-operator-controller-manager-7dd8b74947-8jj2t"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.393793 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qsxsp\" (UniqueName: \"kubernetes.io/projected/97ae1add-72a8-4cfb-8cb4-45b33d39a1b8-kube-api-access-qsxsp\") pod \"openstack-baremetal-operator-controller-manager-dffd87f79-nrd8n\" (UID: \"97ae1add-72a8-4cfb-8cb4-45b33d39a1b8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dffd87f79-nrd8n"
Mar 12 17:05:44 crc kubenswrapper[5184]: E0312 17:05:44.393984 5184 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 12 17:05:44 crc kubenswrapper[5184]: E0312 17:05:44.394040 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97ae1add-72a8-4cfb-8cb4-45b33d39a1b8-cert podName:97ae1add-72a8-4cfb-8cb4-45b33d39a1b8 nodeName:}" failed. No retries permitted until 2026-03-12 17:05:44.894024198 +0000 UTC m=+887.435335537 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/97ae1add-72a8-4cfb-8cb4-45b33d39a1b8-cert") pod "openstack-baremetal-operator-controller-manager-dffd87f79-nrd8n" (UID: "97ae1add-72a8-4cfb-8cb4-45b33d39a1b8") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.406847 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-58ddd4554c-m4npf"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.411996 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"metrics-server-cert\""
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.412066 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"webhook-server-cert\""
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.412210 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"openstack-operator-controller-manager-dockercfg-vqggv\""
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.423138 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-785ff4d9b5-jvgbj"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.433655 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9svz\" (UniqueName: \"kubernetes.io/projected/c5bf50f4-0265-46ba-98ad-c8c8245664d4-kube-api-access-f9svz\") pod \"placement-operator-controller-manager-7dd8b74947-8jj2t\" (UID: \"c5bf50f4-0265-46ba-98ad-c8c8245664d4\") " pod="openstack-operators/placement-operator-controller-manager-7dd8b74947-8jj2t"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.434586 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nxgk\" (UniqueName: \"kubernetes.io/projected/4fad3169-60c8-49f1-ad8c-ec6fb4282ddc-kube-api-access-2nxgk\") pod \"ovn-operator-controller-manager-8558b89bff-8v768\" (UID: \"4fad3169-60c8-49f1-ad8c-ec6fb4282ddc\") " pod="openstack-operators/ovn-operator-controller-manager-8558b89bff-8v768"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.439504 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsxsp\" (UniqueName: \"kubernetes.io/projected/97ae1add-72a8-4cfb-8cb4-45b33d39a1b8-kube-api-access-qsxsp\") pod \"openstack-baremetal-operator-controller-manager-dffd87f79-nrd8n\" (UID: \"97ae1add-72a8-4cfb-8cb4-45b33d39a1b8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dffd87f79-nrd8n"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.461347 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-8664bfd6f-t7d7f"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.463004 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-58ddd4554c-m4npf"]
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.463057 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-85d9b55b6-v89lf"]
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.477624 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-85d9b55b6-v89lf"]
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.477771 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-85d9b55b6-v89lf"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.480455 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5f84d557f9-hvp27"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.493488 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"rabbitmq-cluster-operator-controller-manager-dockercfg-b2gk5\""
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.503102 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp22p\" (UniqueName: \"kubernetes.io/projected/8e771b12-3698-427d-a93a-2293244e2171-kube-api-access-mp22p\") pod \"watcher-operator-controller-manager-688f7d67f5-ggjlw\" (UID: \"8e771b12-3698-427d-a93a-2293244e2171\") " pod="openstack-operators/watcher-operator-controller-manager-688f7d67f5-ggjlw"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.503146 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4dbb5\" (UniqueName: \"kubernetes.io/projected/fcbfcf8d-d998-4e58-8218-04476a811cf1-kube-api-access-4dbb5\") pod \"swift-operator-controller-manager-865956cc65-vqfsg\" (UID: \"fcbfcf8d-d998-4e58-8218-04476a811cf1\") " pod="openstack-operators/swift-operator-controller-manager-865956cc65-vqfsg"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.503259 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8edf3dec-1386-4665-8a1a-ac779905f180-metrics-certs\") pod \"openstack-operator-controller-manager-58ddd4554c-m4npf\" (UID: \"8edf3dec-1386-4665-8a1a-ac779905f180\") " pod="openstack-operators/openstack-operator-controller-manager-58ddd4554c-m4npf"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.503341 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pjlvd\" (UniqueName: \"kubernetes.io/projected/be48e16f-42da-4910-81d4-1c10498247f7-kube-api-access-pjlvd\") pod \"test-operator-controller-manager-74d567479f-nrs7s\" (UID: \"be48e16f-42da-4910-81d4-1c10498247f7\") " pod="openstack-operators/test-operator-controller-manager-74d567479f-nrs7s"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.503417 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhmm4\" (UniqueName: \"kubernetes.io/projected/8edf3dec-1386-4665-8a1a-ac779905f180-kube-api-access-xhmm4\") pod \"openstack-operator-controller-manager-58ddd4554c-m4npf\" (UID: \"8edf3dec-1386-4665-8a1a-ac779905f180\") " pod="openstack-operators/openstack-operator-controller-manager-58ddd4554c-m4npf"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.503450 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a9fa8671-f968-45fd-a5bc-fe439e771792-cert\") pod \"infra-operator-controller-manager-54654cd4c7-x7588\" (UID: \"a9fa8671-f968-45fd-a5bc-fe439e771792\") " pod="openstack-operators/infra-operator-controller-manager-54654cd4c7-x7588"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.503521 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8edf3dec-1386-4665-8a1a-ac779905f180-webhook-certs\") pod \"openstack-operator-controller-manager-58ddd4554c-m4npf\" (UID: \"8edf3dec-1386-4665-8a1a-ac779905f180\") " pod="openstack-operators/openstack-operator-controller-manager-58ddd4554c-m4npf"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.503550 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vjgmv\" (UniqueName: \"kubernetes.io/projected/94ee0f08-539e-4c3c-a54a-46e35cd20e10-kube-api-access-vjgmv\") pod \"telemetry-operator-controller-manager-6cd977d774-djcpf\" (UID: \"94ee0f08-539e-4c3c-a54a-46e35cd20e10\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd977d774-djcpf"
Mar 12 17:05:44 crc kubenswrapper[5184]: E0312 17:05:44.504949 5184 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 12 17:05:44 crc kubenswrapper[5184]: E0312 17:05:44.504995 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9fa8671-f968-45fd-a5bc-fe439e771792-cert podName:a9fa8671-f968-45fd-a5bc-fe439e771792 nodeName:}" failed. No retries permitted until 2026-03-12 17:05:45.504981334 +0000 UTC m=+888.046292673 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a9fa8671-f968-45fd-a5bc-fe439e771792-cert") pod "infra-operator-controller-manager-54654cd4c7-x7588" (UID: "a9fa8671-f968-45fd-a5bc-fe439e771792") : secret "infra-operator-webhook-server-cert" not found
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.519740 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-8558b89bff-8v768"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.527742 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dbb5\" (UniqueName: \"kubernetes.io/projected/fcbfcf8d-d998-4e58-8218-04476a811cf1-kube-api-access-4dbb5\") pod \"swift-operator-controller-manager-865956cc65-vqfsg\" (UID: \"fcbfcf8d-d998-4e58-8218-04476a811cf1\") " pod="openstack-operators/swift-operator-controller-manager-865956cc65-vqfsg"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.528437 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjlvd\" (UniqueName: \"kubernetes.io/projected/be48e16f-42da-4910-81d4-1c10498247f7-kube-api-access-pjlvd\") pod \"test-operator-controller-manager-74d567479f-nrs7s\" (UID: \"be48e16f-42da-4910-81d4-1c10498247f7\") " pod="openstack-operators/test-operator-controller-manager-74d567479f-nrs7s"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.538147 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjgmv\" (UniqueName: \"kubernetes.io/projected/94ee0f08-539e-4c3c-a54a-46e35cd20e10-kube-api-access-vjgmv\") pod \"telemetry-operator-controller-manager-6cd977d774-djcpf\" (UID: \"94ee0f08-539e-4c3c-a54a-46e35cd20e10\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd977d774-djcpf"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.555230 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-7dd8b74947-8jj2t"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.605286 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8edf3dec-1386-4665-8a1a-ac779905f180-metrics-certs\") pod \"openstack-operator-controller-manager-58ddd4554c-m4npf\" (UID: \"8edf3dec-1386-4665-8a1a-ac779905f180\") " pod="openstack-operators/openstack-operator-controller-manager-58ddd4554c-m4npf"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.605401 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8jct\" (UniqueName: \"kubernetes.io/projected/5717f5e6-3e6e-4585-b9a6-bcc31f707080-kube-api-access-g8jct\") pod \"rabbitmq-cluster-operator-manager-85d9b55b6-v89lf\" (UID: \"5717f5e6-3e6e-4585-b9a6-bcc31f707080\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-85d9b55b6-v89lf"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.605456 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xhmm4\" (UniqueName: \"kubernetes.io/projected/8edf3dec-1386-4665-8a1a-ac779905f180-kube-api-access-xhmm4\") pod \"openstack-operator-controller-manager-58ddd4554c-m4npf\" (UID: \"8edf3dec-1386-4665-8a1a-ac779905f180\") " pod="openstack-operators/openstack-operator-controller-manager-58ddd4554c-m4npf"
Mar 12 17:05:44 crc kubenswrapper[5184]: E0312 17:05:44.605495 5184 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.605535 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8edf3dec-1386-4665-8a1a-ac779905f180-webhook-certs\") pod \"openstack-operator-controller-manager-58ddd4554c-m4npf\" (UID: \"8edf3dec-1386-4665-8a1a-ac779905f180\") " pod="openstack-operators/openstack-operator-controller-manager-58ddd4554c-m4npf"
Mar 12 17:05:44 crc kubenswrapper[5184]: E0312 17:05:44.605578 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8edf3dec-1386-4665-8a1a-ac779905f180-metrics-certs podName:8edf3dec-1386-4665-8a1a-ac779905f180 nodeName:}" failed. No retries permitted until 2026-03-12 17:05:45.105555792 +0000 UTC m=+887.646867131 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8edf3dec-1386-4665-8a1a-ac779905f180-metrics-certs") pod "openstack-operator-controller-manager-58ddd4554c-m4npf" (UID: "8edf3dec-1386-4665-8a1a-ac779905f180") : secret "metrics-server-cert" not found
Mar 12 17:05:44 crc kubenswrapper[5184]: E0312 17:05:44.605657 5184 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.605672 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mp22p\" (UniqueName: \"kubernetes.io/projected/8e771b12-3698-427d-a93a-2293244e2171-kube-api-access-mp22p\") pod \"watcher-operator-controller-manager-688f7d67f5-ggjlw\" (UID: \"8e771b12-3698-427d-a93a-2293244e2171\") " pod="openstack-operators/watcher-operator-controller-manager-688f7d67f5-ggjlw"
Mar 12 17:05:44 crc kubenswrapper[5184]: E0312 17:05:44.605722 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8edf3dec-1386-4665-8a1a-ac779905f180-webhook-certs podName:8edf3dec-1386-4665-8a1a-ac779905f180 nodeName:}" failed. No retries permitted until 2026-03-12 17:05:45.105702377 +0000 UTC m=+887.647013806 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8edf3dec-1386-4665-8a1a-ac779905f180-webhook-certs") pod "openstack-operator-controller-manager-58ddd4554c-m4npf" (UID: "8edf3dec-1386-4665-8a1a-ac779905f180") : secret "webhook-server-cert" not found
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.624798 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mp22p\" (UniqueName: \"kubernetes.io/projected/8e771b12-3698-427d-a93a-2293244e2171-kube-api-access-mp22p\") pod \"watcher-operator-controller-manager-688f7d67f5-ggjlw\" (UID: \"8e771b12-3698-427d-a93a-2293244e2171\") " pod="openstack-operators/watcher-operator-controller-manager-688f7d67f5-ggjlw"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.625411 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhmm4\" (UniqueName: \"kubernetes.io/projected/8edf3dec-1386-4665-8a1a-ac779905f180-kube-api-access-xhmm4\") pod \"openstack-operator-controller-manager-58ddd4554c-m4npf\" (UID: \"8edf3dec-1386-4665-8a1a-ac779905f180\") " pod="openstack-operators/openstack-operator-controller-manager-58ddd4554c-m4npf"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.706923 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g8jct\" (UniqueName: \"kubernetes.io/projected/5717f5e6-3e6e-4585-b9a6-bcc31f707080-kube-api-access-g8jct\") pod \"rabbitmq-cluster-operator-manager-85d9b55b6-v89lf\" (UID: \"5717f5e6-3e6e-4585-b9a6-bcc31f707080\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-85d9b55b6-v89lf"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.735646 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8jct\" (UniqueName: \"kubernetes.io/projected/5717f5e6-3e6e-4585-b9a6-bcc31f707080-kube-api-access-g8jct\") pod \"rabbitmq-cluster-operator-manager-85d9b55b6-v89lf\" (UID: \"5717f5e6-3e6e-4585-b9a6-bcc31f707080\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-85d9b55b6-v89lf"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.757648 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-865956cc65-vqfsg"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.814170 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-74d567479f-nrs7s"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.814343 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd977d774-djcpf"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.824394 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-776f58c496-nxd4w"]
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.839613 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-688f7d67f5-ggjlw"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.863738 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-85d9b55b6-v89lf"
Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.911151 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97ae1add-72a8-4cfb-8cb4-45b33d39a1b8-cert\") pod \"openstack-baremetal-operator-controller-manager-dffd87f79-nrd8n\" (UID: \"97ae1add-72a8-4cfb-8cb4-45b33d39a1b8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dffd87f79-nrd8n"
Mar 12 17:05:44 crc kubenswrapper[5184]: E0312 17:05:44.911339 5184 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 12 17:05:44 crc kubenswrapper[5184]: E0312 17:05:44.911416 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97ae1add-72a8-4cfb-8cb4-45b33d39a1b8-cert podName:97ae1add-72a8-4cfb-8cb4-45b33d39a1b8 nodeName:}" failed. No retries permitted until 2026-03-12 17:05:45.911398366 +0000 UTC m=+888.452709705 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/97ae1add-72a8-4cfb-8cb4-45b33d39a1b8-cert") pod "openstack-baremetal-operator-controller-manager-dffd87f79-nrd8n" (UID: "97ae1add-72a8-4cfb-8cb4-45b33d39a1b8") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.939613 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6564988d95-s8gjk"] Mar 12 17:05:44 crc kubenswrapper[5184]: I0312 17:05:44.950248 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d9587945-trclj"] Mar 12 17:05:45 crc kubenswrapper[5184]: I0312 17:05:45.105678 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-849569668d-fm84v"] Mar 12 17:05:45 crc kubenswrapper[5184]: I0312 17:05:45.113029 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-579966755f-k6ws5"] Mar 12 17:05:45 crc kubenswrapper[5184]: I0312 17:05:45.114172 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8edf3dec-1386-4665-8a1a-ac779905f180-metrics-certs\") pod \"openstack-operator-controller-manager-58ddd4554c-m4npf\" (UID: \"8edf3dec-1386-4665-8a1a-ac779905f180\") " pod="openstack-operators/openstack-operator-controller-manager-58ddd4554c-m4npf" Mar 12 17:05:45 crc kubenswrapper[5184]: I0312 17:05:45.114299 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8edf3dec-1386-4665-8a1a-ac779905f180-webhook-certs\") pod \"openstack-operator-controller-manager-58ddd4554c-m4npf\" (UID: \"8edf3dec-1386-4665-8a1a-ac779905f180\") " pod="openstack-operators/openstack-operator-controller-manager-58ddd4554c-m4npf" Mar 
12 17:05:45 crc kubenswrapper[5184]: E0312 17:05:45.114472 5184 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 12 17:05:45 crc kubenswrapper[5184]: E0312 17:05:45.114549 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8edf3dec-1386-4665-8a1a-ac779905f180-metrics-certs podName:8edf3dec-1386-4665-8a1a-ac779905f180 nodeName:}" failed. No retries permitted until 2026-03-12 17:05:46.114527665 +0000 UTC m=+888.655839004 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8edf3dec-1386-4665-8a1a-ac779905f180-metrics-certs") pod "openstack-operator-controller-manager-58ddd4554c-m4npf" (UID: "8edf3dec-1386-4665-8a1a-ac779905f180") : secret "metrics-server-cert" not found Mar 12 17:05:45 crc kubenswrapper[5184]: E0312 17:05:45.114472 5184 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 12 17:05:45 crc kubenswrapper[5184]: E0312 17:05:45.115341 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8edf3dec-1386-4665-8a1a-ac779905f180-webhook-certs podName:8edf3dec-1386-4665-8a1a-ac779905f180 nodeName:}" failed. No retries permitted until 2026-03-12 17:05:46.11532727 +0000 UTC m=+888.656638609 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8edf3dec-1386-4665-8a1a-ac779905f180-webhook-certs") pod "openstack-operator-controller-manager-58ddd4554c-m4npf" (UID: "8edf3dec-1386-4665-8a1a-ac779905f180") : secret "webhook-server-cert" not found Mar 12 17:05:45 crc kubenswrapper[5184]: I0312 17:05:45.118453 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-68b4f9dfcc-twj9v"] Mar 12 17:05:45 crc kubenswrapper[5184]: W0312 17:05:45.137317 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66eb4c90_9578_461e_aaae_6385546ed865.slice/crio-668903fc2491020e01f780d3a2a21ee614196d6304fd185e67686b2d56e344d3 WatchSource:0}: Error finding container 668903fc2491020e01f780d3a2a21ee614196d6304fd185e67686b2d56e344d3: Status 404 returned error can't find the container with id 668903fc2491020e01f780d3a2a21ee614196d6304fd185e67686b2d56e344d3 Mar 12 17:05:45 crc kubenswrapper[5184]: I0312 17:05:45.413722 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-579966755f-k6ws5" event={"ID":"4efa9263-cab7-4221-b570-90c929ebf82b","Type":"ContainerStarted","Data":"855a9c66303a2f6a24404aa5a1f3c9fa875f850ee0dd140f39291c48248c6dc0"} Mar 12 17:05:45 crc kubenswrapper[5184]: I0312 17:05:45.415636 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6564988d95-s8gjk" event={"ID":"ba537fdf-14f9-47e1-a8c6-4732c4d9dfeb","Type":"ContainerStarted","Data":"897bc8d629a095454d3d7b0c2268f4f0f826928e23ff1b6c15460cd340cdb393"} Mar 12 17:05:45 crc kubenswrapper[5184]: I0312 17:05:45.417704 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-68b4f9dfcc-twj9v" 
event={"ID":"66eb4c90-9578-461e-aaae-6385546ed865","Type":"ContainerStarted","Data":"668903fc2491020e01f780d3a2a21ee614196d6304fd185e67686b2d56e344d3"} Mar 12 17:05:45 crc kubenswrapper[5184]: I0312 17:05:45.419719 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-776f58c496-nxd4w" event={"ID":"b264a369-29b1-4524-b1ef-ea0d61042e1b","Type":"ContainerStarted","Data":"18db5ffff27b6799b5723043e5cced83fe638784699d64d05a397a076b44789b"} Mar 12 17:05:45 crc kubenswrapper[5184]: I0312 17:05:45.422191 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69d9587945-trclj" event={"ID":"d1faa83d-1fb7-4c0a-8358-6b02b46d6c9f","Type":"ContainerStarted","Data":"bb60791c00354f49c1adb6f35adf20fac239cdcc8046980e8648a11e9ddb937c"} Mar 12 17:05:45 crc kubenswrapper[5184]: I0312 17:05:45.423605 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-849569668d-fm84v" event={"ID":"325b0a39-7766-4c7a-a5b7-c29551f18550","Type":"ContainerStarted","Data":"a6acb5e965df15261527b190f61fef6e6f495c304972f1e7f00a10041e1a2585"} Mar 12 17:05:45 crc kubenswrapper[5184]: I0312 17:05:45.478468 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-698d4c86bf-qhf48"] Mar 12 17:05:45 crc kubenswrapper[5184]: I0312 17:05:45.499832 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-8558b89bff-8v768"] Mar 12 17:05:45 crc kubenswrapper[5184]: I0312 17:05:45.506315 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-785ff4d9b5-jvgbj"] Mar 12 17:05:45 crc kubenswrapper[5184]: I0312 17:05:45.515916 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-c845c877d-sh7h4"] Mar 12 
17:05:45 crc kubenswrapper[5184]: I0312 17:05:45.523447 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-847cdc49c9-7smvt"] Mar 12 17:05:45 crc kubenswrapper[5184]: I0312 17:05:45.526991 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a9fa8671-f968-45fd-a5bc-fe439e771792-cert\") pod \"infra-operator-controller-manager-54654cd4c7-x7588\" (UID: \"a9fa8671-f968-45fd-a5bc-fe439e771792\") " pod="openstack-operators/infra-operator-controller-manager-54654cd4c7-x7588" Mar 12 17:05:45 crc kubenswrapper[5184]: E0312 17:05:45.529534 5184 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 12 17:05:45 crc kubenswrapper[5184]: E0312 17:05:45.529680 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9fa8671-f968-45fd-a5bc-fe439e771792-cert podName:a9fa8671-f968-45fd-a5bc-fe439e771792 nodeName:}" failed. No retries permitted until 2026-03-12 17:05:47.529630992 +0000 UTC m=+890.070942351 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a9fa8671-f968-45fd-a5bc-fe439e771792-cert") pod "infra-operator-controller-manager-54654cd4c7-x7588" (UID: "a9fa8671-f968-45fd-a5bc-fe439e771792") : secret "infra-operator-webhook-server-cert" not found Mar 12 17:05:45 crc kubenswrapper[5184]: I0312 17:05:45.532684 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-8664bfd6f-t7d7f"] Mar 12 17:05:45 crc kubenswrapper[5184]: I0312 17:05:45.540911 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-865956cc65-vqfsg"] Mar 12 17:05:45 crc kubenswrapper[5184]: I0312 17:05:45.547154 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-6f84c59bb4-9vdqk"] Mar 12 17:05:45 crc kubenswrapper[5184]: W0312 17:05:45.553536 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9aacc6d0_007b_4eff_95c0_1e6347226980.slice/crio-aeb5d8d50a9f92d62f56640436184c84f092dea7b46e7c4cd82591dbb98c04e7 WatchSource:0}: Error finding container aeb5d8d50a9f92d62f56640436184c84f092dea7b46e7c4cd82591dbb98c04e7: Status 404 returned error can't find the container with id aeb5d8d50a9f92d62f56640436184c84f092dea7b46e7c4cd82591dbb98c04e7 Mar 12 17:05:45 crc kubenswrapper[5184]: W0312 17:05:45.567712 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f38ee12_d676_43a1_9f44_d347f24dfbda.slice/crio-635f6e828edffba96ccc4cafea1e3e862659ecf2961b492eaac57b3ade35207b WatchSource:0}: Error finding container 635f6e828edffba96ccc4cafea1e3e862659ecf2961b492eaac57b3ade35207b: Status 404 returned error can't find the container with id 635f6e828edffba96ccc4cafea1e3e862659ecf2961b492eaac57b3ade35207b Mar 12 17:05:45 crc kubenswrapper[5184]: 
W0312 17:05:45.569977 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6a852ac_a01c_467a_96c7_d65549b557ad.slice/crio-e930a418ed2b4dfc3d0bb4fefd3b9ab32545570ce3dcc8472abb33f5f4eb04b6 WatchSource:0}: Error finding container e930a418ed2b4dfc3d0bb4fefd3b9ab32545570ce3dcc8472abb33f5f4eb04b6: Status 404 returned error can't find the container with id e930a418ed2b4dfc3d0bb4fefd3b9ab32545570ce3dcc8472abb33f5f4eb04b6 Mar 12 17:05:45 crc kubenswrapper[5184]: E0312 17:05:45.571093 5184 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:2bd37bdd917e3abe72613a734ce5021330242ec8cae9b8da76c57a0765152922,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6vg2q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5f84d557f9-hvp27_openstack-operators(6f38ee12-d676-43a1-9f44-d347f24dfbda): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 12 17:05:45 crc kubenswrapper[5184]: E0312 17:05:45.572249 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-5f84d557f9-hvp27" podUID="6f38ee12-d676-43a1-9f44-d347f24dfbda" Mar 12 17:05:45 crc kubenswrapper[5184]: E0312 17:05:45.573328 5184 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:d89f3ca6e909f34d145a880829f5e63f1b6b2d11c520a9c5bea7ed1c30ce38f4,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-f7h57,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-847cdc49c9-7smvt_openstack-operators(f6a852ac-a01c-467a-96c7-d65549b557ad): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 12 17:05:45 crc kubenswrapper[5184]: E0312 17:05:45.574424 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/manila-operator-controller-manager-847cdc49c9-7smvt" podUID="f6a852ac-a01c-467a-96c7-d65549b557ad" Mar 12 17:05:45 crc kubenswrapper[5184]: E0312 17:05:45.575354 5184 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pjlvd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-74d567479f-nrs7s_openstack-operators(be48e16f-42da-4910-81d4-1c10498247f7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 12 17:05:45 crc kubenswrapper[5184]: E0312 17:05:45.576637 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-74d567479f-nrs7s" podUID="be48e16f-42da-4910-81d4-1c10498247f7" Mar 12 17:05:45 crc kubenswrapper[5184]: I0312 17:05:45.576775 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5f84d557f9-hvp27"] Mar 12 17:05:45 crc kubenswrapper[5184]: W0312 17:05:45.580172 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5bf50f4_0265_46ba_98ad_c8c8245664d4.slice/crio-56e6d014545f51c53c4d8b457646660c1bd31f27dc7487af33ad21c87d90436f WatchSource:0}: Error finding container 56e6d014545f51c53c4d8b457646660c1bd31f27dc7487af33ad21c87d90436f: Status 404 returned error can't find the container with id 
56e6d014545f51c53c4d8b457646660c1bd31f27dc7487af33ad21c87d90436f Mar 12 17:05:45 crc kubenswrapper[5184]: I0312 17:05:45.589218 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-7dd8b74947-8jj2t"] Mar 12 17:05:45 crc kubenswrapper[5184]: E0312 17:05:45.591660 5184 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g8jct,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-85d9b55b6-v89lf_openstack-operators(5717f5e6-3e6e-4585-b9a6-bcc31f707080): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 12 17:05:45 crc kubenswrapper[5184]: E0312 17:05:45.591779 5184 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-f9svz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-7dd8b74947-8jj2t_openstack-operators(c5bf50f4-0265-46ba-98ad-c8c8245664d4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 12 17:05:45 crc kubenswrapper[5184]: E0312 17:05:45.591972 5184 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mp22p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-688f7d67f5-ggjlw_openstack-operators(8e771b12-3698-427d-a93a-2293244e2171): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 12 17:05:45 crc kubenswrapper[5184]: E0312 17:05:45.592620 5184 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vjgmv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-6cd977d774-djcpf_openstack-operators(94ee0f08-539e-4c3c-a54a-46e35cd20e10): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 12 17:05:45 crc kubenswrapper[5184]: E0312 17:05:45.592814 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-85d9b55b6-v89lf" podUID="5717f5e6-3e6e-4585-b9a6-bcc31f707080" Mar 12 17:05:45 crc 
kubenswrapper[5184]: E0312 17:05:45.593130 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-7dd8b74947-8jj2t" podUID="c5bf50f4-0265-46ba-98ad-c8c8245664d4" Mar 12 17:05:45 crc kubenswrapper[5184]: E0312 17:05:45.593183 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-688f7d67f5-ggjlw" podUID="8e771b12-3698-427d-a93a-2293244e2171" Mar 12 17:05:45 crc kubenswrapper[5184]: E0312 17:05:45.593914 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd977d774-djcpf" podUID="94ee0f08-539e-4c3c-a54a-46e35cd20e10" Mar 12 17:05:45 crc kubenswrapper[5184]: I0312 17:05:45.596560 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-688f7d67f5-ggjlw"] Mar 12 17:05:45 crc kubenswrapper[5184]: I0312 17:05:45.604331 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd977d774-djcpf"] Mar 12 17:05:45 crc kubenswrapper[5184]: I0312 17:05:45.610948 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-74d567479f-nrs7s"] Mar 12 17:05:45 crc kubenswrapper[5184]: I0312 17:05:45.617068 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-85d9b55b6-v89lf"] Mar 12 17:05:45 crc kubenswrapper[5184]: I0312 17:05:45.944477 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/97ae1add-72a8-4cfb-8cb4-45b33d39a1b8-cert\") pod \"openstack-baremetal-operator-controller-manager-dffd87f79-nrd8n\" (UID: \"97ae1add-72a8-4cfb-8cb4-45b33d39a1b8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dffd87f79-nrd8n" Mar 12 17:05:45 crc kubenswrapper[5184]: E0312 17:05:45.944936 5184 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 12 17:05:45 crc kubenswrapper[5184]: E0312 17:05:45.945049 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97ae1add-72a8-4cfb-8cb4-45b33d39a1b8-cert podName:97ae1add-72a8-4cfb-8cb4-45b33d39a1b8 nodeName:}" failed. No retries permitted until 2026-03-12 17:05:47.945021047 +0000 UTC m=+890.486332386 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/97ae1add-72a8-4cfb-8cb4-45b33d39a1b8-cert") pod "openstack-baremetal-operator-controller-manager-dffd87f79-nrd8n" (UID: "97ae1add-72a8-4cfb-8cb4-45b33d39a1b8") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 12 17:05:46 crc kubenswrapper[5184]: I0312 17:05:46.148024 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8edf3dec-1386-4665-8a1a-ac779905f180-metrics-certs\") pod \"openstack-operator-controller-manager-58ddd4554c-m4npf\" (UID: \"8edf3dec-1386-4665-8a1a-ac779905f180\") " pod="openstack-operators/openstack-operator-controller-manager-58ddd4554c-m4npf" Mar 12 17:05:46 crc kubenswrapper[5184]: I0312 17:05:46.148141 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8edf3dec-1386-4665-8a1a-ac779905f180-webhook-certs\") pod \"openstack-operator-controller-manager-58ddd4554c-m4npf\" (UID: 
\"8edf3dec-1386-4665-8a1a-ac779905f180\") " pod="openstack-operators/openstack-operator-controller-manager-58ddd4554c-m4npf" Mar 12 17:05:46 crc kubenswrapper[5184]: E0312 17:05:46.148292 5184 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 12 17:05:46 crc kubenswrapper[5184]: E0312 17:05:46.148366 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8edf3dec-1386-4665-8a1a-ac779905f180-metrics-certs podName:8edf3dec-1386-4665-8a1a-ac779905f180 nodeName:}" failed. No retries permitted until 2026-03-12 17:05:48.148348152 +0000 UTC m=+890.689659491 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8edf3dec-1386-4665-8a1a-ac779905f180-metrics-certs") pod "openstack-operator-controller-manager-58ddd4554c-m4npf" (UID: "8edf3dec-1386-4665-8a1a-ac779905f180") : secret "metrics-server-cert" not found Mar 12 17:05:46 crc kubenswrapper[5184]: E0312 17:05:46.148292 5184 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 12 17:05:46 crc kubenswrapper[5184]: E0312 17:05:46.148449 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8edf3dec-1386-4665-8a1a-ac779905f180-webhook-certs podName:8edf3dec-1386-4665-8a1a-ac779905f180 nodeName:}" failed. No retries permitted until 2026-03-12 17:05:48.148438225 +0000 UTC m=+890.689749564 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8edf3dec-1386-4665-8a1a-ac779905f180-webhook-certs") pod "openstack-operator-controller-manager-58ddd4554c-m4npf" (UID: "8edf3dec-1386-4665-8a1a-ac779905f180") : secret "webhook-server-cert" not found Mar 12 17:05:46 crc kubenswrapper[5184]: I0312 17:05:46.444696 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd977d774-djcpf" event={"ID":"94ee0f08-539e-4c3c-a54a-46e35cd20e10","Type":"ContainerStarted","Data":"016d55825934cf593607ffab60d643bdcc65046f6d2eebce7778cceff4166253"} Mar 12 17:05:46 crc kubenswrapper[5184]: I0312 17:05:46.449974 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-8558b89bff-8v768" event={"ID":"4fad3169-60c8-49f1-ad8c-ec6fb4282ddc","Type":"ContainerStarted","Data":"74a6ec8d70b524b23b00e55067affe4402a25cc732583805f436e4a93eaf8563"} Mar 12 17:05:46 crc kubenswrapper[5184]: E0312 17:05:46.450298 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d\\\": ErrImagePull: pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd977d774-djcpf" podUID="94ee0f08-539e-4c3c-a54a-46e35cd20e10" Mar 12 17:05:46 crc kubenswrapper[5184]: I0312 17:05:46.451750 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-785ff4d9b5-jvgbj" event={"ID":"ed4f1b6b-e4af-4a56-bb94-5c48640f67ce","Type":"ContainerStarted","Data":"aaeae3749c4abdcf429ed567b4d4350a2f374b369c130e4e89cc12a4a868ad0b"} Mar 12 17:05:46 crc kubenswrapper[5184]: I0312 17:05:46.452759 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/mariadb-operator-controller-manager-698d4c86bf-qhf48" event={"ID":"cab9970b-99b6-4c36-a816-4cbe9ca206f8","Type":"ContainerStarted","Data":"ed5adc6937a8b326227df340f69fa25a47307a57b44dd4403961670d751f4a95"} Mar 12 17:05:46 crc kubenswrapper[5184]: I0312 17:05:46.454638 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-74d567479f-nrs7s" event={"ID":"be48e16f-42da-4910-81d4-1c10498247f7","Type":"ContainerStarted","Data":"e8dc244bac705e24d86d8b014f109d1b897138a9137ceb1c0c0030b223d67a14"} Mar 12 17:05:46 crc kubenswrapper[5184]: E0312 17:05:46.460431 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\": ErrImagePull: pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-74d567479f-nrs7s" podUID="be48e16f-42da-4910-81d4-1c10498247f7" Mar 12 17:05:46 crc kubenswrapper[5184]: I0312 17:05:46.461907 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-c845c877d-sh7h4" event={"ID":"71cd8922-2260-4302-b49f-8ebb0084bc3a","Type":"ContainerStarted","Data":"35d2010c0c55d6c7a78f95878629d9fce4a87bdbfd2b5b8272ab0b4f3d8e0455"} Mar 12 17:05:46 crc kubenswrapper[5184]: I0312 17:05:46.473741 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-6f84c59bb4-9vdqk" event={"ID":"9aacc6d0-007b-4eff-95c0-1e6347226980","Type":"ContainerStarted","Data":"aeb5d8d50a9f92d62f56640436184c84f092dea7b46e7c4cd82591dbb98c04e7"} Mar 12 17:05:46 crc kubenswrapper[5184]: I0312 17:05:46.477822 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-85d9b55b6-v89lf" 
event={"ID":"5717f5e6-3e6e-4585-b9a6-bcc31f707080","Type":"ContainerStarted","Data":"fdd58e6551c6481b37605ef0f7a418f00b21e65001bda87d33156eec057c812f"} Mar 12 17:05:46 crc kubenswrapper[5184]: E0312 17:05:46.480735 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\": ErrImagePull: pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-85d9b55b6-v89lf" podUID="5717f5e6-3e6e-4585-b9a6-bcc31f707080" Mar 12 17:05:46 crc kubenswrapper[5184]: I0312 17:05:46.485968 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-847cdc49c9-7smvt" event={"ID":"f6a852ac-a01c-467a-96c7-d65549b557ad","Type":"ContainerStarted","Data":"e930a418ed2b4dfc3d0bb4fefd3b9ab32545570ce3dcc8472abb33f5f4eb04b6"} Mar 12 17:05:46 crc kubenswrapper[5184]: E0312 17:05:46.489916 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:d89f3ca6e909f34d145a880829f5e63f1b6b2d11c520a9c5bea7ed1c30ce38f4\\\": ErrImagePull: pull QPS exceeded\"" pod="openstack-operators/manila-operator-controller-manager-847cdc49c9-7smvt" podUID="f6a852ac-a01c-467a-96c7-d65549b557ad" Mar 12 17:05:46 crc kubenswrapper[5184]: I0312 17:05:46.492243 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-688f7d67f5-ggjlw" event={"ID":"8e771b12-3698-427d-a93a-2293244e2171","Type":"ContainerStarted","Data":"d4b2affc5dafbdeb97737a13eb8c70a3b4c85f61d5251c94410336fdc259b3dd"} Mar 12 17:05:46 crc kubenswrapper[5184]: E0312 17:05:46.496919 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554\\\": ErrImagePull: pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-688f7d67f5-ggjlw" podUID="8e771b12-3698-427d-a93a-2293244e2171" Mar 12 17:05:46 crc kubenswrapper[5184]: I0312 17:05:46.499171 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-865956cc65-vqfsg" event={"ID":"fcbfcf8d-d998-4e58-8218-04476a811cf1","Type":"ContainerStarted","Data":"a8c2f283b6857b97273ef68c7793f53e19de337fb092912f821b9ce874b38d78"} Mar 12 17:05:46 crc kubenswrapper[5184]: I0312 17:05:46.518087 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-8664bfd6f-t7d7f" event={"ID":"a149730f-e80a-4abf-8efa-29fb5820c9ae","Type":"ContainerStarted","Data":"15eb95f88bb551c77274c1a6569db25009388d061f74d7fca08ce12f8299ee3f"} Mar 12 17:05:46 crc kubenswrapper[5184]: I0312 17:05:46.521809 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5f84d557f9-hvp27" event={"ID":"6f38ee12-d676-43a1-9f44-d347f24dfbda","Type":"ContainerStarted","Data":"635f6e828edffba96ccc4cafea1e3e862659ecf2961b492eaac57b3ade35207b"} Mar 12 17:05:46 crc kubenswrapper[5184]: E0312 17:05:46.523498 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:2bd37bdd917e3abe72613a734ce5021330242ec8cae9b8da76c57a0765152922\\\": ErrImagePull: pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-5f84d557f9-hvp27" podUID="6f38ee12-d676-43a1-9f44-d347f24dfbda" Mar 12 17:05:46 crc kubenswrapper[5184]: I0312 17:05:46.526213 5184 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-7dd8b74947-8jj2t" event={"ID":"c5bf50f4-0265-46ba-98ad-c8c8245664d4","Type":"ContainerStarted","Data":"56e6d014545f51c53c4d8b457646660c1bd31f27dc7487af33ad21c87d90436f"} Mar 12 17:05:46 crc kubenswrapper[5184]: E0312 17:05:46.529686 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978\\\": ErrImagePull: pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-7dd8b74947-8jj2t" podUID="c5bf50f4-0265-46ba-98ad-c8c8245664d4" Mar 12 17:05:47 crc kubenswrapper[5184]: E0312 17:05:47.539001 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\": ErrImagePull: pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-85d9b55b6-v89lf" podUID="5717f5e6-3e6e-4585-b9a6-bcc31f707080" Mar 12 17:05:47 crc kubenswrapper[5184]: E0312 17:05:47.539289 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d\\\": ErrImagePull: pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd977d774-djcpf" podUID="94ee0f08-539e-4c3c-a54a-46e35cd20e10" Mar 12 17:05:47 crc kubenswrapper[5184]: E0312 17:05:47.539345 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\": ErrImagePull: pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-74d567479f-nrs7s" podUID="be48e16f-42da-4910-81d4-1c10498247f7" Mar 12 17:05:47 crc kubenswrapper[5184]: E0312 17:05:47.540224 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:2bd37bdd917e3abe72613a734ce5021330242ec8cae9b8da76c57a0765152922\\\": ErrImagePull: pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-5f84d557f9-hvp27" podUID="6f38ee12-d676-43a1-9f44-d347f24dfbda" Mar 12 17:05:47 crc kubenswrapper[5184]: E0312 17:05:47.540261 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978\\\": ErrImagePull: pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-7dd8b74947-8jj2t" podUID="c5bf50f4-0265-46ba-98ad-c8c8245664d4" Mar 12 17:05:47 crc kubenswrapper[5184]: E0312 17:05:47.540354 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554\\\": ErrImagePull: pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-688f7d67f5-ggjlw" podUID="8e771b12-3698-427d-a93a-2293244e2171" Mar 12 17:05:47 crc kubenswrapper[5184]: E0312 17:05:47.542847 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:d89f3ca6e909f34d145a880829f5e63f1b6b2d11c520a9c5bea7ed1c30ce38f4\\\": ErrImagePull: pull QPS exceeded\"" pod="openstack-operators/manila-operator-controller-manager-847cdc49c9-7smvt" podUID="f6a852ac-a01c-467a-96c7-d65549b557ad" Mar 12 17:05:47 crc kubenswrapper[5184]: I0312 17:05:47.569246 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a9fa8671-f968-45fd-a5bc-fe439e771792-cert\") pod \"infra-operator-controller-manager-54654cd4c7-x7588\" (UID: \"a9fa8671-f968-45fd-a5bc-fe439e771792\") " pod="openstack-operators/infra-operator-controller-manager-54654cd4c7-x7588" Mar 12 17:05:47 crc kubenswrapper[5184]: E0312 17:05:47.569469 5184 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 12 17:05:47 crc kubenswrapper[5184]: E0312 17:05:47.569579 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9fa8671-f968-45fd-a5bc-fe439e771792-cert podName:a9fa8671-f968-45fd-a5bc-fe439e771792 nodeName:}" failed. No retries permitted until 2026-03-12 17:05:51.569559219 +0000 UTC m=+894.110870558 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a9fa8671-f968-45fd-a5bc-fe439e771792-cert") pod "infra-operator-controller-manager-54654cd4c7-x7588" (UID: "a9fa8671-f968-45fd-a5bc-fe439e771792") : secret "infra-operator-webhook-server-cert" not found Mar 12 17:05:47 crc kubenswrapper[5184]: I0312 17:05:47.975149 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97ae1add-72a8-4cfb-8cb4-45b33d39a1b8-cert\") pod \"openstack-baremetal-operator-controller-manager-dffd87f79-nrd8n\" (UID: \"97ae1add-72a8-4cfb-8cb4-45b33d39a1b8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dffd87f79-nrd8n" Mar 12 17:05:47 crc kubenswrapper[5184]: E0312 17:05:47.975308 5184 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 12 17:05:47 crc kubenswrapper[5184]: E0312 17:05:47.975358 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97ae1add-72a8-4cfb-8cb4-45b33d39a1b8-cert podName:97ae1add-72a8-4cfb-8cb4-45b33d39a1b8 nodeName:}" failed. No retries permitted until 2026-03-12 17:05:51.975344141 +0000 UTC m=+894.516655480 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/97ae1add-72a8-4cfb-8cb4-45b33d39a1b8-cert") pod "openstack-baremetal-operator-controller-manager-dffd87f79-nrd8n" (UID: "97ae1add-72a8-4cfb-8cb4-45b33d39a1b8") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 12 17:05:48 crc kubenswrapper[5184]: I0312 17:05:48.177541 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8edf3dec-1386-4665-8a1a-ac779905f180-metrics-certs\") pod \"openstack-operator-controller-manager-58ddd4554c-m4npf\" (UID: \"8edf3dec-1386-4665-8a1a-ac779905f180\") " pod="openstack-operators/openstack-operator-controller-manager-58ddd4554c-m4npf" Mar 12 17:05:48 crc kubenswrapper[5184]: I0312 17:05:48.177689 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8edf3dec-1386-4665-8a1a-ac779905f180-webhook-certs\") pod \"openstack-operator-controller-manager-58ddd4554c-m4npf\" (UID: \"8edf3dec-1386-4665-8a1a-ac779905f180\") " pod="openstack-operators/openstack-operator-controller-manager-58ddd4554c-m4npf" Mar 12 17:05:48 crc kubenswrapper[5184]: E0312 17:05:48.177780 5184 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 12 17:05:48 crc kubenswrapper[5184]: E0312 17:05:48.177827 5184 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 12 17:05:48 crc kubenswrapper[5184]: E0312 17:05:48.177875 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8edf3dec-1386-4665-8a1a-ac779905f180-metrics-certs podName:8edf3dec-1386-4665-8a1a-ac779905f180 nodeName:}" failed. No retries permitted until 2026-03-12 17:05:52.177851661 +0000 UTC m=+894.719163070 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8edf3dec-1386-4665-8a1a-ac779905f180-metrics-certs") pod "openstack-operator-controller-manager-58ddd4554c-m4npf" (UID: "8edf3dec-1386-4665-8a1a-ac779905f180") : secret "metrics-server-cert" not found Mar 12 17:05:48 crc kubenswrapper[5184]: E0312 17:05:48.177901 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8edf3dec-1386-4665-8a1a-ac779905f180-webhook-certs podName:8edf3dec-1386-4665-8a1a-ac779905f180 nodeName:}" failed. No retries permitted until 2026-03-12 17:05:52.177890582 +0000 UTC m=+894.719202041 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8edf3dec-1386-4665-8a1a-ac779905f180-webhook-certs") pod "openstack-operator-controller-manager-58ddd4554c-m4npf" (UID: "8edf3dec-1386-4665-8a1a-ac779905f180") : secret "webhook-server-cert" not found Mar 12 17:05:48 crc kubenswrapper[5184]: E0312 17:05:48.781196 5184 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1122677 actualBytes=10240 Mar 12 17:05:50 crc kubenswrapper[5184]: I0312 17:05:50.742735 5184 patch_prober.go:28] interesting pod/machine-config-daemon-cp7pt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 17:05:50 crc kubenswrapper[5184]: I0312 17:05:50.742959 5184 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 17:05:51 crc kubenswrapper[5184]: I0312 17:05:51.638362 5184 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a9fa8671-f968-45fd-a5bc-fe439e771792-cert\") pod \"infra-operator-controller-manager-54654cd4c7-x7588\" (UID: \"a9fa8671-f968-45fd-a5bc-fe439e771792\") " pod="openstack-operators/infra-operator-controller-manager-54654cd4c7-x7588" Mar 12 17:05:51 crc kubenswrapper[5184]: E0312 17:05:51.638537 5184 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 12 17:05:51 crc kubenswrapper[5184]: E0312 17:05:51.638625 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9fa8671-f968-45fd-a5bc-fe439e771792-cert podName:a9fa8671-f968-45fd-a5bc-fe439e771792 nodeName:}" failed. No retries permitted until 2026-03-12 17:05:59.638600334 +0000 UTC m=+902.179911673 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a9fa8671-f968-45fd-a5bc-fe439e771792-cert") pod "infra-operator-controller-manager-54654cd4c7-x7588" (UID: "a9fa8671-f968-45fd-a5bc-fe439e771792") : secret "infra-operator-webhook-server-cert" not found Mar 12 17:05:52 crc kubenswrapper[5184]: I0312 17:05:52.048092 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97ae1add-72a8-4cfb-8cb4-45b33d39a1b8-cert\") pod \"openstack-baremetal-operator-controller-manager-dffd87f79-nrd8n\" (UID: \"97ae1add-72a8-4cfb-8cb4-45b33d39a1b8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dffd87f79-nrd8n" Mar 12 17:05:52 crc kubenswrapper[5184]: E0312 17:05:52.048253 5184 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 12 17:05:52 crc kubenswrapper[5184]: E0312 17:05:52.048339 5184 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/97ae1add-72a8-4cfb-8cb4-45b33d39a1b8-cert podName:97ae1add-72a8-4cfb-8cb4-45b33d39a1b8 nodeName:}" failed. No retries permitted until 2026-03-12 17:06:00.04832134 +0000 UTC m=+902.589632679 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/97ae1add-72a8-4cfb-8cb4-45b33d39a1b8-cert") pod "openstack-baremetal-operator-controller-manager-dffd87f79-nrd8n" (UID: "97ae1add-72a8-4cfb-8cb4-45b33d39a1b8") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 12 17:05:52 crc kubenswrapper[5184]: I0312 17:05:52.251570 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8edf3dec-1386-4665-8a1a-ac779905f180-webhook-certs\") pod \"openstack-operator-controller-manager-58ddd4554c-m4npf\" (UID: \"8edf3dec-1386-4665-8a1a-ac779905f180\") " pod="openstack-operators/openstack-operator-controller-manager-58ddd4554c-m4npf" Mar 12 17:05:52 crc kubenswrapper[5184]: I0312 17:05:52.251665 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8edf3dec-1386-4665-8a1a-ac779905f180-metrics-certs\") pod \"openstack-operator-controller-manager-58ddd4554c-m4npf\" (UID: \"8edf3dec-1386-4665-8a1a-ac779905f180\") " pod="openstack-operators/openstack-operator-controller-manager-58ddd4554c-m4npf" Mar 12 17:05:52 crc kubenswrapper[5184]: E0312 17:05:52.251732 5184 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 12 17:05:52 crc kubenswrapper[5184]: E0312 17:05:52.251805 5184 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 12 17:05:52 crc kubenswrapper[5184]: E0312 17:05:52.251829 5184 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/8edf3dec-1386-4665-8a1a-ac779905f180-webhook-certs podName:8edf3dec-1386-4665-8a1a-ac779905f180 nodeName:}" failed. No retries permitted until 2026-03-12 17:06:00.25181064 +0000 UTC m=+902.793121979 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8edf3dec-1386-4665-8a1a-ac779905f180-webhook-certs") pod "openstack-operator-controller-manager-58ddd4554c-m4npf" (UID: "8edf3dec-1386-4665-8a1a-ac779905f180") : secret "webhook-server-cert" not found Mar 12 17:05:52 crc kubenswrapper[5184]: E0312 17:05:52.251854 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8edf3dec-1386-4665-8a1a-ac779905f180-metrics-certs podName:8edf3dec-1386-4665-8a1a-ac779905f180 nodeName:}" failed. No retries permitted until 2026-03-12 17:06:00.251841321 +0000 UTC m=+902.793152730 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8edf3dec-1386-4665-8a1a-ac779905f180-metrics-certs") pod "openstack-operator-controller-manager-58ddd4554c-m4npf" (UID: "8edf3dec-1386-4665-8a1a-ac779905f180") : secret "metrics-server-cert" not found Mar 12 17:05:57 crc kubenswrapper[5184]: I0312 17:05:57.606224 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6564988d95-s8gjk" event={"ID":"ba537fdf-14f9-47e1-a8c6-4732c4d9dfeb","Type":"ContainerStarted","Data":"ee3c2cbeb575c1a74bbbe11dc66446b1a1eb948d803cc87880247265b568d42f"} Mar 12 17:05:57 crc kubenswrapper[5184]: I0312 17:05:57.608755 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/cinder-operator-controller-manager-6564988d95-s8gjk" Mar 12 17:05:57 crc kubenswrapper[5184]: I0312 17:05:57.610896 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-865956cc65-vqfsg" 
event={"ID":"fcbfcf8d-d998-4e58-8218-04476a811cf1","Type":"ContainerStarted","Data":"d9891e9184f1b490e74a04ac95041852d7939ca2570d56fce9e57b48b416f19e"} Mar 12 17:05:57 crc kubenswrapper[5184]: I0312 17:05:57.611400 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/swift-operator-controller-manager-865956cc65-vqfsg" Mar 12 17:05:57 crc kubenswrapper[5184]: I0312 17:05:57.612633 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-68b4f9dfcc-twj9v" event={"ID":"66eb4c90-9578-461e-aaae-6385546ed865","Type":"ContainerStarted","Data":"77bf5f759503ca6143989211f7a471ef31e6a36b4fe43129cc6cd97731040bbf"} Mar 12 17:05:57 crc kubenswrapper[5184]: I0312 17:05:57.613051 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/barbican-operator-controller-manager-68b4f9dfcc-twj9v" Mar 12 17:05:57 crc kubenswrapper[5184]: I0312 17:05:57.614767 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-776f58c496-nxd4w" event={"ID":"b264a369-29b1-4524-b1ef-ea0d61042e1b","Type":"ContainerStarted","Data":"f14f8ae940bd0385f017484114496e70ae8e623b660b2cbcd287467b10654e11"} Mar 12 17:05:57 crc kubenswrapper[5184]: I0312 17:05:57.614980 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/horizon-operator-controller-manager-776f58c496-nxd4w" Mar 12 17:05:57 crc kubenswrapper[5184]: I0312 17:05:57.616441 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-8664bfd6f-t7d7f" event={"ID":"a149730f-e80a-4abf-8efa-29fb5820c9ae","Type":"ContainerStarted","Data":"f5f9e5c1249cb2be6574226cf08811b4e50c9c4a3980808131cdb31fb09d5db9"} Mar 12 17:05:57 crc kubenswrapper[5184]: I0312 17:05:57.616751 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openstack-operators/octavia-operator-controller-manager-8664bfd6f-t7d7f" Mar 12 17:05:57 crc kubenswrapper[5184]: I0312 17:05:57.618728 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-849569668d-fm84v" event={"ID":"325b0a39-7766-4c7a-a5b7-c29551f18550","Type":"ContainerStarted","Data":"97d5737881772d43060a10025284c61406522aa4b93e347d6cae621c03d95748"} Mar 12 17:05:57 crc kubenswrapper[5184]: I0312 17:05:57.618897 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/keystone-operator-controller-manager-849569668d-fm84v" Mar 12 17:05:57 crc kubenswrapper[5184]: I0312 17:05:57.621209 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-8558b89bff-8v768" event={"ID":"4fad3169-60c8-49f1-ad8c-ec6fb4282ddc","Type":"ContainerStarted","Data":"d020c60eab6511314759c10610ed956331c5f8004c2282c7adcfaf9961ce0b65"} Mar 12 17:05:57 crc kubenswrapper[5184]: I0312 17:05:57.621659 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/ovn-operator-controller-manager-8558b89bff-8v768" Mar 12 17:05:57 crc kubenswrapper[5184]: I0312 17:05:57.623549 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-785ff4d9b5-jvgbj" event={"ID":"ed4f1b6b-e4af-4a56-bb94-5c48640f67ce","Type":"ContainerStarted","Data":"80520cb899feea80a8197ee3de4e0b5cc4792fd390ab7ac0f1308b8b95960552"} Mar 12 17:05:57 crc kubenswrapper[5184]: I0312 17:05:57.623801 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/neutron-operator-controller-manager-785ff4d9b5-jvgbj" Mar 12 17:05:57 crc kubenswrapper[5184]: I0312 17:05:57.625129 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-579966755f-k6ws5" 
event={"ID":"4efa9263-cab7-4221-b570-90c929ebf82b","Type":"ContainerStarted","Data":"3b16b24544949a110b68718acb3c5b1a2fa2f753e291fd7931368eb2d1a3076b"} Mar 12 17:05:57 crc kubenswrapper[5184]: I0312 17:05:57.625268 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/ironic-operator-controller-manager-579966755f-k6ws5" Mar 12 17:05:57 crc kubenswrapper[5184]: I0312 17:05:57.626979 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-698d4c86bf-qhf48" event={"ID":"cab9970b-99b6-4c36-a816-4cbe9ca206f8","Type":"ContainerStarted","Data":"7dde089ce517f6fa7f1e06ef93f7c9143cf923915b2e457e200270aaaa46f997"} Mar 12 17:05:57 crc kubenswrapper[5184]: I0312 17:05:57.627122 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/mariadb-operator-controller-manager-698d4c86bf-qhf48" Mar 12 17:05:57 crc kubenswrapper[5184]: I0312 17:05:57.628163 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-c845c877d-sh7h4" event={"ID":"71cd8922-2260-4302-b49f-8ebb0084bc3a","Type":"ContainerStarted","Data":"1036edaf428635eaac62dd15351d8fd041a8495ee3b9359acbd684e9a5b0a797"} Mar 12 17:05:57 crc kubenswrapper[5184]: I0312 17:05:57.628489 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/designate-operator-controller-manager-c845c877d-sh7h4" Mar 12 17:05:57 crc kubenswrapper[5184]: I0312 17:05:57.629696 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-6f84c59bb4-9vdqk" event={"ID":"9aacc6d0-007b-4eff-95c0-1e6347226980","Type":"ContainerStarted","Data":"81a07eee06bc6bb311021637bd4afc24a0295785dd5ba4fbf4c8a21b87129b04"} Mar 12 17:05:57 crc kubenswrapper[5184]: I0312 17:05:57.630057 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openstack-operators/glance-operator-controller-manager-6f84c59bb4-9vdqk" Mar 12 17:05:57 crc kubenswrapper[5184]: I0312 17:05:57.634967 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69d9587945-trclj" event={"ID":"d1faa83d-1fb7-4c0a-8358-6b02b46d6c9f","Type":"ContainerStarted","Data":"3b55c76ac8fb97322f9a929b42e0054f85a7900eb6550aab4b2d42fa5adcb42e"} Mar 12 17:05:57 crc kubenswrapper[5184]: I0312 17:05:57.635193 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/heat-operator-controller-manager-69d9587945-trclj" Mar 12 17:05:57 crc kubenswrapper[5184]: I0312 17:05:57.678093 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-6564988d95-s8gjk" podStartSLOduration=3.413272213 podStartE2EDuration="14.67806462s" podCreationTimestamp="2026-03-12 17:05:43 +0000 UTC" firstStartedPulling="2026-03-12 17:05:44.975641377 +0000 UTC m=+887.516952716" lastFinishedPulling="2026-03-12 17:05:56.240433784 +0000 UTC m=+898.781745123" observedRunningTime="2026-03-12 17:05:57.654093633 +0000 UTC m=+900.195404992" watchObservedRunningTime="2026-03-12 17:05:57.67806462 +0000 UTC m=+900.219375959" Mar 12 17:05:57 crc kubenswrapper[5184]: I0312 17:05:57.682317 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-785ff4d9b5-jvgbj" podStartSLOduration=3.998926639 podStartE2EDuration="14.682303695s" podCreationTimestamp="2026-03-12 17:05:43 +0000 UTC" firstStartedPulling="2026-03-12 17:05:45.541242968 +0000 UTC m=+888.082554307" lastFinishedPulling="2026-03-12 17:05:56.224620004 +0000 UTC m=+898.765931363" observedRunningTime="2026-03-12 17:05:57.675683295 +0000 UTC m=+900.216994634" watchObservedRunningTime="2026-03-12 17:05:57.682303695 +0000 UTC m=+900.223615034" Mar 12 17:05:57 crc kubenswrapper[5184]: I0312 
17:05:57.701590 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-6f84c59bb4-9vdqk" podStartSLOduration=4.072847233 podStartE2EDuration="14.701574103s" podCreationTimestamp="2026-03-12 17:05:43 +0000 UTC" firstStartedPulling="2026-03-12 17:05:45.556243652 +0000 UTC m=+888.097554991" lastFinishedPulling="2026-03-12 17:05:56.184970532 +0000 UTC m=+898.726281861" observedRunningTime="2026-03-12 17:05:57.699727745 +0000 UTC m=+900.241039084" watchObservedRunningTime="2026-03-12 17:05:57.701574103 +0000 UTC m=+900.242885442" Mar 12 17:05:57 crc kubenswrapper[5184]: I0312 17:05:57.722012 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-8664bfd6f-t7d7f" podStartSLOduration=4.08601675 podStartE2EDuration="14.721993429s" podCreationTimestamp="2026-03-12 17:05:43 +0000 UTC" firstStartedPulling="2026-03-12 17:05:45.554338782 +0000 UTC m=+888.095650121" lastFinishedPulling="2026-03-12 17:05:56.190315461 +0000 UTC m=+898.731626800" observedRunningTime="2026-03-12 17:05:57.716083612 +0000 UTC m=+900.257394951" watchObservedRunningTime="2026-03-12 17:05:57.721993429 +0000 UTC m=+900.263304768" Mar 12 17:05:57 crc kubenswrapper[5184]: I0312 17:05:57.756631 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69d9587945-trclj" podStartSLOduration=3.547839765 podStartE2EDuration="14.756614492s" podCreationTimestamp="2026-03-12 17:05:43 +0000 UTC" firstStartedPulling="2026-03-12 17:05:44.989282127 +0000 UTC m=+887.530593467" lastFinishedPulling="2026-03-12 17:05:56.198056855 +0000 UTC m=+898.739368194" observedRunningTime="2026-03-12 17:05:57.752338188 +0000 UTC m=+900.293649527" watchObservedRunningTime="2026-03-12 17:05:57.756614492 +0000 UTC m=+900.297925831" Mar 12 17:05:57 crc kubenswrapper[5184]: I0312 17:05:57.808415 5184 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-68b4f9dfcc-twj9v" podStartSLOduration=3.726891383 podStartE2EDuration="14.808395999s" podCreationTimestamp="2026-03-12 17:05:43 +0000 UTC" firstStartedPulling="2026-03-12 17:05:45.140488335 +0000 UTC m=+887.681799674" lastFinishedPulling="2026-03-12 17:05:56.221992951 +0000 UTC m=+898.763304290" observedRunningTime="2026-03-12 17:05:57.804527847 +0000 UTC m=+900.345839186" watchObservedRunningTime="2026-03-12 17:05:57.808395999 +0000 UTC m=+900.349707338" Mar 12 17:05:57 crc kubenswrapper[5184]: I0312 17:05:57.808522 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-776f58c496-nxd4w" podStartSLOduration=3.503380281 podStartE2EDuration="14.808517753s" podCreationTimestamp="2026-03-12 17:05:43 +0000 UTC" firstStartedPulling="2026-03-12 17:05:44.873091296 +0000 UTC m=+887.414402635" lastFinishedPulling="2026-03-12 17:05:56.178228768 +0000 UTC m=+898.719540107" observedRunningTime="2026-03-12 17:05:57.776221112 +0000 UTC m=+900.317532451" watchObservedRunningTime="2026-03-12 17:05:57.808517753 +0000 UTC m=+900.349829092" Mar 12 17:05:57 crc kubenswrapper[5184]: I0312 17:05:57.834798 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-8558b89bff-8v768" podStartSLOduration=4.138495199 podStartE2EDuration="14.834779242s" podCreationTimestamp="2026-03-12 17:05:43 +0000 UTC" firstStartedPulling="2026-03-12 17:05:45.52324383 +0000 UTC m=+888.064555169" lastFinishedPulling="2026-03-12 17:05:56.219527883 +0000 UTC m=+898.760839212" observedRunningTime="2026-03-12 17:05:57.828030979 +0000 UTC m=+900.369342318" watchObservedRunningTime="2026-03-12 17:05:57.834779242 +0000 UTC m=+900.376090581" Mar 12 17:05:57 crc kubenswrapper[5184]: I0312 17:05:57.859479 5184 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-849569668d-fm84v" podStartSLOduration=3.713100808 podStartE2EDuration="14.859461132s" podCreationTimestamp="2026-03-12 17:05:43 +0000 UTC" firstStartedPulling="2026-03-12 17:05:45.11846347 +0000 UTC m=+887.659774809" lastFinishedPulling="2026-03-12 17:05:56.264823774 +0000 UTC m=+898.806135133" observedRunningTime="2026-03-12 17:05:57.855200648 +0000 UTC m=+900.396511987" watchObservedRunningTime="2026-03-12 17:05:57.859461132 +0000 UTC m=+900.400772471" Mar 12 17:05:57 crc kubenswrapper[5184]: I0312 17:05:57.880502 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-579966755f-k6ws5" podStartSLOduration=3.791569976 podStartE2EDuration="14.880484817s" podCreationTimestamp="2026-03-12 17:05:43 +0000 UTC" firstStartedPulling="2026-03-12 17:05:45.135106095 +0000 UTC m=+887.676417434" lastFinishedPulling="2026-03-12 17:05:56.224020946 +0000 UTC m=+898.765332275" observedRunningTime="2026-03-12 17:05:57.873243348 +0000 UTC m=+900.414554687" watchObservedRunningTime="2026-03-12 17:05:57.880484817 +0000 UTC m=+900.421796156" Mar 12 17:05:57 crc kubenswrapper[5184]: I0312 17:05:57.890804 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-865956cc65-vqfsg" podStartSLOduration=4.246109309 podStartE2EDuration="14.890787892s" podCreationTimestamp="2026-03-12 17:05:43 +0000 UTC" firstStartedPulling="2026-03-12 17:05:45.553392322 +0000 UTC m=+888.094703651" lastFinishedPulling="2026-03-12 17:05:56.198070885 +0000 UTC m=+898.739382234" observedRunningTime="2026-03-12 17:05:57.887921921 +0000 UTC m=+900.429233250" watchObservedRunningTime="2026-03-12 17:05:57.890787892 +0000 UTC m=+900.432099231" Mar 12 17:05:57 crc kubenswrapper[5184]: I0312 17:05:57.907696 5184 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-c845c877d-sh7h4" podStartSLOduration=4.213062985 podStartE2EDuration="14.907680546s" podCreationTimestamp="2026-03-12 17:05:43 +0000 UTC" firstStartedPulling="2026-03-12 17:05:45.523668133 +0000 UTC m=+888.064979472" lastFinishedPulling="2026-03-12 17:05:56.218285694 +0000 UTC m=+898.759597033" observedRunningTime="2026-03-12 17:05:57.905546158 +0000 UTC m=+900.446857497" watchObservedRunningTime="2026-03-12 17:05:57.907680546 +0000 UTC m=+900.448991885" Mar 12 17:05:57 crc kubenswrapper[5184]: I0312 17:05:57.931132 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-698d4c86bf-qhf48" podStartSLOduration=4.193095044 podStartE2EDuration="14.931112676s" podCreationTimestamp="2026-03-12 17:05:43 +0000 UTC" firstStartedPulling="2026-03-12 17:05:45.500556803 +0000 UTC m=+888.041868142" lastFinishedPulling="2026-03-12 17:05:56.238574435 +0000 UTC m=+898.779885774" observedRunningTime="2026-03-12 17:05:57.929691421 +0000 UTC m=+900.471002760" watchObservedRunningTime="2026-03-12 17:05:57.931112676 +0000 UTC m=+900.472424015" Mar 12 17:05:58 crc kubenswrapper[5184]: I0312 17:05:58.831758 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-99gtj_542903c2-fc88-4085-979a-db3766958392/kube-multus/0.log" Mar 12 17:05:58 crc kubenswrapper[5184]: I0312 17:05:58.838939 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-99gtj_542903c2-fc88-4085-979a-db3766958392/kube-multus/0.log" Mar 12 17:05:58 crc kubenswrapper[5184]: I0312 17:05:58.841126 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log" Mar 12 17:05:58 crc kubenswrapper[5184]: I0312 17:05:58.847394 5184 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log" Mar 12 17:05:59 crc kubenswrapper[5184]: I0312 17:05:59.684904 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a9fa8671-f968-45fd-a5bc-fe439e771792-cert\") pod \"infra-operator-controller-manager-54654cd4c7-x7588\" (UID: \"a9fa8671-f968-45fd-a5bc-fe439e771792\") " pod="openstack-operators/infra-operator-controller-manager-54654cd4c7-x7588" Mar 12 17:05:59 crc kubenswrapper[5184]: I0312 17:05:59.691837 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a9fa8671-f968-45fd-a5bc-fe439e771792-cert\") pod \"infra-operator-controller-manager-54654cd4c7-x7588\" (UID: \"a9fa8671-f968-45fd-a5bc-fe439e771792\") " pod="openstack-operators/infra-operator-controller-manager-54654cd4c7-x7588" Mar 12 17:05:59 crc kubenswrapper[5184]: I0312 17:05:59.704534 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"infra-operator-controller-manager-dockercfg-6pxpn\"" Mar 12 17:05:59 crc kubenswrapper[5184]: I0312 17:05:59.712876 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-54654cd4c7-x7588" Mar 12 17:06:00 crc kubenswrapper[5184]: I0312 17:06:00.092797 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97ae1add-72a8-4cfb-8cb4-45b33d39a1b8-cert\") pod \"openstack-baremetal-operator-controller-manager-dffd87f79-nrd8n\" (UID: \"97ae1add-72a8-4cfb-8cb4-45b33d39a1b8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dffd87f79-nrd8n" Mar 12 17:06:00 crc kubenswrapper[5184]: I0312 17:06:00.104813 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97ae1add-72a8-4cfb-8cb4-45b33d39a1b8-cert\") pod \"openstack-baremetal-operator-controller-manager-dffd87f79-nrd8n\" (UID: \"97ae1add-72a8-4cfb-8cb4-45b33d39a1b8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dffd87f79-nrd8n" Mar 12 17:06:00 crc kubenswrapper[5184]: I0312 17:06:00.132106 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555586-7mzjp"] Mar 12 17:06:00 crc kubenswrapper[5184]: I0312 17:06:00.150883 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555586-7mzjp"] Mar 12 17:06:00 crc kubenswrapper[5184]: I0312 17:06:00.151050 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555586-7mzjp" Mar 12 17:06:00 crc kubenswrapper[5184]: I0312 17:06:00.154000 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\"" Mar 12 17:06:00 crc kubenswrapper[5184]: I0312 17:06:00.154142 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\"" Mar 12 17:06:00 crc kubenswrapper[5184]: I0312 17:06:00.155233 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-f4gpz\"" Mar 12 17:06:00 crc kubenswrapper[5184]: I0312 17:06:00.200920 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-54654cd4c7-x7588"] Mar 12 17:06:00 crc kubenswrapper[5184]: W0312 17:06:00.204628 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9fa8671_f968_45fd_a5bc_fe439e771792.slice/crio-3ba88e28f1d962c74d7339bdc6ddd03f85b61384022cc0aacde9b6640dc4a819 WatchSource:0}: Error finding container 3ba88e28f1d962c74d7339bdc6ddd03f85b61384022cc0aacde9b6640dc4a819: Status 404 returned error can't find the container with id 3ba88e28f1d962c74d7339bdc6ddd03f85b61384022cc0aacde9b6640dc4a819 Mar 12 17:06:00 crc kubenswrapper[5184]: I0312 17:06:00.300096 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8edf3dec-1386-4665-8a1a-ac779905f180-webhook-certs\") pod \"openstack-operator-controller-manager-58ddd4554c-m4npf\" (UID: \"8edf3dec-1386-4665-8a1a-ac779905f180\") " pod="openstack-operators/openstack-operator-controller-manager-58ddd4554c-m4npf" Mar 12 17:06:00 crc kubenswrapper[5184]: I0312 17:06:00.300242 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8edf3dec-1386-4665-8a1a-ac779905f180-metrics-certs\") pod \"openstack-operator-controller-manager-58ddd4554c-m4npf\" (UID: \"8edf3dec-1386-4665-8a1a-ac779905f180\") " pod="openstack-operators/openstack-operator-controller-manager-58ddd4554c-m4npf" Mar 12 17:06:00 crc kubenswrapper[5184]: I0312 17:06:00.300322 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xszq6\" (UniqueName: \"kubernetes.io/projected/3b98b237-d23d-424a-a15c-ec59149aba58-kube-api-access-xszq6\") pod \"auto-csr-approver-29555586-7mzjp\" (UID: \"3b98b237-d23d-424a-a15c-ec59149aba58\") " pod="openshift-infra/auto-csr-approver-29555586-7mzjp" Mar 12 17:06:00 crc kubenswrapper[5184]: E0312 17:06:00.300574 5184 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 12 17:06:00 crc kubenswrapper[5184]: E0312 17:06:00.300660 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8edf3dec-1386-4665-8a1a-ac779905f180-webhook-certs podName:8edf3dec-1386-4665-8a1a-ac779905f180 nodeName:}" failed. No retries permitted until 2026-03-12 17:06:16.300636479 +0000 UTC m=+918.841947818 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8edf3dec-1386-4665-8a1a-ac779905f180-webhook-certs") pod "openstack-operator-controller-manager-58ddd4554c-m4npf" (UID: "8edf3dec-1386-4665-8a1a-ac779905f180") : secret "webhook-server-cert" not found Mar 12 17:06:00 crc kubenswrapper[5184]: I0312 17:06:00.306399 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8edf3dec-1386-4665-8a1a-ac779905f180-metrics-certs\") pod \"openstack-operator-controller-manager-58ddd4554c-m4npf\" (UID: \"8edf3dec-1386-4665-8a1a-ac779905f180\") " pod="openstack-operators/openstack-operator-controller-manager-58ddd4554c-m4npf" Mar 12 17:06:00 crc kubenswrapper[5184]: I0312 17:06:00.398147 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"openstack-baremetal-operator-controller-manager-dockercfg-fs8x5\"" Mar 12 17:06:00 crc kubenswrapper[5184]: I0312 17:06:00.402588 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xszq6\" (UniqueName: \"kubernetes.io/projected/3b98b237-d23d-424a-a15c-ec59149aba58-kube-api-access-xszq6\") pod \"auto-csr-approver-29555586-7mzjp\" (UID: \"3b98b237-d23d-424a-a15c-ec59149aba58\") " pod="openshift-infra/auto-csr-approver-29555586-7mzjp" Mar 12 17:06:00 crc kubenswrapper[5184]: I0312 17:06:00.406577 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dffd87f79-nrd8n" Mar 12 17:06:00 crc kubenswrapper[5184]: I0312 17:06:00.424969 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xszq6\" (UniqueName: \"kubernetes.io/projected/3b98b237-d23d-424a-a15c-ec59149aba58-kube-api-access-xszq6\") pod \"auto-csr-approver-29555586-7mzjp\" (UID: \"3b98b237-d23d-424a-a15c-ec59149aba58\") " pod="openshift-infra/auto-csr-approver-29555586-7mzjp" Mar 12 17:06:00 crc kubenswrapper[5184]: I0312 17:06:00.477980 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555586-7mzjp" Mar 12 17:06:00 crc kubenswrapper[5184]: I0312 17:06:00.660903 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-54654cd4c7-x7588" event={"ID":"a9fa8671-f968-45fd-a5bc-fe439e771792","Type":"ContainerStarted","Data":"3ba88e28f1d962c74d7339bdc6ddd03f85b61384022cc0aacde9b6640dc4a819"} Mar 12 17:06:00 crc kubenswrapper[5184]: I0312 17:06:00.875864 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-dffd87f79-nrd8n"] Mar 12 17:06:00 crc kubenswrapper[5184]: W0312 17:06:00.886473 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97ae1add_72a8_4cfb_8cb4_45b33d39a1b8.slice/crio-dfac490be5542aa107ef60610c105a7cef3a4b8f3d58f6e4d37ed90a376de1d0 WatchSource:0}: Error finding container dfac490be5542aa107ef60610c105a7cef3a4b8f3d58f6e4d37ed90a376de1d0: Status 404 returned error can't find the container with id dfac490be5542aa107ef60610c105a7cef3a4b8f3d58f6e4d37ed90a376de1d0 Mar 12 17:06:00 crc kubenswrapper[5184]: I0312 17:06:00.934237 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555586-7mzjp"] Mar 12 
17:06:00 crc kubenswrapper[5184]: W0312 17:06:00.956559 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b98b237_d23d_424a_a15c_ec59149aba58.slice/crio-d5d4b37cca8739d7a21a0c5fc83b2110357745d53f57ceec0c2a3fb221b1617c WatchSource:0}: Error finding container d5d4b37cca8739d7a21a0c5fc83b2110357745d53f57ceec0c2a3fb221b1617c: Status 404 returned error can't find the container with id d5d4b37cca8739d7a21a0c5fc83b2110357745d53f57ceec0c2a3fb221b1617c Mar 12 17:06:01 crc kubenswrapper[5184]: I0312 17:06:01.673165 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dffd87f79-nrd8n" event={"ID":"97ae1add-72a8-4cfb-8cb4-45b33d39a1b8","Type":"ContainerStarted","Data":"dfac490be5542aa107ef60610c105a7cef3a4b8f3d58f6e4d37ed90a376de1d0"} Mar 12 17:06:01 crc kubenswrapper[5184]: I0312 17:06:01.674891 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555586-7mzjp" event={"ID":"3b98b237-d23d-424a-a15c-ec59149aba58","Type":"ContainerStarted","Data":"d5d4b37cca8739d7a21a0c5fc83b2110357745d53f57ceec0c2a3fb221b1617c"} Mar 12 17:06:08 crc kubenswrapper[5184]: I0312 17:06:08.645970 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-8664bfd6f-t7d7f" Mar 12 17:06:08 crc kubenswrapper[5184]: I0312 17:06:08.646757 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-8558b89bff-8v768" Mar 12 17:06:08 crc kubenswrapper[5184]: I0312 17:06:08.646792 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-776f58c496-nxd4w" Mar 12 17:06:08 crc kubenswrapper[5184]: I0312 17:06:08.646822 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/ironic-operator-controller-manager-579966755f-k6ws5" Mar 12 17:06:08 crc kubenswrapper[5184]: I0312 17:06:08.646852 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-849569668d-fm84v" Mar 12 17:06:08 crc kubenswrapper[5184]: I0312 17:06:08.649441 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-698d4c86bf-qhf48" Mar 12 17:06:08 crc kubenswrapper[5184]: I0312 17:06:08.649557 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-865956cc65-vqfsg" Mar 12 17:06:08 crc kubenswrapper[5184]: I0312 17:06:08.649601 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-785ff4d9b5-jvgbj" Mar 12 17:06:08 crc kubenswrapper[5184]: I0312 17:06:08.650563 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-68b4f9dfcc-twj9v" Mar 12 17:06:08 crc kubenswrapper[5184]: I0312 17:06:08.650649 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69d9587945-trclj" Mar 12 17:06:09 crc kubenswrapper[5184]: I0312 17:06:09.676628 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-6f84c59bb4-9vdqk" Mar 12 17:06:09 crc kubenswrapper[5184]: I0312 17:06:09.676900 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-c845c877d-sh7h4" Mar 12 17:06:09 crc kubenswrapper[5184]: I0312 17:06:09.677063 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/cinder-operator-controller-manager-6564988d95-s8gjk" Mar 12 17:06:16 crc kubenswrapper[5184]: I0312 17:06:16.383000 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8edf3dec-1386-4665-8a1a-ac779905f180-webhook-certs\") pod \"openstack-operator-controller-manager-58ddd4554c-m4npf\" (UID: \"8edf3dec-1386-4665-8a1a-ac779905f180\") " pod="openstack-operators/openstack-operator-controller-manager-58ddd4554c-m4npf" Mar 12 17:06:16 crc kubenswrapper[5184]: I0312 17:06:16.394464 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8edf3dec-1386-4665-8a1a-ac779905f180-webhook-certs\") pod \"openstack-operator-controller-manager-58ddd4554c-m4npf\" (UID: \"8edf3dec-1386-4665-8a1a-ac779905f180\") " pod="openstack-operators/openstack-operator-controller-manager-58ddd4554c-m4npf" Mar 12 17:06:16 crc kubenswrapper[5184]: I0312 17:06:16.655580 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack-operators\"/\"openstack-operator-controller-manager-dockercfg-vqggv\"" Mar 12 17:06:16 crc kubenswrapper[5184]: I0312 17:06:16.663160 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-58ddd4554c-m4npf" Mar 12 17:06:18 crc kubenswrapper[5184]: I0312 17:06:18.576959 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-58ddd4554c-m4npf"] Mar 12 17:06:18 crc kubenswrapper[5184]: I0312 17:06:18.825695 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-54654cd4c7-x7588" event={"ID":"a9fa8671-f968-45fd-a5bc-fe439e771792","Type":"ContainerStarted","Data":"95a5f805a34845f77c787c9c1edee3f0d97e372dea4263f64594e625709d9ddb"} Mar 12 17:06:18 crc kubenswrapper[5184]: I0312 17:06:18.826624 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/infra-operator-controller-manager-54654cd4c7-x7588" Mar 12 17:06:18 crc kubenswrapper[5184]: I0312 17:06:18.827576 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-74d567479f-nrs7s" event={"ID":"be48e16f-42da-4910-81d4-1c10498247f7","Type":"ContainerStarted","Data":"51fc46ba0c0ab5098bab90f5f88c29df4c1812abaf853b1ef84527af7a5c9f37"} Mar 12 17:06:18 crc kubenswrapper[5184]: I0312 17:06:18.827939 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/test-operator-controller-manager-74d567479f-nrs7s" Mar 12 17:06:18 crc kubenswrapper[5184]: I0312 17:06:18.828980 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555586-7mzjp" event={"ID":"3b98b237-d23d-424a-a15c-ec59149aba58","Type":"ContainerStarted","Data":"2f0b38f45ba943215b903a45946497d845e93c8ebc8188a0302f138e3bd1247b"} Mar 12 17:06:18 crc kubenswrapper[5184]: I0312 17:06:18.831264 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-85d9b55b6-v89lf" 
event={"ID":"5717f5e6-3e6e-4585-b9a6-bcc31f707080","Type":"ContainerStarted","Data":"a866e87e0bde9fa768d9581d17f1f9ba3e6c464ef42ca62a12860cff9bf11a1e"}
Mar 12 17:06:18 crc kubenswrapper[5184]: I0312 17:06:18.832743 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-847cdc49c9-7smvt" event={"ID":"f6a852ac-a01c-467a-96c7-d65549b557ad","Type":"ContainerStarted","Data":"dd6d31c901f3f63aec4b6dbd164cf4bda61abff4cf8e087183205d4e8b89d88a"}
Mar 12 17:06:18 crc kubenswrapper[5184]: I0312 17:06:18.833173 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/manila-operator-controller-manager-847cdc49c9-7smvt"
Mar 12 17:06:18 crc kubenswrapper[5184]: I0312 17:06:18.834171 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-58ddd4554c-m4npf" event={"ID":"8edf3dec-1386-4665-8a1a-ac779905f180","Type":"ContainerStarted","Data":"b47f1011cea9146fe370ec0fafbcdd6bb65ef801e9584addfaa2a49ed4efb72e"}
Mar 12 17:06:18 crc kubenswrapper[5184]: I0312 17:06:18.834205 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-58ddd4554c-m4npf" event={"ID":"8edf3dec-1386-4665-8a1a-ac779905f180","Type":"ContainerStarted","Data":"ff0bd00a4d9dde89f16a62afabea71c46442b248f43792b4b846b5895a3189dd"}
Mar 12 17:06:18 crc kubenswrapper[5184]: I0312 17:06:18.834541 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/openstack-operator-controller-manager-58ddd4554c-m4npf"
Mar 12 17:06:18 crc kubenswrapper[5184]: I0312 17:06:18.835558 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-688f7d67f5-ggjlw" event={"ID":"8e771b12-3698-427d-a93a-2293244e2171","Type":"ContainerStarted","Data":"fe892f7ae8194c115e6df3f52d03405ad5ae21326a2c721fd9731fd7a4dd440e"}
Mar 12 17:06:18 crc kubenswrapper[5184]: I0312 17:06:18.835923 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/watcher-operator-controller-manager-688f7d67f5-ggjlw"
Mar 12 17:06:18 crc kubenswrapper[5184]: I0312 17:06:18.837820 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5f84d557f9-hvp27" event={"ID":"6f38ee12-d676-43a1-9f44-d347f24dfbda","Type":"ContainerStarted","Data":"73c37c626db9a9dedf99362941068d5ee85147c15a80e0c34b321d93752fabc4"}
Mar 12 17:06:18 crc kubenswrapper[5184]: I0312 17:06:18.838014 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/nova-operator-controller-manager-5f84d557f9-hvp27"
Mar 12 17:06:18 crc kubenswrapper[5184]: I0312 17:06:18.839985 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-7dd8b74947-8jj2t" event={"ID":"c5bf50f4-0265-46ba-98ad-c8c8245664d4","Type":"ContainerStarted","Data":"d3dff67ad1938b9f2333812f2155a1bb157486738b87fb692822e61031c00843"}
Mar 12 17:06:18 crc kubenswrapper[5184]: I0312 17:06:18.840301 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/placement-operator-controller-manager-7dd8b74947-8jj2t"
Mar 12 17:06:18 crc kubenswrapper[5184]: I0312 17:06:18.841842 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dffd87f79-nrd8n" event={"ID":"97ae1add-72a8-4cfb-8cb4-45b33d39a1b8","Type":"ContainerStarted","Data":"f6a1e14c63eaf8fc0a05143752528f82a122f4c6776281299adf9f622ff6a716"}
Mar 12 17:06:18 crc kubenswrapper[5184]: I0312 17:06:18.842183 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dffd87f79-nrd8n"
Mar 12 17:06:18 crc kubenswrapper[5184]: I0312 17:06:18.843277 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd977d774-djcpf" event={"ID":"94ee0f08-539e-4c3c-a54a-46e35cd20e10","Type":"ContainerStarted","Data":"5c06ddadda8f47936c244e96ff07fa898a02a8da6a2d81c3c2d027750b12b3d4"}
Mar 12 17:06:18 crc kubenswrapper[5184]: I0312 17:06:18.843626 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack-operators/telemetry-operator-controller-manager-6cd977d774-djcpf"
Mar 12 17:06:18 crc kubenswrapper[5184]: I0312 17:06:18.879744 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-54654cd4c7-x7588" podStartSLOduration=17.958241399 podStartE2EDuration="35.879725517s" podCreationTimestamp="2026-03-12 17:05:43 +0000 UTC" firstStartedPulling="2026-03-12 17:06:00.206464743 +0000 UTC m=+902.747776082" lastFinishedPulling="2026-03-12 17:06:18.127948851 +0000 UTC m=+920.669260200" observedRunningTime="2026-03-12 17:06:18.854024418 +0000 UTC m=+921.395335757" watchObservedRunningTime="2026-03-12 17:06:18.879725517 +0000 UTC m=+921.421036856"
Mar 12 17:06:18 crc kubenswrapper[5184]: I0312 17:06:18.883006 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-58ddd4554c-m4npf" podStartSLOduration=34.8829976 podStartE2EDuration="34.8829976s" podCreationTimestamp="2026-03-12 17:05:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:06:18.877445605 +0000 UTC m=+921.418756964" watchObservedRunningTime="2026-03-12 17:06:18.8829976 +0000 UTC m=+921.424308939"
Mar 12 17:06:18 crc kubenswrapper[5184]: I0312 17:06:18.899349 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6cd977d774-djcpf" podStartSLOduration=2.359181103 podStartE2EDuration="34.899331874s" podCreationTimestamp="2026-03-12 17:05:44 +0000 UTC" firstStartedPulling="2026-03-12 17:05:45.592505508 +0000 UTC m=+888.133816847" lastFinishedPulling="2026-03-12 17:06:18.132656229 +0000 UTC m=+920.673967618" observedRunningTime="2026-03-12 17:06:18.89379068 +0000 UTC m=+921.435102019" watchObservedRunningTime="2026-03-12 17:06:18.899331874 +0000 UTC m=+921.440643213"
Mar 12 17:06:18 crc kubenswrapper[5184]: I0312 17:06:18.923852 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-85d9b55b6-v89lf" podStartSLOduration=2.385624226 podStartE2EDuration="34.923837815s" podCreationTimestamp="2026-03-12 17:05:44 +0000 UTC" firstStartedPulling="2026-03-12 17:05:45.591513637 +0000 UTC m=+888.132824976" lastFinishedPulling="2026-03-12 17:06:18.129727216 +0000 UTC m=+920.671038565" observedRunningTime="2026-03-12 17:06:18.921813622 +0000 UTC m=+921.463124961" watchObservedRunningTime="2026-03-12 17:06:18.923837815 +0000 UTC m=+921.465149154"
Mar 12 17:06:18 crc kubenswrapper[5184]: I0312 17:06:18.978368 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-847cdc49c9-7smvt" podStartSLOduration=4.249996365 podStartE2EDuration="35.978345302s" podCreationTimestamp="2026-03-12 17:05:43 +0000 UTC" firstStartedPulling="2026-03-12 17:05:45.573227549 +0000 UTC m=+888.114538898" lastFinishedPulling="2026-03-12 17:06:17.301576496 +0000 UTC m=+919.842887835" observedRunningTime="2026-03-12 17:06:18.952091555 +0000 UTC m=+921.493402974" watchObservedRunningTime="2026-03-12 17:06:18.978345302 +0000 UTC m=+921.519656641"
Mar 12 17:06:18 crc kubenswrapper[5184]: I0312 17:06:18.980399 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dffd87f79-nrd8n" podStartSLOduration=18.739856116 podStartE2EDuration="35.980387776s" podCreationTimestamp="2026-03-12 17:05:43 +0000 UTC" firstStartedPulling="2026-03-12 17:06:00.892075557 +0000 UTC m=+903.433386896" lastFinishedPulling="2026-03-12 17:06:18.132607207 +0000 UTC m=+920.673918556" observedRunningTime="2026-03-12 17:06:18.974851781 +0000 UTC m=+921.516163140" watchObservedRunningTime="2026-03-12 17:06:18.980387776 +0000 UTC m=+921.521699125"
Mar 12 17:06:19 crc kubenswrapper[5184]: I0312 17:06:19.004200 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-688f7d67f5-ggjlw" podStartSLOduration=2.506440379 podStartE2EDuration="35.004178575s" podCreationTimestamp="2026-03-12 17:05:44 +0000 UTC" firstStartedPulling="2026-03-12 17:05:45.591886818 +0000 UTC m=+888.133198157" lastFinishedPulling="2026-03-12 17:06:18.089625004 +0000 UTC m=+920.630936353" observedRunningTime="2026-03-12 17:06:18.998038672 +0000 UTC m=+921.539350021" watchObservedRunningTime="2026-03-12 17:06:19.004178575 +0000 UTC m=+921.545489924"
Mar 12 17:06:19 crc kubenswrapper[5184]: I0312 17:06:19.061046 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-74d567479f-nrs7s" podStartSLOduration=2.642806619 podStartE2EDuration="35.061022694s" podCreationTimestamp="2026-03-12 17:05:44 +0000 UTC" firstStartedPulling="2026-03-12 17:05:45.575246752 +0000 UTC m=+888.116558091" lastFinishedPulling="2026-03-12 17:06:17.993462827 +0000 UTC m=+920.534774166" observedRunningTime="2026-03-12 17:06:19.056586885 +0000 UTC m=+921.597898244" watchObservedRunningTime="2026-03-12 17:06:19.061022694 +0000 UTC m=+921.602334043"
Mar 12 17:06:19 crc kubenswrapper[5184]: I0312 17:06:19.062800 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5f84d557f9-hvp27" podStartSLOduration=3.541339975 podStartE2EDuration="36.06279047s" podCreationTimestamp="2026-03-12 17:05:43 +0000 UTC" firstStartedPulling="2026-03-12 17:05:45.570965407 +0000 UTC m=+888.112276746" lastFinishedPulling="2026-03-12 17:06:18.092415892 +0000 UTC m=+920.633727241" observedRunningTime="2026-03-12 17:06:19.036940176 +0000 UTC m=+921.578251515" watchObservedRunningTime="2026-03-12 17:06:19.06279047 +0000 UTC m=+921.604101829"
Mar 12 17:06:19 crc kubenswrapper[5184]: I0312 17:06:19.079098 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-7dd8b74947-8jj2t" podStartSLOduration=3.677173874 podStartE2EDuration="36.079081033s" podCreationTimestamp="2026-03-12 17:05:43 +0000 UTC" firstStartedPulling="2026-03-12 17:05:45.591583079 +0000 UTC m=+888.132894418" lastFinishedPulling="2026-03-12 17:06:17.993490238 +0000 UTC m=+920.534801577" observedRunningTime="2026-03-12 17:06:19.076419779 +0000 UTC m=+921.617731138" watchObservedRunningTime="2026-03-12 17:06:19.079081033 +0000 UTC m=+921.620392372"
Mar 12 17:06:19 crc kubenswrapper[5184]: I0312 17:06:19.113873 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555586-7mzjp" podStartSLOduration=1.941785011 podStartE2EDuration="19.113850247s" podCreationTimestamp="2026-03-12 17:06:00 +0000 UTC" firstStartedPulling="2026-03-12 17:06:00.957626139 +0000 UTC m=+903.498937478" lastFinishedPulling="2026-03-12 17:06:18.129691375 +0000 UTC m=+920.671002714" observedRunningTime="2026-03-12 17:06:19.10664234 +0000 UTC m=+921.647953699" watchObservedRunningTime="2026-03-12 17:06:19.113850247 +0000 UTC m=+921.655161586"
Mar 12 17:06:19 crc kubenswrapper[5184]: I0312 17:06:19.861015 5184 generic.go:358] "Generic (PLEG): container finished" podID="3b98b237-d23d-424a-a15c-ec59149aba58" containerID="2f0b38f45ba943215b903a45946497d845e93c8ebc8188a0302f138e3bd1247b" exitCode=0
Mar 12 17:06:19 crc kubenswrapper[5184]: I0312 17:06:19.862143 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555586-7mzjp" event={"ID":"3b98b237-d23d-424a-a15c-ec59149aba58","Type":"ContainerDied","Data":"2f0b38f45ba943215b903a45946497d845e93c8ebc8188a0302f138e3bd1247b"}
Mar 12 17:06:20 crc kubenswrapper[5184]: I0312 17:06:20.742498 5184 patch_prober.go:28] interesting pod/machine-config-daemon-cp7pt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 12 17:06:20 crc kubenswrapper[5184]: I0312 17:06:20.742926 5184 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 12 17:06:20 crc kubenswrapper[5184]: I0312 17:06:20.742997 5184 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt"
Mar 12 17:06:20 crc kubenswrapper[5184]: I0312 17:06:20.744161 5184 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3dd884d50ac06fbc873c6bc95140222a52ea0e09ed17b766f377daf94c2607fe"} pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 12 17:06:20 crc kubenswrapper[5184]: I0312 17:06:20.744274 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerName="machine-config-daemon" containerID="cri-o://3dd884d50ac06fbc873c6bc95140222a52ea0e09ed17b766f377daf94c2607fe" gracePeriod=600
Mar 12 17:06:21 crc kubenswrapper[5184]: I0312 17:06:21.169525 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555586-7mzjp"
Mar 12 17:06:21 crc kubenswrapper[5184]: I0312 17:06:21.361165 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xszq6\" (UniqueName: \"kubernetes.io/projected/3b98b237-d23d-424a-a15c-ec59149aba58-kube-api-access-xszq6\") pod \"3b98b237-d23d-424a-a15c-ec59149aba58\" (UID: \"3b98b237-d23d-424a-a15c-ec59149aba58\") "
Mar 12 17:06:21 crc kubenswrapper[5184]: I0312 17:06:21.368107 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b98b237-d23d-424a-a15c-ec59149aba58-kube-api-access-xszq6" (OuterVolumeSpecName: "kube-api-access-xszq6") pod "3b98b237-d23d-424a-a15c-ec59149aba58" (UID: "3b98b237-d23d-424a-a15c-ec59149aba58"). InnerVolumeSpecName "kube-api-access-xszq6". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:06:21 crc kubenswrapper[5184]: I0312 17:06:21.462618 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xszq6\" (UniqueName: \"kubernetes.io/projected/3b98b237-d23d-424a-a15c-ec59149aba58-kube-api-access-xszq6\") on node \"crc\" DevicePath \"\""
Mar 12 17:06:21 crc kubenswrapper[5184]: I0312 17:06:21.885828 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555586-7mzjp" event={"ID":"3b98b237-d23d-424a-a15c-ec59149aba58","Type":"ContainerDied","Data":"d5d4b37cca8739d7a21a0c5fc83b2110357745d53f57ceec0c2a3fb221b1617c"}
Mar 12 17:06:21 crc kubenswrapper[5184]: I0312 17:06:21.885892 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5d4b37cca8739d7a21a0c5fc83b2110357745d53f57ceec0c2a3fb221b1617c"
Mar 12 17:06:21 crc kubenswrapper[5184]: I0312 17:06:21.885979 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555586-7mzjp"
Mar 12 17:06:21 crc kubenswrapper[5184]: I0312 17:06:21.890632 5184 generic.go:358] "Generic (PLEG): container finished" podID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerID="3dd884d50ac06fbc873c6bc95140222a52ea0e09ed17b766f377daf94c2607fe" exitCode=0
Mar 12 17:06:21 crc kubenswrapper[5184]: I0312 17:06:21.890720 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" event={"ID":"7b45c859-3d05-4214-9bd3-2952546f5dea","Type":"ContainerDied","Data":"3dd884d50ac06fbc873c6bc95140222a52ea0e09ed17b766f377daf94c2607fe"}
Mar 12 17:06:21 crc kubenswrapper[5184]: I0312 17:06:21.890795 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" event={"ID":"7b45c859-3d05-4214-9bd3-2952546f5dea","Type":"ContainerStarted","Data":"e97f86449204164890c97bdd96ba2e452210b3be2c7fc1815ab56658e4653bed"}
Mar 12 17:06:21 crc kubenswrapper[5184]: I0312 17:06:21.890834 5184 scope.go:117] "RemoveContainer" containerID="6dcddf4c82a491a243d037b62a542200cd43f90af290d25abaab07cac5e2a61e"
Mar 12 17:06:22 crc kubenswrapper[5184]: I0312 17:06:22.254482 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555580-966mx"]
Mar 12 17:06:22 crc kubenswrapper[5184]: I0312 17:06:22.265158 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555580-966mx"]
Mar 12 17:06:22 crc kubenswrapper[5184]: I0312 17:06:22.414111 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a087d4e-ec76-454a-8d5f-74144f387a03" path="/var/lib/kubelet/pods/4a087d4e-ec76-454a-8d5f-74144f387a03/volumes"
Mar 12 17:06:29 crc kubenswrapper[5184]: I0312 17:06:29.866806 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-7dd8b74947-8jj2t"
Mar 12 17:06:29 crc kubenswrapper[5184]: I0312 17:06:29.867791 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-847cdc49c9-7smvt"
Mar 12 17:06:29 crc kubenswrapper[5184]: I0312 17:06:29.867867 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5f84d557f9-hvp27"
Mar 12 17:06:29 crc kubenswrapper[5184]: I0312 17:06:29.869604 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-74d567479f-nrs7s"
Mar 12 17:06:29 crc kubenswrapper[5184]: I0312 17:06:29.870362 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6cd977d774-djcpf"
Mar 12 17:06:29 crc kubenswrapper[5184]: I0312 17:06:29.870548 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-688f7d67f5-ggjlw"
Mar 12 17:06:30 crc kubenswrapper[5184]: I0312 17:06:30.878779 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dffd87f79-nrd8n"
Mar 12 17:06:30 crc kubenswrapper[5184]: I0312 17:06:30.878876 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-54654cd4c7-x7588"
Mar 12 17:06:30 crc kubenswrapper[5184]: I0312 17:06:30.880520 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-58ddd4554c-m4npf"
Mar 12 17:06:48 crc kubenswrapper[5184]: E0312 17:06:48.588088 5184 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1147744 actualBytes=10240
Mar 12 17:06:49 crc kubenswrapper[5184]: I0312 17:06:49.241038 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5dbf879849-dqv7z"]
Mar 12 17:06:49 crc kubenswrapper[5184]: I0312 17:06:49.242181 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3b98b237-d23d-424a-a15c-ec59149aba58" containerName="oc"
Mar 12 17:06:49 crc kubenswrapper[5184]: I0312 17:06:49.242207 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b98b237-d23d-424a-a15c-ec59149aba58" containerName="oc"
Mar 12 17:06:49 crc kubenswrapper[5184]: I0312 17:06:49.242409 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="3b98b237-d23d-424a-a15c-ec59149aba58" containerName="oc"
Mar 12 17:06:49 crc kubenswrapper[5184]: I0312 17:06:49.278528 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dbf879849-dqv7z"]
Mar 12 17:06:49 crc kubenswrapper[5184]: I0312 17:06:49.278680 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dbf879849-dqv7z"
Mar 12 17:06:49 crc kubenswrapper[5184]: I0312 17:06:49.280956 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dnsmasq-dns-dockercfg-jrzkt\""
Mar 12 17:06:49 crc kubenswrapper[5184]: I0312 17:06:49.281010 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"dns\""
Mar 12 17:06:49 crc kubenswrapper[5184]: I0312 17:06:49.283752 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openshift-service-ca.crt\""
Mar 12 17:06:49 crc kubenswrapper[5184]: I0312 17:06:49.283965 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"kube-root-ca.crt\""
Mar 12 17:06:49 crc kubenswrapper[5184]: I0312 17:06:49.316041 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f8cccd557-57kzz"]
Mar 12 17:06:49 crc kubenswrapper[5184]: I0312 17:06:49.342496 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2dvj\" (UniqueName: \"kubernetes.io/projected/0859b4b6-893b-4163-980f-79c27966ed84-kube-api-access-v2dvj\") pod \"dnsmasq-dns-5dbf879849-dqv7z\" (UID: \"0859b4b6-893b-4163-980f-79c27966ed84\") " pod="openstack/dnsmasq-dns-5dbf879849-dqv7z"
Mar 12 17:06:49 crc kubenswrapper[5184]: I0312 17:06:49.342634 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0859b4b6-893b-4163-980f-79c27966ed84-config\") pod \"dnsmasq-dns-5dbf879849-dqv7z\" (UID: \"0859b4b6-893b-4163-980f-79c27966ed84\") " pod="openstack/dnsmasq-dns-5dbf879849-dqv7z"
Mar 12 17:06:49 crc kubenswrapper[5184]: I0312 17:06:49.347255 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f8cccd557-57kzz"]
Mar 12 17:06:49 crc kubenswrapper[5184]: I0312 17:06:49.347492 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f8cccd557-57kzz"
Mar 12 17:06:49 crc kubenswrapper[5184]: I0312 17:06:49.350829 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"dns-svc\""
Mar 12 17:06:49 crc kubenswrapper[5184]: I0312 17:06:49.443634 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2dvj\" (UniqueName: \"kubernetes.io/projected/0859b4b6-893b-4163-980f-79c27966ed84-kube-api-access-v2dvj\") pod \"dnsmasq-dns-5dbf879849-dqv7z\" (UID: \"0859b4b6-893b-4163-980f-79c27966ed84\") " pod="openstack/dnsmasq-dns-5dbf879849-dqv7z"
Mar 12 17:06:49 crc kubenswrapper[5184]: I0312 17:06:49.444154 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0859b4b6-893b-4163-980f-79c27966ed84-config\") pod \"dnsmasq-dns-5dbf879849-dqv7z\" (UID: \"0859b4b6-893b-4163-980f-79c27966ed84\") " pod="openstack/dnsmasq-dns-5dbf879849-dqv7z"
Mar 12 17:06:49 crc kubenswrapper[5184]: I0312 17:06:49.445284 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0859b4b6-893b-4163-980f-79c27966ed84-config\") pod \"dnsmasq-dns-5dbf879849-dqv7z\" (UID: \"0859b4b6-893b-4163-980f-79c27966ed84\") " pod="openstack/dnsmasq-dns-5dbf879849-dqv7z"
Mar 12 17:06:49 crc kubenswrapper[5184]: I0312 17:06:49.462118 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2dvj\" (UniqueName: \"kubernetes.io/projected/0859b4b6-893b-4163-980f-79c27966ed84-kube-api-access-v2dvj\") pod \"dnsmasq-dns-5dbf879849-dqv7z\" (UID: \"0859b4b6-893b-4163-980f-79c27966ed84\") " pod="openstack/dnsmasq-dns-5dbf879849-dqv7z"
Mar 12 17:06:49 crc kubenswrapper[5184]: I0312 17:06:49.545531 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9b38e02-6553-49b5-a966-f266cae0c098-config\") pod \"dnsmasq-dns-6f8cccd557-57kzz\" (UID: \"d9b38e02-6553-49b5-a966-f266cae0c098\") " pod="openstack/dnsmasq-dns-6f8cccd557-57kzz"
Mar 12 17:06:49 crc kubenswrapper[5184]: I0312 17:06:49.545586 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvckt\" (UniqueName: \"kubernetes.io/projected/d9b38e02-6553-49b5-a966-f266cae0c098-kube-api-access-fvckt\") pod \"dnsmasq-dns-6f8cccd557-57kzz\" (UID: \"d9b38e02-6553-49b5-a966-f266cae0c098\") " pod="openstack/dnsmasq-dns-6f8cccd557-57kzz"
Mar 12 17:06:49 crc kubenswrapper[5184]: I0312 17:06:49.545643 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9b38e02-6553-49b5-a966-f266cae0c098-dns-svc\") pod \"dnsmasq-dns-6f8cccd557-57kzz\" (UID: \"d9b38e02-6553-49b5-a966-f266cae0c098\") " pod="openstack/dnsmasq-dns-6f8cccd557-57kzz"
Mar 12 17:06:49 crc kubenswrapper[5184]: I0312 17:06:49.596812 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dbf879849-dqv7z"
Mar 12 17:06:49 crc kubenswrapper[5184]: I0312 17:06:49.646930 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9b38e02-6553-49b5-a966-f266cae0c098-config\") pod \"dnsmasq-dns-6f8cccd557-57kzz\" (UID: \"d9b38e02-6553-49b5-a966-f266cae0c098\") " pod="openstack/dnsmasq-dns-6f8cccd557-57kzz"
Mar 12 17:06:49 crc kubenswrapper[5184]: I0312 17:06:49.647200 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fvckt\" (UniqueName: \"kubernetes.io/projected/d9b38e02-6553-49b5-a966-f266cae0c098-kube-api-access-fvckt\") pod \"dnsmasq-dns-6f8cccd557-57kzz\" (UID: \"d9b38e02-6553-49b5-a966-f266cae0c098\") " pod="openstack/dnsmasq-dns-6f8cccd557-57kzz"
Mar 12 17:06:49 crc kubenswrapper[5184]: I0312 17:06:49.647394 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9b38e02-6553-49b5-a966-f266cae0c098-dns-svc\") pod \"dnsmasq-dns-6f8cccd557-57kzz\" (UID: \"d9b38e02-6553-49b5-a966-f266cae0c098\") " pod="openstack/dnsmasq-dns-6f8cccd557-57kzz"
Mar 12 17:06:49 crc kubenswrapper[5184]: I0312 17:06:49.648457 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9b38e02-6553-49b5-a966-f266cae0c098-config\") pod \"dnsmasq-dns-6f8cccd557-57kzz\" (UID: \"d9b38e02-6553-49b5-a966-f266cae0c098\") " pod="openstack/dnsmasq-dns-6f8cccd557-57kzz"
Mar 12 17:06:49 crc kubenswrapper[5184]: I0312 17:06:49.648805 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9b38e02-6553-49b5-a966-f266cae0c098-dns-svc\") pod \"dnsmasq-dns-6f8cccd557-57kzz\" (UID: \"d9b38e02-6553-49b5-a966-f266cae0c098\") " pod="openstack/dnsmasq-dns-6f8cccd557-57kzz"
Mar 12 17:06:49 crc kubenswrapper[5184]: I0312 17:06:49.690857 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvckt\" (UniqueName: \"kubernetes.io/projected/d9b38e02-6553-49b5-a966-f266cae0c098-kube-api-access-fvckt\") pod \"dnsmasq-dns-6f8cccd557-57kzz\" (UID: \"d9b38e02-6553-49b5-a966-f266cae0c098\") " pod="openstack/dnsmasq-dns-6f8cccd557-57kzz"
Mar 12 17:06:49 crc kubenswrapper[5184]: I0312 17:06:49.833146 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dbf879849-dqv7z"]
Mar 12 17:06:49 crc kubenswrapper[5184]: I0312 17:06:49.844444 5184 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 12 17:06:49 crc kubenswrapper[5184]: I0312 17:06:49.965794 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f8cccd557-57kzz"
Mar 12 17:06:50 crc kubenswrapper[5184]: I0312 17:06:50.165106 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dbf879849-dqv7z" event={"ID":"0859b4b6-893b-4163-980f-79c27966ed84","Type":"ContainerStarted","Data":"75fa6fde71f4cd21d4357cfb3369d0360219a73808b8fa5c172c4f8fac758728"}
Mar 12 17:06:50 crc kubenswrapper[5184]: I0312 17:06:50.174220 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f8cccd557-57kzz"]
Mar 12 17:06:50 crc kubenswrapper[5184]: W0312 17:06:50.177888 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9b38e02_6553_49b5_a966_f266cae0c098.slice/crio-a5c60b3dc5b0f97a3c8a36852a35fd067e5029cf9daaf06c42dcecd35afca434 WatchSource:0}: Error finding container a5c60b3dc5b0f97a3c8a36852a35fd067e5029cf9daaf06c42dcecd35afca434: Status 404 returned error can't find the container with id a5c60b3dc5b0f97a3c8a36852a35fd067e5029cf9daaf06c42dcecd35afca434
Mar 12 17:06:51 crc kubenswrapper[5184]: I0312 17:06:51.173244 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f8cccd557-57kzz" event={"ID":"d9b38e02-6553-49b5-a966-f266cae0c098","Type":"ContainerStarted","Data":"a5c60b3dc5b0f97a3c8a36852a35fd067e5029cf9daaf06c42dcecd35afca434"}
Mar 12 17:06:52 crc kubenswrapper[5184]: I0312 17:06:52.032457 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dbf879849-dqv7z"]
Mar 12 17:06:52 crc kubenswrapper[5184]: I0312 17:06:52.055258 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-588bd8c8c5-67jsf"]
Mar 12 17:06:52 crc kubenswrapper[5184]: I0312 17:06:52.075452 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-588bd8c8c5-67jsf"]
Mar 12 17:06:52 crc kubenswrapper[5184]: I0312 17:06:52.075644 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-588bd8c8c5-67jsf"
Mar 12 17:06:52 crc kubenswrapper[5184]: I0312 17:06:52.179340 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt6hg\" (UniqueName: \"kubernetes.io/projected/a6cce4a5-d42c-4599-9319-30e850b844f5-kube-api-access-xt6hg\") pod \"dnsmasq-dns-588bd8c8c5-67jsf\" (UID: \"a6cce4a5-d42c-4599-9319-30e850b844f5\") " pod="openstack/dnsmasq-dns-588bd8c8c5-67jsf"
Mar 12 17:06:52 crc kubenswrapper[5184]: I0312 17:06:52.180034 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6cce4a5-d42c-4599-9319-30e850b844f5-config\") pod \"dnsmasq-dns-588bd8c8c5-67jsf\" (UID: \"a6cce4a5-d42c-4599-9319-30e850b844f5\") " pod="openstack/dnsmasq-dns-588bd8c8c5-67jsf"
Mar 12 17:06:52 crc kubenswrapper[5184]: I0312 17:06:52.180160 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6cce4a5-d42c-4599-9319-30e850b844f5-dns-svc\") pod \"dnsmasq-dns-588bd8c8c5-67jsf\" (UID: \"a6cce4a5-d42c-4599-9319-30e850b844f5\") " pod="openstack/dnsmasq-dns-588bd8c8c5-67jsf"
Mar 12 17:06:52 crc kubenswrapper[5184]: I0312 17:06:52.281648 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xt6hg\" (UniqueName: \"kubernetes.io/projected/a6cce4a5-d42c-4599-9319-30e850b844f5-kube-api-access-xt6hg\") pod \"dnsmasq-dns-588bd8c8c5-67jsf\" (UID: \"a6cce4a5-d42c-4599-9319-30e850b844f5\") " pod="openstack/dnsmasq-dns-588bd8c8c5-67jsf"
Mar 12 17:06:52 crc kubenswrapper[5184]: I0312 17:06:52.281702 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6cce4a5-d42c-4599-9319-30e850b844f5-config\") pod \"dnsmasq-dns-588bd8c8c5-67jsf\" (UID: \"a6cce4a5-d42c-4599-9319-30e850b844f5\") " pod="openstack/dnsmasq-dns-588bd8c8c5-67jsf"
Mar 12 17:06:52 crc kubenswrapper[5184]: I0312 17:06:52.281750 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6cce4a5-d42c-4599-9319-30e850b844f5-dns-svc\") pod \"dnsmasq-dns-588bd8c8c5-67jsf\" (UID: \"a6cce4a5-d42c-4599-9319-30e850b844f5\") " pod="openstack/dnsmasq-dns-588bd8c8c5-67jsf"
Mar 12 17:06:52 crc kubenswrapper[5184]: I0312 17:06:52.282641 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6cce4a5-d42c-4599-9319-30e850b844f5-dns-svc\") pod \"dnsmasq-dns-588bd8c8c5-67jsf\" (UID: \"a6cce4a5-d42c-4599-9319-30e850b844f5\") " pod="openstack/dnsmasq-dns-588bd8c8c5-67jsf"
Mar 12 17:06:52 crc kubenswrapper[5184]: I0312 17:06:52.285882 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6cce4a5-d42c-4599-9319-30e850b844f5-config\") pod \"dnsmasq-dns-588bd8c8c5-67jsf\" (UID: \"a6cce4a5-d42c-4599-9319-30e850b844f5\") " pod="openstack/dnsmasq-dns-588bd8c8c5-67jsf"
Mar 12 17:06:52 crc kubenswrapper[5184]: I0312 17:06:52.323211 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt6hg\" (UniqueName: \"kubernetes.io/projected/a6cce4a5-d42c-4599-9319-30e850b844f5-kube-api-access-xt6hg\") pod \"dnsmasq-dns-588bd8c8c5-67jsf\" (UID: \"a6cce4a5-d42c-4599-9319-30e850b844f5\") " pod="openstack/dnsmasq-dns-588bd8c8c5-67jsf"
Mar 12 17:06:52 crc kubenswrapper[5184]: I0312 17:06:52.341615 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f8cccd557-57kzz"]
Mar 12 17:06:52 crc kubenswrapper[5184]: I0312 17:06:52.367563 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6686bbb8b9-4nmdf"]
Mar 12 17:06:52 crc kubenswrapper[5184]: I0312 17:06:52.382131 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6686bbb8b9-4nmdf"]
Mar 12 17:06:52 crc kubenswrapper[5184]: I0312 17:06:52.382341 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6686bbb8b9-4nmdf"
Mar 12 17:06:52 crc kubenswrapper[5184]: I0312 17:06:52.427063 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-588bd8c8c5-67jsf"
Mar 12 17:06:52 crc kubenswrapper[5184]: I0312 17:06:52.484924 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlvsr\" (UniqueName: \"kubernetes.io/projected/ae6b6e98-f5e4-4418-b2c1-105a52da746b-kube-api-access-hlvsr\") pod \"dnsmasq-dns-6686bbb8b9-4nmdf\" (UID: \"ae6b6e98-f5e4-4418-b2c1-105a52da746b\") " pod="openstack/dnsmasq-dns-6686bbb8b9-4nmdf"
Mar 12 17:06:52 crc kubenswrapper[5184]: I0312 17:06:52.484995 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae6b6e98-f5e4-4418-b2c1-105a52da746b-config\") pod \"dnsmasq-dns-6686bbb8b9-4nmdf\" (UID: \"ae6b6e98-f5e4-4418-b2c1-105a52da746b\") " pod="openstack/dnsmasq-dns-6686bbb8b9-4nmdf"
Mar 12 17:06:52 crc kubenswrapper[5184]: I0312 17:06:52.485062 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae6b6e98-f5e4-4418-b2c1-105a52da746b-dns-svc\") pod \"dnsmasq-dns-6686bbb8b9-4nmdf\" (UID: \"ae6b6e98-f5e4-4418-b2c1-105a52da746b\") " pod="openstack/dnsmasq-dns-6686bbb8b9-4nmdf"
Mar 12 17:06:52 crc kubenswrapper[5184]: I0312 17:06:52.586184 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae6b6e98-f5e4-4418-b2c1-105a52da746b-dns-svc\") pod \"dnsmasq-dns-6686bbb8b9-4nmdf\" (UID: \"ae6b6e98-f5e4-4418-b2c1-105a52da746b\") " pod="openstack/dnsmasq-dns-6686bbb8b9-4nmdf"
Mar 12 17:06:52 crc kubenswrapper[5184]: I0312 17:06:52.586281 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hlvsr\" (UniqueName: \"kubernetes.io/projected/ae6b6e98-f5e4-4418-b2c1-105a52da746b-kube-api-access-hlvsr\") pod \"dnsmasq-dns-6686bbb8b9-4nmdf\" (UID: \"ae6b6e98-f5e4-4418-b2c1-105a52da746b\") " pod="openstack/dnsmasq-dns-6686bbb8b9-4nmdf"
Mar 12 17:06:52 crc kubenswrapper[5184]: I0312 17:06:52.586334 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae6b6e98-f5e4-4418-b2c1-105a52da746b-config\") pod \"dnsmasq-dns-6686bbb8b9-4nmdf\" (UID: \"ae6b6e98-f5e4-4418-b2c1-105a52da746b\") " pod="openstack/dnsmasq-dns-6686bbb8b9-4nmdf"
Mar 12 17:06:52 crc kubenswrapper[5184]: I0312 17:06:52.587099 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae6b6e98-f5e4-4418-b2c1-105a52da746b-config\") pod \"dnsmasq-dns-6686bbb8b9-4nmdf\" (UID: \"ae6b6e98-f5e4-4418-b2c1-105a52da746b\") " pod="openstack/dnsmasq-dns-6686bbb8b9-4nmdf"
Mar 12 17:06:52 crc kubenswrapper[5184]: I0312 17:06:52.587156 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae6b6e98-f5e4-4418-b2c1-105a52da746b-dns-svc\") pod \"dnsmasq-dns-6686bbb8b9-4nmdf\" (UID: \"ae6b6e98-f5e4-4418-b2c1-105a52da746b\") " pod="openstack/dnsmasq-dns-6686bbb8b9-4nmdf"
Mar 12 17:06:52 crc kubenswrapper[5184]: I0312 17:06:52.607684 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlvsr\" (UniqueName: \"kubernetes.io/projected/ae6b6e98-f5e4-4418-b2c1-105a52da746b-kube-api-access-hlvsr\") pod \"dnsmasq-dns-6686bbb8b9-4nmdf\" (UID: \"ae6b6e98-f5e4-4418-b2c1-105a52da746b\") " pod="openstack/dnsmasq-dns-6686bbb8b9-4nmdf"
Mar 12 17:06:52 crc kubenswrapper[5184]: I0312 17:06:52.715603 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6686bbb8b9-4nmdf"
Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.246835 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.252919 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.253022 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.256403 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"rabbitmq-cell1-config-data\""
Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.260757 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"rabbitmq-cell1-erlang-cookie\""
Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.261352 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"rabbitmq-cell1-default-user\""
Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.261458 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"rabbitmq-cell1-plugins-conf\""
Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.261352 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"rabbitmq-cell1-server-conf\""
Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.268309 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-rabbitmq-cell1-svc\""
Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.268551 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"rabbitmq-cell1-server-dockercfg-9pbd8\""
Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.298668 5184 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/56b9c26f-b490-4262-9c35-63ee5734c634-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.298862 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/56b9c26f-b490-4262-9c35-63ee5734c634-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.298900 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwghs\" (UniqueName: \"kubernetes.io/projected/56b9c26f-b490-4262-9c35-63ee5734c634-kube-api-access-vwghs\") pod \"rabbitmq-cell1-server-0\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.298943 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56b9c26f-b490-4262-9c35-63ee5734c634-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.298969 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.299003 5184 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/56b9c26f-b490-4262-9c35-63ee5734c634-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.299030 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/56b9c26f-b490-4262-9c35-63ee5734c634-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.299079 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/56b9c26f-b490-4262-9c35-63ee5734c634-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.299102 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/56b9c26f-b490-4262-9c35-63ee5734c634-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.299129 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/56b9c26f-b490-4262-9c35-63ee5734c634-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.299513 5184 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/56b9c26f-b490-4262-9c35-63ee5734c634-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.406033 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56b9c26f-b490-4262-9c35-63ee5734c634-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.406099 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.406136 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/56b9c26f-b490-4262-9c35-63ee5734c634-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.406167 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/56b9c26f-b490-4262-9c35-63ee5734c634-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.406217 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugins-conf\" 
(UniqueName: \"kubernetes.io/configmap/56b9c26f-b490-4262-9c35-63ee5734c634-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.406243 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/56b9c26f-b490-4262-9c35-63ee5734c634-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.406260 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/56b9c26f-b490-4262-9c35-63ee5734c634-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.406303 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/56b9c26f-b490-4262-9c35-63ee5734c634-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.406340 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/56b9c26f-b490-4262-9c35-63ee5734c634-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.406384 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/56b9c26f-b490-4262-9c35-63ee5734c634-pod-info\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.406439 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vwghs\" (UniqueName: \"kubernetes.io/projected/56b9c26f-b490-4262-9c35-63ee5734c634-kube-api-access-vwghs\") pod \"rabbitmq-cell1-server-0\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.408121 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/56b9c26f-b490-4262-9c35-63ee5734c634-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.408426 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/56b9c26f-b490-4262-9c35-63ee5734c634-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.408443 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/56b9c26f-b490-4262-9c35-63ee5734c634-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.408545 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/56b9c26f-b490-4262-9c35-63ee5734c634-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.410157 5184 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.429922 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56b9c26f-b490-4262-9c35-63ee5734c634-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.429939 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/56b9c26f-b490-4262-9c35-63ee5734c634-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.436950 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/56b9c26f-b490-4262-9c35-63ee5734c634-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.437510 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwghs\" (UniqueName: \"kubernetes.io/projected/56b9c26f-b490-4262-9c35-63ee5734c634-kube-api-access-vwghs\") pod \"rabbitmq-cell1-server-0\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.439590 5184 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/56b9c26f-b490-4262-9c35-63ee5734c634-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.440596 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.453740 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/56b9c26f-b490-4262-9c35-63ee5734c634-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.508151 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.516149 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.518108 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"rabbitmq-erlang-cookie\"" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.518626 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"rabbitmq-config-data\"" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.518759 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"rabbitmq-default-user\"" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.519043 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"rabbitmq-server-dockercfg-7fzrs\"" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.519187 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-rabbitmq-svc\"" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.519321 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"rabbitmq-plugins-conf\"" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.519456 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"rabbitmq-server-conf\"" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.520248 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.587789 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.610479 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/53e57ab8-13e6-4505-a905-412d3ef88083-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") " pod="openstack/rabbitmq-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.610553 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") " pod="openstack/rabbitmq-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.610738 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/53e57ab8-13e6-4505-a905-412d3ef88083-server-conf\") pod \"rabbitmq-server-0\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") " pod="openstack/rabbitmq-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.610787 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53e57ab8-13e6-4505-a905-412d3ef88083-config-data\") pod \"rabbitmq-server-0\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") " pod="openstack/rabbitmq-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.610816 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/53e57ab8-13e6-4505-a905-412d3ef88083-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") " pod="openstack/rabbitmq-server-0" Mar 12 17:06:53 crc 
kubenswrapper[5184]: I0312 17:06:53.610914 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/53e57ab8-13e6-4505-a905-412d3ef88083-pod-info\") pod \"rabbitmq-server-0\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") " pod="openstack/rabbitmq-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.610955 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/53e57ab8-13e6-4505-a905-412d3ef88083-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") " pod="openstack/rabbitmq-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.610990 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/53e57ab8-13e6-4505-a905-412d3ef88083-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") " pod="openstack/rabbitmq-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.611087 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/53e57ab8-13e6-4505-a905-412d3ef88083-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") " pod="openstack/rabbitmq-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.611112 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/53e57ab8-13e6-4505-a905-412d3ef88083-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") " pod="openstack/rabbitmq-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.611143 5184 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r45ps\" (UniqueName: \"kubernetes.io/projected/53e57ab8-13e6-4505-a905-412d3ef88083-kube-api-access-r45ps\") pod \"rabbitmq-server-0\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") " pod="openstack/rabbitmq-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.712803 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/53e57ab8-13e6-4505-a905-412d3ef88083-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") " pod="openstack/rabbitmq-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.712847 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/53e57ab8-13e6-4505-a905-412d3ef88083-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") " pod="openstack/rabbitmq-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.713637 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/53e57ab8-13e6-4505-a905-412d3ef88083-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") " pod="openstack/rabbitmq-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.713712 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/53e57ab8-13e6-4505-a905-412d3ef88083-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") " pod="openstack/rabbitmq-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.713731 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/53e57ab8-13e6-4505-a905-412d3ef88083-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") " pod="openstack/rabbitmq-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.713750 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r45ps\" (UniqueName: \"kubernetes.io/projected/53e57ab8-13e6-4505-a905-412d3ef88083-kube-api-access-r45ps\") pod \"rabbitmq-server-0\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") " pod="openstack/rabbitmq-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.713804 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/53e57ab8-13e6-4505-a905-412d3ef88083-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") " pod="openstack/rabbitmq-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.713846 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") " pod="openstack/rabbitmq-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.713881 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/53e57ab8-13e6-4505-a905-412d3ef88083-server-conf\") pod \"rabbitmq-server-0\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") " pod="openstack/rabbitmq-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.713899 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53e57ab8-13e6-4505-a905-412d3ef88083-config-data\") pod \"rabbitmq-server-0\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") " 
pod="openstack/rabbitmq-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.713914 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/53e57ab8-13e6-4505-a905-412d3ef88083-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") " pod="openstack/rabbitmq-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.713942 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/53e57ab8-13e6-4505-a905-412d3ef88083-pod-info\") pod \"rabbitmq-server-0\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") " pod="openstack/rabbitmq-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.714853 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/53e57ab8-13e6-4505-a905-412d3ef88083-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") " pod="openstack/rabbitmq-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.715111 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/53e57ab8-13e6-4505-a905-412d3ef88083-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") " pod="openstack/rabbitmq-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.715210 5184 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.715523 5184 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/53e57ab8-13e6-4505-a905-412d3ef88083-server-conf\") pod \"rabbitmq-server-0\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") " pod="openstack/rabbitmq-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.715655 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53e57ab8-13e6-4505-a905-412d3ef88083-config-data\") pod \"rabbitmq-server-0\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") " pod="openstack/rabbitmq-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.717544 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/53e57ab8-13e6-4505-a905-412d3ef88083-pod-info\") pod \"rabbitmq-server-0\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") " pod="openstack/rabbitmq-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.721007 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/53e57ab8-13e6-4505-a905-412d3ef88083-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") " pod="openstack/rabbitmq-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.727013 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/53e57ab8-13e6-4505-a905-412d3ef88083-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") " pod="openstack/rabbitmq-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.734575 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r45ps\" (UniqueName: \"kubernetes.io/projected/53e57ab8-13e6-4505-a905-412d3ef88083-kube-api-access-r45ps\") pod \"rabbitmq-server-0\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") " 
pod="openstack/rabbitmq-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.737062 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/53e57ab8-13e6-4505-a905-412d3ef88083-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") " pod="openstack/rabbitmq-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.737191 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") " pod="openstack/rabbitmq-server-0" Mar 12 17:06:53 crc kubenswrapper[5184]: I0312 17:06:53.888343 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 12 17:06:54 crc kubenswrapper[5184]: I0312 17:06:54.657149 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 12 17:06:54 crc kubenswrapper[5184]: I0312 17:06:54.669865 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 12 17:06:54 crc kubenswrapper[5184]: I0312 17:06:54.671344 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 12 17:06:54 crc kubenswrapper[5184]: I0312 17:06:54.672595 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"galera-openstack-dockercfg-ck2sh\"" Mar 12 17:06:54 crc kubenswrapper[5184]: I0312 17:06:54.675837 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-galera-openstack-svc\"" Mar 12 17:06:54 crc kubenswrapper[5184]: I0312 17:06:54.676111 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-scripts\"" Mar 12 17:06:54 crc kubenswrapper[5184]: I0312 17:06:54.676324 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-config-data\"" Mar 12 17:06:54 crc kubenswrapper[5184]: I0312 17:06:54.680876 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"combined-ca-bundle\"" Mar 12 17:06:54 crc kubenswrapper[5184]: I0312 17:06:54.728725 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/043ee884-91ea-43b8-8b26-c8e85e3df303-config-data-generated\") pod \"openstack-galera-0\" (UID: \"043ee884-91ea-43b8-8b26-c8e85e3df303\") " pod="openstack/openstack-galera-0" Mar 12 17:06:54 crc kubenswrapper[5184]: I0312 17:06:54.728799 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/043ee884-91ea-43b8-8b26-c8e85e3df303-kolla-config\") pod \"openstack-galera-0\" (UID: \"043ee884-91ea-43b8-8b26-c8e85e3df303\") " pod="openstack/openstack-galera-0" Mar 12 17:06:54 crc kubenswrapper[5184]: I0312 17:06:54.728921 5184 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/043ee884-91ea-43b8-8b26-c8e85e3df303-config-data-default\") pod \"openstack-galera-0\" (UID: \"043ee884-91ea-43b8-8b26-c8e85e3df303\") " pod="openstack/openstack-galera-0" Mar 12 17:06:54 crc kubenswrapper[5184]: I0312 17:06:54.729064 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/043ee884-91ea-43b8-8b26-c8e85e3df303-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"043ee884-91ea-43b8-8b26-c8e85e3df303\") " pod="openstack/openstack-galera-0" Mar 12 17:06:54 crc kubenswrapper[5184]: I0312 17:06:54.729150 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/043ee884-91ea-43b8-8b26-c8e85e3df303-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"043ee884-91ea-43b8-8b26-c8e85e3df303\") " pod="openstack/openstack-galera-0" Mar 12 17:06:54 crc kubenswrapper[5184]: I0312 17:06:54.729190 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/043ee884-91ea-43b8-8b26-c8e85e3df303-operator-scripts\") pod \"openstack-galera-0\" (UID: \"043ee884-91ea-43b8-8b26-c8e85e3df303\") " pod="openstack/openstack-galera-0" Mar 12 17:06:54 crc kubenswrapper[5184]: I0312 17:06:54.729364 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plg4t\" (UniqueName: \"kubernetes.io/projected/043ee884-91ea-43b8-8b26-c8e85e3df303-kube-api-access-plg4t\") pod \"openstack-galera-0\" (UID: \"043ee884-91ea-43b8-8b26-c8e85e3df303\") " pod="openstack/openstack-galera-0" Mar 12 17:06:54 crc kubenswrapper[5184]: I0312 17:06:54.729507 5184 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"043ee884-91ea-43b8-8b26-c8e85e3df303\") " pod="openstack/openstack-galera-0" Mar 12 17:06:54 crc kubenswrapper[5184]: I0312 17:06:54.830547 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/043ee884-91ea-43b8-8b26-c8e85e3df303-config-data-default\") pod \"openstack-galera-0\" (UID: \"043ee884-91ea-43b8-8b26-c8e85e3df303\") " pod="openstack/openstack-galera-0" Mar 12 17:06:54 crc kubenswrapper[5184]: I0312 17:06:54.830895 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/043ee884-91ea-43b8-8b26-c8e85e3df303-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"043ee884-91ea-43b8-8b26-c8e85e3df303\") " pod="openstack/openstack-galera-0" Mar 12 17:06:54 crc kubenswrapper[5184]: I0312 17:06:54.831005 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/043ee884-91ea-43b8-8b26-c8e85e3df303-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"043ee884-91ea-43b8-8b26-c8e85e3df303\") " pod="openstack/openstack-galera-0" Mar 12 17:06:54 crc kubenswrapper[5184]: I0312 17:06:54.831109 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/043ee884-91ea-43b8-8b26-c8e85e3df303-operator-scripts\") pod \"openstack-galera-0\" (UID: \"043ee884-91ea-43b8-8b26-c8e85e3df303\") " pod="openstack/openstack-galera-0" Mar 12 17:06:54 crc kubenswrapper[5184]: I0312 17:06:54.831275 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-plg4t\" (UniqueName: 
\"kubernetes.io/projected/043ee884-91ea-43b8-8b26-c8e85e3df303-kube-api-access-plg4t\") pod \"openstack-galera-0\" (UID: \"043ee884-91ea-43b8-8b26-c8e85e3df303\") " pod="openstack/openstack-galera-0" Mar 12 17:06:54 crc kubenswrapper[5184]: I0312 17:06:54.831471 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"043ee884-91ea-43b8-8b26-c8e85e3df303\") " pod="openstack/openstack-galera-0" Mar 12 17:06:54 crc kubenswrapper[5184]: I0312 17:06:54.831607 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/043ee884-91ea-43b8-8b26-c8e85e3df303-config-data-generated\") pod \"openstack-galera-0\" (UID: \"043ee884-91ea-43b8-8b26-c8e85e3df303\") " pod="openstack/openstack-galera-0" Mar 12 17:06:54 crc kubenswrapper[5184]: I0312 17:06:54.831733 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/043ee884-91ea-43b8-8b26-c8e85e3df303-kolla-config\") pod \"openstack-galera-0\" (UID: \"043ee884-91ea-43b8-8b26-c8e85e3df303\") " pod="openstack/openstack-galera-0" Mar 12 17:06:54 crc kubenswrapper[5184]: I0312 17:06:54.832523 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/043ee884-91ea-43b8-8b26-c8e85e3df303-config-data-default\") pod \"openstack-galera-0\" (UID: \"043ee884-91ea-43b8-8b26-c8e85e3df303\") " pod="openstack/openstack-galera-0" Mar 12 17:06:54 crc kubenswrapper[5184]: I0312 17:06:54.833927 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/043ee884-91ea-43b8-8b26-c8e85e3df303-config-data-generated\") pod \"openstack-galera-0\" (UID: 
\"043ee884-91ea-43b8-8b26-c8e85e3df303\") " pod="openstack/openstack-galera-0" Mar 12 17:06:54 crc kubenswrapper[5184]: I0312 17:06:54.834989 5184 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"043ee884-91ea-43b8-8b26-c8e85e3df303\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-galera-0" Mar 12 17:06:54 crc kubenswrapper[5184]: I0312 17:06:54.835245 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/043ee884-91ea-43b8-8b26-c8e85e3df303-operator-scripts\") pod \"openstack-galera-0\" (UID: \"043ee884-91ea-43b8-8b26-c8e85e3df303\") " pod="openstack/openstack-galera-0" Mar 12 17:06:54 crc kubenswrapper[5184]: I0312 17:06:54.837590 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/043ee884-91ea-43b8-8b26-c8e85e3df303-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"043ee884-91ea-43b8-8b26-c8e85e3df303\") " pod="openstack/openstack-galera-0" Mar 12 17:06:54 crc kubenswrapper[5184]: I0312 17:06:54.838216 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/043ee884-91ea-43b8-8b26-c8e85e3df303-kolla-config\") pod \"openstack-galera-0\" (UID: \"043ee884-91ea-43b8-8b26-c8e85e3df303\") " pod="openstack/openstack-galera-0" Mar 12 17:06:54 crc kubenswrapper[5184]: I0312 17:06:54.839749 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/043ee884-91ea-43b8-8b26-c8e85e3df303-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"043ee884-91ea-43b8-8b26-c8e85e3df303\") " pod="openstack/openstack-galera-0" Mar 12 17:06:54 crc kubenswrapper[5184]: I0312 17:06:54.853631 5184 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-plg4t\" (UniqueName: \"kubernetes.io/projected/043ee884-91ea-43b8-8b26-c8e85e3df303-kube-api-access-plg4t\") pod \"openstack-galera-0\" (UID: \"043ee884-91ea-43b8-8b26-c8e85e3df303\") " pod="openstack/openstack-galera-0" Mar 12 17:06:54 crc kubenswrapper[5184]: I0312 17:06:54.854883 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"043ee884-91ea-43b8-8b26-c8e85e3df303\") " pod="openstack/openstack-galera-0" Mar 12 17:06:55 crc kubenswrapper[5184]: I0312 17:06:55.026011 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.013485 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.021415 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.029450 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-cell1-scripts\"" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.030523 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"galera-openstack-cell1-dockercfg-wphs5\"" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.031140 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-cell1-config-data\"" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.032127 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-galera-openstack-cell1-svc\"" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.052378 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.153702 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c743985-027b-46df-8a0d-5a246406a2d3-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9c743985-027b-46df-8a0d-5a246406a2d3\") " pod="openstack/openstack-cell1-galera-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.153792 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c743985-027b-46df-8a0d-5a246406a2d3-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9c743985-027b-46df-8a0d-5a246406a2d3\") " pod="openstack/openstack-cell1-galera-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.153832 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" 
(UniqueName: \"kubernetes.io/empty-dir/9c743985-027b-46df-8a0d-5a246406a2d3-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9c743985-027b-46df-8a0d-5a246406a2d3\") " pod="openstack/openstack-cell1-galera-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.153855 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9c743985-027b-46df-8a0d-5a246406a2d3-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9c743985-027b-46df-8a0d-5a246406a2d3\") " pod="openstack/openstack-cell1-galera-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.153903 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9c743985-027b-46df-8a0d-5a246406a2d3\") " pod="openstack/openstack-cell1-galera-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.153931 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dq56\" (UniqueName: \"kubernetes.io/projected/9c743985-027b-46df-8a0d-5a246406a2d3-kube-api-access-8dq56\") pod \"openstack-cell1-galera-0\" (UID: \"9c743985-027b-46df-8a0d-5a246406a2d3\") " pod="openstack/openstack-cell1-galera-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.154255 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c743985-027b-46df-8a0d-5a246406a2d3-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9c743985-027b-46df-8a0d-5a246406a2d3\") " pod="openstack/openstack-cell1-galera-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.154399 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9c743985-027b-46df-8a0d-5a246406a2d3-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9c743985-027b-46df-8a0d-5a246406a2d3\") " pod="openstack/openstack-cell1-galera-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.256427 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c743985-027b-46df-8a0d-5a246406a2d3-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9c743985-027b-46df-8a0d-5a246406a2d3\") " pod="openstack/openstack-cell1-galera-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.256519 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9c743985-027b-46df-8a0d-5a246406a2d3-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9c743985-027b-46df-8a0d-5a246406a2d3\") " pod="openstack/openstack-cell1-galera-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.256596 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c743985-027b-46df-8a0d-5a246406a2d3-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9c743985-027b-46df-8a0d-5a246406a2d3\") " pod="openstack/openstack-cell1-galera-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.256651 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c743985-027b-46df-8a0d-5a246406a2d3-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9c743985-027b-46df-8a0d-5a246406a2d3\") " pod="openstack/openstack-cell1-galera-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.256739 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/9c743985-027b-46df-8a0d-5a246406a2d3-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9c743985-027b-46df-8a0d-5a246406a2d3\") " pod="openstack/openstack-cell1-galera-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.256770 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9c743985-027b-46df-8a0d-5a246406a2d3-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9c743985-027b-46df-8a0d-5a246406a2d3\") " pod="openstack/openstack-cell1-galera-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.257279 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9c743985-027b-46df-8a0d-5a246406a2d3\") " pod="openstack/openstack-cell1-galera-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.257309 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8dq56\" (UniqueName: \"kubernetes.io/projected/9c743985-027b-46df-8a0d-5a246406a2d3-kube-api-access-8dq56\") pod \"openstack-cell1-galera-0\" (UID: \"9c743985-027b-46df-8a0d-5a246406a2d3\") " pod="openstack/openstack-cell1-galera-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.257308 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9c743985-027b-46df-8a0d-5a246406a2d3-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9c743985-027b-46df-8a0d-5a246406a2d3\") " pod="openstack/openstack-cell1-galera-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.257307 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9c743985-027b-46df-8a0d-5a246406a2d3-config-data-generated\") pod 
\"openstack-cell1-galera-0\" (UID: \"9c743985-027b-46df-8a0d-5a246406a2d3\") " pod="openstack/openstack-cell1-galera-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.257603 5184 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9c743985-027b-46df-8a0d-5a246406a2d3\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-cell1-galera-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.257720 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9c743985-027b-46df-8a0d-5a246406a2d3-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9c743985-027b-46df-8a0d-5a246406a2d3\") " pod="openstack/openstack-cell1-galera-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.257981 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c743985-027b-46df-8a0d-5a246406a2d3-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9c743985-027b-46df-8a0d-5a246406a2d3\") " pod="openstack/openstack-cell1-galera-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.262444 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c743985-027b-46df-8a0d-5a246406a2d3-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9c743985-027b-46df-8a0d-5a246406a2d3\") " pod="openstack/openstack-cell1-galera-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.265076 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c743985-027b-46df-8a0d-5a246406a2d3-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9c743985-027b-46df-8a0d-5a246406a2d3\") " 
pod="openstack/openstack-cell1-galera-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.273912 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dq56\" (UniqueName: \"kubernetes.io/projected/9c743985-027b-46df-8a0d-5a246406a2d3-kube-api-access-8dq56\") pod \"openstack-cell1-galera-0\" (UID: \"9c743985-027b-46df-8a0d-5a246406a2d3\") " pod="openstack/openstack-cell1-galera-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.285336 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9c743985-027b-46df-8a0d-5a246406a2d3\") " pod="openstack/openstack-cell1-galera-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.353871 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.358292 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.361457 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.364881 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"memcached-memcached-dockercfg-tg2wc\"" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.365092 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-memcached-svc\"" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.365284 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"memcached-config-data\"" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.380496 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.461558 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/01a0600d-d61f-4822-a177-fbe86d075f38-config-data\") pod \"memcached-0\" (UID: \"01a0600d-d61f-4822-a177-fbe86d075f38\") " pod="openstack/memcached-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.461644 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/01a0600d-d61f-4822-a177-fbe86d075f38-memcached-tls-certs\") pod \"memcached-0\" (UID: \"01a0600d-d61f-4822-a177-fbe86d075f38\") " pod="openstack/memcached-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.461748 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trh94\" (UniqueName: \"kubernetes.io/projected/01a0600d-d61f-4822-a177-fbe86d075f38-kube-api-access-trh94\") pod \"memcached-0\" (UID: \"01a0600d-d61f-4822-a177-fbe86d075f38\") " pod="openstack/memcached-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.461853 5184 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01a0600d-d61f-4822-a177-fbe86d075f38-combined-ca-bundle\") pod \"memcached-0\" (UID: \"01a0600d-d61f-4822-a177-fbe86d075f38\") " pod="openstack/memcached-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.461901 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/01a0600d-d61f-4822-a177-fbe86d075f38-kolla-config\") pod \"memcached-0\" (UID: \"01a0600d-d61f-4822-a177-fbe86d075f38\") " pod="openstack/memcached-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.565191 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/01a0600d-d61f-4822-a177-fbe86d075f38-config-data\") pod \"memcached-0\" (UID: \"01a0600d-d61f-4822-a177-fbe86d075f38\") " pod="openstack/memcached-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.565251 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/01a0600d-d61f-4822-a177-fbe86d075f38-memcached-tls-certs\") pod \"memcached-0\" (UID: \"01a0600d-d61f-4822-a177-fbe86d075f38\") " pod="openstack/memcached-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.565367 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-trh94\" (UniqueName: \"kubernetes.io/projected/01a0600d-d61f-4822-a177-fbe86d075f38-kube-api-access-trh94\") pod \"memcached-0\" (UID: \"01a0600d-d61f-4822-a177-fbe86d075f38\") " pod="openstack/memcached-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.565413 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01a0600d-d61f-4822-a177-fbe86d075f38-combined-ca-bundle\") 
pod \"memcached-0\" (UID: \"01a0600d-d61f-4822-a177-fbe86d075f38\") " pod="openstack/memcached-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.565455 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/01a0600d-d61f-4822-a177-fbe86d075f38-kolla-config\") pod \"memcached-0\" (UID: \"01a0600d-d61f-4822-a177-fbe86d075f38\") " pod="openstack/memcached-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.567028 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/01a0600d-d61f-4822-a177-fbe86d075f38-kolla-config\") pod \"memcached-0\" (UID: \"01a0600d-d61f-4822-a177-fbe86d075f38\") " pod="openstack/memcached-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.567385 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/01a0600d-d61f-4822-a177-fbe86d075f38-config-data\") pod \"memcached-0\" (UID: \"01a0600d-d61f-4822-a177-fbe86d075f38\") " pod="openstack/memcached-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.570130 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/01a0600d-d61f-4822-a177-fbe86d075f38-memcached-tls-certs\") pod \"memcached-0\" (UID: \"01a0600d-d61f-4822-a177-fbe86d075f38\") " pod="openstack/memcached-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.571712 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01a0600d-d61f-4822-a177-fbe86d075f38-combined-ca-bundle\") pod \"memcached-0\" (UID: \"01a0600d-d61f-4822-a177-fbe86d075f38\") " pod="openstack/memcached-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.588636 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-trh94\" (UniqueName: \"kubernetes.io/projected/01a0600d-d61f-4822-a177-fbe86d075f38-kube-api-access-trh94\") pod \"memcached-0\" (UID: \"01a0600d-d61f-4822-a177-fbe86d075f38\") " pod="openstack/memcached-0" Mar 12 17:06:56 crc kubenswrapper[5184]: I0312 17:06:56.680782 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 12 17:06:58 crc kubenswrapper[5184]: I0312 17:06:58.470315 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 12 17:06:58 crc kubenswrapper[5184]: I0312 17:06:58.608048 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 12 17:06:58 crc kubenswrapper[5184]: I0312 17:06:58.608208 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 12 17:06:58 crc kubenswrapper[5184]: I0312 17:06:58.610032 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"telemetry-ceilometer-dockercfg-p9hwj\"" Mar 12 17:06:58 crc kubenswrapper[5184]: I0312 17:06:58.702465 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28n4s\" (UniqueName: \"kubernetes.io/projected/37dd5ca0-dd94-458b-93c2-393f9c4db4b7-kube-api-access-28n4s\") pod \"kube-state-metrics-0\" (UID: \"37dd5ca0-dd94-458b-93c2-393f9c4db4b7\") " pod="openstack/kube-state-metrics-0" Mar 12 17:06:58 crc kubenswrapper[5184]: I0312 17:06:58.805419 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-28n4s\" (UniqueName: \"kubernetes.io/projected/37dd5ca0-dd94-458b-93c2-393f9c4db4b7-kube-api-access-28n4s\") pod \"kube-state-metrics-0\" (UID: \"37dd5ca0-dd94-458b-93c2-393f9c4db4b7\") " pod="openstack/kube-state-metrics-0" Mar 12 17:06:58 crc kubenswrapper[5184]: I0312 17:06:58.824115 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-28n4s\" (UniqueName: \"kubernetes.io/projected/37dd5ca0-dd94-458b-93c2-393f9c4db4b7-kube-api-access-28n4s\") pod \"kube-state-metrics-0\" (UID: \"37dd5ca0-dd94-458b-93c2-393f9c4db4b7\") " pod="openstack/kube-state-metrics-0" Mar 12 17:06:58 crc kubenswrapper[5184]: I0312 17:06:58.930978 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 12 17:06:59 crc kubenswrapper[5184]: I0312 17:06:59.218641 5184 scope.go:117] "RemoveContainer" containerID="243558ef2a97890d490b80f1334c16b699081b563491e7f2f132f0374e30649c" Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.587486 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-dq7bv"] Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.702032 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dq7bv"] Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.702068 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-vp7v2"] Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.702872 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-dq7bv"
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.705202 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"ovncontroller-scripts\""
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.705272 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ovncontroller-ovncontroller-dockercfg-42l5s\""
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.705280 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-ovncontroller-ovndbs\""
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.710192 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-vp7v2"]
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.710306 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-vp7v2"
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.749085 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/303ab57c-305d-48c2-a789-7a124144968d-var-lib\") pod \"ovn-controller-ovs-vp7v2\" (UID: \"303ab57c-305d-48c2-a789-7a124144968d\") " pod="openstack/ovn-controller-ovs-vp7v2"
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.749135 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swbng\" (UniqueName: \"kubernetes.io/projected/303ab57c-305d-48c2-a789-7a124144968d-kube-api-access-swbng\") pod \"ovn-controller-ovs-vp7v2\" (UID: \"303ab57c-305d-48c2-a789-7a124144968d\") " pod="openstack/ovn-controller-ovs-vp7v2"
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.749180 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d5a0c031-5c42-4559-96f2-82b75e70b804-var-run\") pod \"ovn-controller-dq7bv\" (UID: \"d5a0c031-5c42-4559-96f2-82b75e70b804\") " pod="openstack/ovn-controller-dq7bv"
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.749355 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/303ab57c-305d-48c2-a789-7a124144968d-scripts\") pod \"ovn-controller-ovs-vp7v2\" (UID: \"303ab57c-305d-48c2-a789-7a124144968d\") " pod="openstack/ovn-controller-ovs-vp7v2"
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.749557 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/303ab57c-305d-48c2-a789-7a124144968d-var-log\") pod \"ovn-controller-ovs-vp7v2\" (UID: \"303ab57c-305d-48c2-a789-7a124144968d\") " pod="openstack/ovn-controller-ovs-vp7v2"
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.749692 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8986\" (UniqueName: \"kubernetes.io/projected/d5a0c031-5c42-4559-96f2-82b75e70b804-kube-api-access-j8986\") pod \"ovn-controller-dq7bv\" (UID: \"d5a0c031-5c42-4559-96f2-82b75e70b804\") " pod="openstack/ovn-controller-dq7bv"
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.749781 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/303ab57c-305d-48c2-a789-7a124144968d-etc-ovs\") pod \"ovn-controller-ovs-vp7v2\" (UID: \"303ab57c-305d-48c2-a789-7a124144968d\") " pod="openstack/ovn-controller-ovs-vp7v2"
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.749877 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5a0c031-5c42-4559-96f2-82b75e70b804-ovn-controller-tls-certs\") pod \"ovn-controller-dq7bv\" (UID: \"d5a0c031-5c42-4559-96f2-82b75e70b804\") " pod="openstack/ovn-controller-dq7bv"
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.749913 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/303ab57c-305d-48c2-a789-7a124144968d-var-run\") pod \"ovn-controller-ovs-vp7v2\" (UID: \"303ab57c-305d-48c2-a789-7a124144968d\") " pod="openstack/ovn-controller-ovs-vp7v2"
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.749979 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5a0c031-5c42-4559-96f2-82b75e70b804-scripts\") pod \"ovn-controller-dq7bv\" (UID: \"d5a0c031-5c42-4559-96f2-82b75e70b804\") " pod="openstack/ovn-controller-dq7bv"
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.750081 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5a0c031-5c42-4559-96f2-82b75e70b804-combined-ca-bundle\") pod \"ovn-controller-dq7bv\" (UID: \"d5a0c031-5c42-4559-96f2-82b75e70b804\") " pod="openstack/ovn-controller-dq7bv"
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.750124 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d5a0c031-5c42-4559-96f2-82b75e70b804-var-run-ovn\") pod \"ovn-controller-dq7bv\" (UID: \"d5a0c031-5c42-4559-96f2-82b75e70b804\") " pod="openstack/ovn-controller-dq7bv"
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.750149 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d5a0c031-5c42-4559-96f2-82b75e70b804-var-log-ovn\") pod \"ovn-controller-dq7bv\" (UID: \"d5a0c031-5c42-4559-96f2-82b75e70b804\") " pod="openstack/ovn-controller-dq7bv"
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.855168 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5a0c031-5c42-4559-96f2-82b75e70b804-combined-ca-bundle\") pod \"ovn-controller-dq7bv\" (UID: \"d5a0c031-5c42-4559-96f2-82b75e70b804\") " pod="openstack/ovn-controller-dq7bv"
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.855220 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d5a0c031-5c42-4559-96f2-82b75e70b804-var-run-ovn\") pod \"ovn-controller-dq7bv\" (UID: \"d5a0c031-5c42-4559-96f2-82b75e70b804\") " pod="openstack/ovn-controller-dq7bv"
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.855242 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d5a0c031-5c42-4559-96f2-82b75e70b804-var-log-ovn\") pod \"ovn-controller-dq7bv\" (UID: \"d5a0c031-5c42-4559-96f2-82b75e70b804\") " pod="openstack/ovn-controller-dq7bv"
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.855305 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/303ab57c-305d-48c2-a789-7a124144968d-var-lib\") pod \"ovn-controller-ovs-vp7v2\" (UID: \"303ab57c-305d-48c2-a789-7a124144968d\") " pod="openstack/ovn-controller-ovs-vp7v2"
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.855343 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-swbng\" (UniqueName: \"kubernetes.io/projected/303ab57c-305d-48c2-a789-7a124144968d-kube-api-access-swbng\") pod \"ovn-controller-ovs-vp7v2\" (UID: \"303ab57c-305d-48c2-a789-7a124144968d\") " pod="openstack/ovn-controller-ovs-vp7v2"
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.855445 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d5a0c031-5c42-4559-96f2-82b75e70b804-var-run\") pod \"ovn-controller-dq7bv\" (UID: \"d5a0c031-5c42-4559-96f2-82b75e70b804\") " pod="openstack/ovn-controller-dq7bv"
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.855479 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/303ab57c-305d-48c2-a789-7a124144968d-scripts\") pod \"ovn-controller-ovs-vp7v2\" (UID: \"303ab57c-305d-48c2-a789-7a124144968d\") " pod="openstack/ovn-controller-ovs-vp7v2"
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.855515 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/303ab57c-305d-48c2-a789-7a124144968d-var-log\") pod \"ovn-controller-ovs-vp7v2\" (UID: \"303ab57c-305d-48c2-a789-7a124144968d\") " pod="openstack/ovn-controller-ovs-vp7v2"
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.855544 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j8986\" (UniqueName: \"kubernetes.io/projected/d5a0c031-5c42-4559-96f2-82b75e70b804-kube-api-access-j8986\") pod \"ovn-controller-dq7bv\" (UID: \"d5a0c031-5c42-4559-96f2-82b75e70b804\") " pod="openstack/ovn-controller-dq7bv"
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.855576 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/303ab57c-305d-48c2-a789-7a124144968d-etc-ovs\") pod \"ovn-controller-ovs-vp7v2\" (UID: \"303ab57c-305d-48c2-a789-7a124144968d\") " pod="openstack/ovn-controller-ovs-vp7v2"
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.855600 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5a0c031-5c42-4559-96f2-82b75e70b804-ovn-controller-tls-certs\") pod \"ovn-controller-dq7bv\" (UID: \"d5a0c031-5c42-4559-96f2-82b75e70b804\") " pod="openstack/ovn-controller-dq7bv"
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.855627 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/303ab57c-305d-48c2-a789-7a124144968d-var-run\") pod \"ovn-controller-ovs-vp7v2\" (UID: \"303ab57c-305d-48c2-a789-7a124144968d\") " pod="openstack/ovn-controller-ovs-vp7v2"
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.855650 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5a0c031-5c42-4559-96f2-82b75e70b804-scripts\") pod \"ovn-controller-dq7bv\" (UID: \"d5a0c031-5c42-4559-96f2-82b75e70b804\") " pod="openstack/ovn-controller-dq7bv"
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.855832 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d5a0c031-5c42-4559-96f2-82b75e70b804-var-run-ovn\") pod \"ovn-controller-dq7bv\" (UID: \"d5a0c031-5c42-4559-96f2-82b75e70b804\") " pod="openstack/ovn-controller-dq7bv"
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.857925 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5a0c031-5c42-4559-96f2-82b75e70b804-scripts\") pod \"ovn-controller-dq7bv\" (UID: \"d5a0c031-5c42-4559-96f2-82b75e70b804\") " pod="openstack/ovn-controller-dq7bv"
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.858065 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/303ab57c-305d-48c2-a789-7a124144968d-scripts\") pod \"ovn-controller-ovs-vp7v2\" (UID: \"303ab57c-305d-48c2-a789-7a124144968d\") " pod="openstack/ovn-controller-ovs-vp7v2"
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.858103 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/303ab57c-305d-48c2-a789-7a124144968d-var-log\") pod \"ovn-controller-ovs-vp7v2\" (UID: \"303ab57c-305d-48c2-a789-7a124144968d\") " pod="openstack/ovn-controller-ovs-vp7v2"
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.858207 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d5a0c031-5c42-4559-96f2-82b75e70b804-var-log-ovn\") pod \"ovn-controller-dq7bv\" (UID: \"d5a0c031-5c42-4559-96f2-82b75e70b804\") " pod="openstack/ovn-controller-dq7bv"
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.858349 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/303ab57c-305d-48c2-a789-7a124144968d-var-lib\") pod \"ovn-controller-ovs-vp7v2\" (UID: \"303ab57c-305d-48c2-a789-7a124144968d\") " pod="openstack/ovn-controller-ovs-vp7v2"
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.858910 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/303ab57c-305d-48c2-a789-7a124144968d-etc-ovs\") pod \"ovn-controller-ovs-vp7v2\" (UID: \"303ab57c-305d-48c2-a789-7a124144968d\") " pod="openstack/ovn-controller-ovs-vp7v2"
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.858973 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/303ab57c-305d-48c2-a789-7a124144968d-var-run\") pod \"ovn-controller-ovs-vp7v2\" (UID: \"303ab57c-305d-48c2-a789-7a124144968d\") " pod="openstack/ovn-controller-ovs-vp7v2"
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.859369 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d5a0c031-5c42-4559-96f2-82b75e70b804-var-run\") pod \"ovn-controller-dq7bv\" (UID: \"d5a0c031-5c42-4559-96f2-82b75e70b804\") " pod="openstack/ovn-controller-dq7bv"
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.865198 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5a0c031-5c42-4559-96f2-82b75e70b804-combined-ca-bundle\") pod \"ovn-controller-dq7bv\" (UID: \"d5a0c031-5c42-4559-96f2-82b75e70b804\") " pod="openstack/ovn-controller-dq7bv"
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.875324 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5a0c031-5c42-4559-96f2-82b75e70b804-ovn-controller-tls-certs\") pod \"ovn-controller-dq7bv\" (UID: \"d5a0c031-5c42-4559-96f2-82b75e70b804\") " pod="openstack/ovn-controller-dq7bv"
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.878662 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-swbng\" (UniqueName: \"kubernetes.io/projected/303ab57c-305d-48c2-a789-7a124144968d-kube-api-access-swbng\") pod \"ovn-controller-ovs-vp7v2\" (UID: \"303ab57c-305d-48c2-a789-7a124144968d\") " pod="openstack/ovn-controller-ovs-vp7v2"
Mar 12 17:07:01 crc kubenswrapper[5184]: I0312 17:07:01.880919 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8986\" (UniqueName: \"kubernetes.io/projected/d5a0c031-5c42-4559-96f2-82b75e70b804-kube-api-access-j8986\") pod \"ovn-controller-dq7bv\" (UID: \"d5a0c031-5c42-4559-96f2-82b75e70b804\") " pod="openstack/ovn-controller-dq7bv"
Mar 12 17:07:02 crc kubenswrapper[5184]: I0312 17:07:02.028757 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dq7bv"
Mar 12 17:07:02 crc kubenswrapper[5184]: I0312 17:07:02.038600 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-vp7v2"
Mar 12 17:07:02 crc kubenswrapper[5184]: I0312 17:07:02.203850 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 12 17:07:02 crc kubenswrapper[5184]: I0312 17:07:02.230169 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 12 17:07:02 crc kubenswrapper[5184]: I0312 17:07:02.230306 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Mar 12 17:07:02 crc kubenswrapper[5184]: I0312 17:07:02.248579 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-ovndbcluster-nb-ovndbs\""
Mar 12 17:07:02 crc kubenswrapper[5184]: I0312 17:07:02.248672 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"ovndbcluster-nb-config\""
Mar 12 17:07:02 crc kubenswrapper[5184]: I0312 17:07:02.249196 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"ovndbcluster-nb-scripts\""
Mar 12 17:07:02 crc kubenswrapper[5184]: I0312 17:07:02.248712 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ovncluster-ovndbcluster-nb-dockercfg-4dr7n\""
Mar 12 17:07:02 crc kubenswrapper[5184]: I0312 17:07:02.248991 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-ovn-metrics\""
Mar 12 17:07:02 crc kubenswrapper[5184]: I0312 17:07:02.369004 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7755768b-45c3-4cab-be56-d9be437d70d1-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"7755768b-45c3-4cab-be56-d9be437d70d1\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 17:07:02 crc kubenswrapper[5184]: I0312 17:07:02.369059 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7755768b-45c3-4cab-be56-d9be437d70d1-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7755768b-45c3-4cab-be56-d9be437d70d1\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 17:07:02 crc kubenswrapper[5184]: I0312 17:07:02.369083 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s845\" (UniqueName: \"kubernetes.io/projected/7755768b-45c3-4cab-be56-d9be437d70d1-kube-api-access-5s845\") pod \"ovsdbserver-nb-0\" (UID: \"7755768b-45c3-4cab-be56-d9be437d70d1\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 17:07:02 crc kubenswrapper[5184]: I0312 17:07:02.369178 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"7755768b-45c3-4cab-be56-d9be437d70d1\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 17:07:02 crc kubenswrapper[5184]: I0312 17:07:02.369254 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7755768b-45c3-4cab-be56-d9be437d70d1-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7755768b-45c3-4cab-be56-d9be437d70d1\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 17:07:02 crc kubenswrapper[5184]: I0312 17:07:02.369305 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7755768b-45c3-4cab-be56-d9be437d70d1-config\") pod \"ovsdbserver-nb-0\" (UID: \"7755768b-45c3-4cab-be56-d9be437d70d1\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 17:07:02 crc kubenswrapper[5184]: I0312 17:07:02.369330 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7755768b-45c3-4cab-be56-d9be437d70d1-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"7755768b-45c3-4cab-be56-d9be437d70d1\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 17:07:02 crc kubenswrapper[5184]: I0312 17:07:02.369353 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7755768b-45c3-4cab-be56-d9be437d70d1-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"7755768b-45c3-4cab-be56-d9be437d70d1\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 17:07:02 crc kubenswrapper[5184]: I0312 17:07:02.471246 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7755768b-45c3-4cab-be56-d9be437d70d1-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"7755768b-45c3-4cab-be56-d9be437d70d1\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 17:07:02 crc kubenswrapper[5184]: I0312 17:07:02.472044 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7755768b-45c3-4cab-be56-d9be437d70d1-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"7755768b-45c3-4cab-be56-d9be437d70d1\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 17:07:02 crc kubenswrapper[5184]: I0312 17:07:02.472071 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7755768b-45c3-4cab-be56-d9be437d70d1-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7755768b-45c3-4cab-be56-d9be437d70d1\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 17:07:02 crc kubenswrapper[5184]: I0312 17:07:02.472505 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5s845\" (UniqueName: \"kubernetes.io/projected/7755768b-45c3-4cab-be56-d9be437d70d1-kube-api-access-5s845\") pod \"ovsdbserver-nb-0\" (UID: \"7755768b-45c3-4cab-be56-d9be437d70d1\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 17:07:02 crc kubenswrapper[5184]: I0312 17:07:02.472770 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"7755768b-45c3-4cab-be56-d9be437d70d1\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 17:07:02 crc kubenswrapper[5184]: I0312 17:07:02.472934 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7755768b-45c3-4cab-be56-d9be437d70d1-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"7755768b-45c3-4cab-be56-d9be437d70d1\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 17:07:02 crc kubenswrapper[5184]: I0312 17:07:02.473009 5184 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"7755768b-45c3-4cab-be56-d9be437d70d1\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-nb-0"
Mar 12 17:07:02 crc kubenswrapper[5184]: I0312 17:07:02.473227 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7755768b-45c3-4cab-be56-d9be437d70d1-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7755768b-45c3-4cab-be56-d9be437d70d1\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 17:07:02 crc kubenswrapper[5184]: I0312 17:07:02.473316 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7755768b-45c3-4cab-be56-d9be437d70d1-config\") pod \"ovsdbserver-nb-0\" (UID: \"7755768b-45c3-4cab-be56-d9be437d70d1\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 17:07:02 crc kubenswrapper[5184]: I0312 17:07:02.473339 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7755768b-45c3-4cab-be56-d9be437d70d1-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"7755768b-45c3-4cab-be56-d9be437d70d1\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 17:07:02 crc kubenswrapper[5184]: I0312 17:07:02.473767 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7755768b-45c3-4cab-be56-d9be437d70d1-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"7755768b-45c3-4cab-be56-d9be437d70d1\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 17:07:02 crc kubenswrapper[5184]: I0312 17:07:02.475686 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7755768b-45c3-4cab-be56-d9be437d70d1-config\") pod \"ovsdbserver-nb-0\" (UID: \"7755768b-45c3-4cab-be56-d9be437d70d1\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 17:07:02 crc kubenswrapper[5184]: I0312 17:07:02.476968 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7755768b-45c3-4cab-be56-d9be437d70d1-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"7755768b-45c3-4cab-be56-d9be437d70d1\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 17:07:02 crc kubenswrapper[5184]: I0312 17:07:02.477807 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7755768b-45c3-4cab-be56-d9be437d70d1-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7755768b-45c3-4cab-be56-d9be437d70d1\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 17:07:02 crc kubenswrapper[5184]: I0312 17:07:02.481178 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7755768b-45c3-4cab-be56-d9be437d70d1-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7755768b-45c3-4cab-be56-d9be437d70d1\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 17:07:02 crc kubenswrapper[5184]: I0312 17:07:02.494162 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s845\" (UniqueName: \"kubernetes.io/projected/7755768b-45c3-4cab-be56-d9be437d70d1-kube-api-access-5s845\") pod \"ovsdbserver-nb-0\" (UID: \"7755768b-45c3-4cab-be56-d9be437d70d1\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 17:07:02 crc kubenswrapper[5184]: I0312 17:07:02.498615 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"7755768b-45c3-4cab-be56-d9be437d70d1\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 17:07:02 crc kubenswrapper[5184]: I0312 17:07:02.574331 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Mar 12 17:07:03 crc kubenswrapper[5184]: I0312 17:07:03.982955 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-588bd8c8c5-67jsf"]
Mar 12 17:07:04 crc kubenswrapper[5184]: I0312 17:07:04.673534 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.072429 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.076878 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 12 17:07:05 crc kubenswrapper[5184]: W0312 17:07:05.079658 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod043ee884_91ea_43b8_8b26_c8e85e3df303.slice/crio-01a0f80803938a67ec80f0de1cfa503cc137a05d21825680c71fbdb4ebf918f4 WatchSource:0}: Error finding container 01a0f80803938a67ec80f0de1cfa503cc137a05d21825680c71fbdb4ebf918f4: Status 404 returned error can't find the container with id 01a0f80803938a67ec80f0de1cfa503cc137a05d21825680c71fbdb4ebf918f4
Mar 12 17:07:05 crc kubenswrapper[5184]: W0312 17:07:05.080676 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c743985_027b_46df_8a0d_5a246406a2d3.slice/crio-83eb24bedea24b7903f7a70f2107e1fa26e5776c9d52a6e70b1dd0dd48eb27c8 WatchSource:0}: Error finding container 83eb24bedea24b7903f7a70f2107e1fa26e5776c9d52a6e70b1dd0dd48eb27c8: Status 404 returned error can't find the container with id 83eb24bedea24b7903f7a70f2107e1fa26e5776c9d52a6e70b1dd0dd48eb27c8
Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.244420 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6686bbb8b9-4nmdf"]
Mar 12 17:07:05 crc kubenswrapper[5184]: W0312 17:07:05.248348 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae6b6e98_f5e4_4418_b2c1_105a52da746b.slice/crio-43944b0ad5fcaa8b4f82bff763f805b25ed4873ab805e4828de57368f7e4ee04 WatchSource:0}: Error finding container 43944b0ad5fcaa8b4f82bff763f805b25ed4873ab805e4828de57368f7e4ee04: Status 404 returned error can't find the container with id 43944b0ad5fcaa8b4f82bff763f805b25ed4873ab805e4828de57368f7e4ee04
Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.252357 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 12 17:07:05 crc kubenswrapper[5184]: W0312 17:07:05.267700 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56b9c26f_b490_4262_9c35_63ee5734c634.slice/crio-2bac33f30ee22f3d50f182c453325bdb186d39faaffa8a93c4fe7f5e2cc8873c WatchSource:0}: Error finding container 2bac33f30ee22f3d50f182c453325bdb186d39faaffa8a93c4fe7f5e2cc8873c: Status 404 returned error can't find the container with id 2bac33f30ee22f3d50f182c453325bdb186d39faaffa8a93c4fe7f5e2cc8873c
Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.300227 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"56b9c26f-b490-4262-9c35-63ee5734c634","Type":"ContainerStarted","Data":"2bac33f30ee22f3d50f182c453325bdb186d39faaffa8a93c4fe7f5e2cc8873c"}
Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.301765 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6686bbb8b9-4nmdf" event={"ID":"ae6b6e98-f5e4-4418-b2c1-105a52da746b","Type":"ContainerStarted","Data":"43944b0ad5fcaa8b4f82bff763f805b25ed4873ab805e4828de57368f7e4ee04"}
Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.303673 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"53e57ab8-13e6-4505-a905-412d3ef88083","Type":"ContainerStarted","Data":"9bc34545e49aea34055d9d8a2ea9a93f9d8639bf6be30ec3df37f1c3666980f9"}
Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.304742 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9c743985-027b-46df-8a0d-5a246406a2d3","Type":"ContainerStarted","Data":"83eb24bedea24b7903f7a70f2107e1fa26e5776c9d52a6e70b1dd0dd48eb27c8"}
Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.310146 5184 generic.go:358] "Generic (PLEG): container finished" podID="d9b38e02-6553-49b5-a966-f266cae0c098" containerID="97b8d909065f4cbf48b8e37d1de4800137e991bac260ff92a5534e2c115a337b" exitCode=0
Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.310410 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f8cccd557-57kzz" event={"ID":"d9b38e02-6553-49b5-a966-f266cae0c098","Type":"ContainerDied","Data":"97b8d909065f4cbf48b8e37d1de4800137e991bac260ff92a5534e2c115a337b"}
Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.317837 5184 generic.go:358] "Generic (PLEG): container finished" podID="0859b4b6-893b-4163-980f-79c27966ed84" containerID="57c2c0ec579426ab25488d628beacfdbdde822545e35d09ac59d1ca1d158fd52" exitCode=0
Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.317923 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dbf879849-dqv7z" event={"ID":"0859b4b6-893b-4163-980f-79c27966ed84","Type":"ContainerDied","Data":"57c2c0ec579426ab25488d628beacfdbdde822545e35d09ac59d1ca1d158fd52"}
Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.323109 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"043ee884-91ea-43b8-8b26-c8e85e3df303","Type":"ContainerStarted","Data":"01a0f80803938a67ec80f0de1cfa503cc137a05d21825680c71fbdb4ebf918f4"}
Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.331391 5184 generic.go:358] "Generic (PLEG): container finished" podID="a6cce4a5-d42c-4599-9319-30e850b844f5" containerID="3aa5746447337e8f540aac81c42f999892b313ce2dd32bb753981198cf2bd6b6" exitCode=0
Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.331472 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-588bd8c8c5-67jsf" event={"ID":"a6cce4a5-d42c-4599-9319-30e850b844f5","Type":"ContainerDied","Data":"3aa5746447337e8f540aac81c42f999892b313ce2dd32bb753981198cf2bd6b6"}
Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.331533 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-588bd8c8c5-67jsf" event={"ID":"a6cce4a5-d42c-4599-9319-30e850b844f5","Type":"ContainerStarted","Data":"f57efa67c4449e962e273e0c64283dd360c0d66431bf7b7c50f1e42cd503c9b4"}
Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.535468 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.542285 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dq7bv"]
Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.567675 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.692301 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 12 17:07:05 crc kubenswrapper[5184]: W0312 17:07:05.696954 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7755768b_45c3_4cab_be56_d9be437d70d1.slice/crio-1803991ec308dba96ddf183469890886db16542bd91db6aa2db34603b89e1083 WatchSource:0}: Error finding container 1803991ec308dba96ddf183469890886db16542bd91db6aa2db34603b89e1083: Status 404 returned error can't find the container with id 1803991ec308dba96ddf183469890886db16542bd91db6aa2db34603b89e1083
Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.751779 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f8cccd557-57kzz"
Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.754481 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dbf879849-dqv7z"
Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.833865 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvckt\" (UniqueName: \"kubernetes.io/projected/d9b38e02-6553-49b5-a966-f266cae0c098-kube-api-access-fvckt\") pod \"d9b38e02-6553-49b5-a966-f266cae0c098\" (UID: \"d9b38e02-6553-49b5-a966-f266cae0c098\") "
Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.833957 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9b38e02-6553-49b5-a966-f266cae0c098-dns-svc\") pod \"d9b38e02-6553-49b5-a966-f266cae0c098\" (UID: \"d9b38e02-6553-49b5-a966-f266cae0c098\") "
Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.834006 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9b38e02-6553-49b5-a966-f266cae0c098-config\") pod \"d9b38e02-6553-49b5-a966-f266cae0c098\" (UID: \"d9b38e02-6553-49b5-a966-f266cae0c098\") "
Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.835022 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0859b4b6-893b-4163-980f-79c27966ed84-config\") pod \"0859b4b6-893b-4163-980f-79c27966ed84\" (UID: \"0859b4b6-893b-4163-980f-79c27966ed84\") "
Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.835064 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2dvj\" (UniqueName: \"kubernetes.io/projected/0859b4b6-893b-4163-980f-79c27966ed84-kube-api-access-v2dvj\") pod \"0859b4b6-893b-4163-980f-79c27966ed84\" (UID: \"0859b4b6-893b-4163-980f-79c27966ed84\") "
Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.840853 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0859b4b6-893b-4163-980f-79c27966ed84-kube-api-access-v2dvj" (OuterVolumeSpecName: "kube-api-access-v2dvj") pod "0859b4b6-893b-4163-980f-79c27966ed84" (UID: "0859b4b6-893b-4163-980f-79c27966ed84"). InnerVolumeSpecName "kube-api-access-v2dvj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.841093 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9b38e02-6553-49b5-a966-f266cae0c098-kube-api-access-fvckt" (OuterVolumeSpecName: "kube-api-access-fvckt") pod "d9b38e02-6553-49b5-a966-f266cae0c098" (UID: "d9b38e02-6553-49b5-a966-f266cae0c098"). InnerVolumeSpecName "kube-api-access-fvckt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.853662 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9b38e02-6553-49b5-a966-f266cae0c098-config" (OuterVolumeSpecName: "config") pod "d9b38e02-6553-49b5-a966-f266cae0c098" (UID: "d9b38e02-6553-49b5-a966-f266cae0c098"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.853749 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9b38e02-6553-49b5-a966-f266cae0c098-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d9b38e02-6553-49b5-a966-f266cae0c098" (UID: "d9b38e02-6553-49b5-a966-f266cae0c098"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.854568 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0859b4b6-893b-4163-980f-79c27966ed84-config" (OuterVolumeSpecName: "config") pod "0859b4b6-893b-4163-980f-79c27966ed84" (UID: "0859b4b6-893b-4163-980f-79c27966ed84"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.912325 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.913673 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0859b4b6-893b-4163-980f-79c27966ed84" containerName="init"
Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.913694 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="0859b4b6-893b-4163-980f-79c27966ed84" containerName="init"
Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.913787 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d9b38e02-6553-49b5-a966-f266cae0c098" containerName="init"
Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.913796 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9b38e02-6553-49b5-a966-f266cae0c098" containerName="init"
Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.913960 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="0859b4b6-893b-4163-980f-79c27966ed84" containerName="init"
Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.913980 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="d9b38e02-6553-49b5-a966-f266cae0c098" containerName="init"
Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.921854 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.922025 
5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.924337 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"ovndbcluster-sb-config\"" Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.924996 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ovncluster-ovndbcluster-sb-dockercfg-9vq47\"" Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.926140 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"ovndbcluster-sb-scripts\"" Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.926338 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-ovndbcluster-sb-ovndbs\"" Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.937509 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fvckt\" (UniqueName: \"kubernetes.io/projected/d9b38e02-6553-49b5-a966-f266cae0c098-kube-api-access-fvckt\") on node \"crc\" DevicePath \"\"" Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.937531 5184 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9b38e02-6553-49b5-a966-f266cae0c098-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.937539 5184 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9b38e02-6553-49b5-a966-f266cae0c098-config\") on node \"crc\" DevicePath \"\"" Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.937547 5184 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0859b4b6-893b-4163-980f-79c27966ed84-config\") on node \"crc\" DevicePath \"\"" Mar 12 17:07:05 crc kubenswrapper[5184]: I0312 17:07:05.937555 5184 
reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v2dvj\" (UniqueName: \"kubernetes.io/projected/0859b4b6-893b-4163-980f-79c27966ed84-kube-api-access-v2dvj\") on node \"crc\" DevicePath \"\"" Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.038990 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/22558944-a035-4296-855e-53505b918f08-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"22558944-a035-4296-855e-53505b918f08\") " pod="openstack/ovsdbserver-sb-0" Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.039049 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22558944-a035-4296-855e-53505b918f08-config\") pod \"ovsdbserver-sb-0\" (UID: \"22558944-a035-4296-855e-53505b918f08\") " pod="openstack/ovsdbserver-sb-0" Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.039095 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4xzx\" (UniqueName: \"kubernetes.io/projected/22558944-a035-4296-855e-53505b918f08-kube-api-access-g4xzx\") pod \"ovsdbserver-sb-0\" (UID: \"22558944-a035-4296-855e-53505b918f08\") " pod="openstack/ovsdbserver-sb-0" Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.039261 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/22558944-a035-4296-855e-53505b918f08-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"22558944-a035-4296-855e-53505b918f08\") " pod="openstack/ovsdbserver-sb-0" Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.039507 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/22558944-a035-4296-855e-53505b918f08-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"22558944-a035-4296-855e-53505b918f08\") " pod="openstack/ovsdbserver-sb-0" Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.039537 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22558944-a035-4296-855e-53505b918f08-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"22558944-a035-4296-855e-53505b918f08\") " pod="openstack/ovsdbserver-sb-0" Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.039572 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"22558944-a035-4296-855e-53505b918f08\") " pod="openstack/ovsdbserver-sb-0" Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.039651 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22558944-a035-4296-855e-53505b918f08-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"22558944-a035-4296-855e-53505b918f08\") " pod="openstack/ovsdbserver-sb-0" Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.141819 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/22558944-a035-4296-855e-53505b918f08-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"22558944-a035-4296-855e-53505b918f08\") " pod="openstack/ovsdbserver-sb-0" Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.141883 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22558944-a035-4296-855e-53505b918f08-config\") pod \"ovsdbserver-sb-0\" (UID: 
\"22558944-a035-4296-855e-53505b918f08\") " pod="openstack/ovsdbserver-sb-0" Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.141925 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g4xzx\" (UniqueName: \"kubernetes.io/projected/22558944-a035-4296-855e-53505b918f08-kube-api-access-g4xzx\") pod \"ovsdbserver-sb-0\" (UID: \"22558944-a035-4296-855e-53505b918f08\") " pod="openstack/ovsdbserver-sb-0" Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.141969 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/22558944-a035-4296-855e-53505b918f08-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"22558944-a035-4296-855e-53505b918f08\") " pod="openstack/ovsdbserver-sb-0" Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.142180 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/22558944-a035-4296-855e-53505b918f08-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"22558944-a035-4296-855e-53505b918f08\") " pod="openstack/ovsdbserver-sb-0" Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.142210 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22558944-a035-4296-855e-53505b918f08-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"22558944-a035-4296-855e-53505b918f08\") " pod="openstack/ovsdbserver-sb-0" Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.142238 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"22558944-a035-4296-855e-53505b918f08\") " pod="openstack/ovsdbserver-sb-0" Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.142292 5184 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22558944-a035-4296-855e-53505b918f08-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"22558944-a035-4296-855e-53505b918f08\") " pod="openstack/ovsdbserver-sb-0" Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.142811 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22558944-a035-4296-855e-53505b918f08-config\") pod \"ovsdbserver-sb-0\" (UID: \"22558944-a035-4296-855e-53505b918f08\") " pod="openstack/ovsdbserver-sb-0" Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.142805 5184 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"22558944-a035-4296-855e-53505b918f08\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-sb-0" Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.142997 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/22558944-a035-4296-855e-53505b918f08-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"22558944-a035-4296-855e-53505b918f08\") " pod="openstack/ovsdbserver-sb-0" Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.143711 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22558944-a035-4296-855e-53505b918f08-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"22558944-a035-4296-855e-53505b918f08\") " pod="openstack/ovsdbserver-sb-0" Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.146842 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/22558944-a035-4296-855e-53505b918f08-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: 
\"22558944-a035-4296-855e-53505b918f08\") " pod="openstack/ovsdbserver-sb-0" Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.146944 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/22558944-a035-4296-855e-53505b918f08-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"22558944-a035-4296-855e-53505b918f08\") " pod="openstack/ovsdbserver-sb-0" Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.152968 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22558944-a035-4296-855e-53505b918f08-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"22558944-a035-4296-855e-53505b918f08\") " pod="openstack/ovsdbserver-sb-0" Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.166449 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4xzx\" (UniqueName: \"kubernetes.io/projected/22558944-a035-4296-855e-53505b918f08-kube-api-access-g4xzx\") pod \"ovsdbserver-sb-0\" (UID: \"22558944-a035-4296-855e-53505b918f08\") " pod="openstack/ovsdbserver-sb-0" Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.174086 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"22558944-a035-4296-855e-53505b918f08\") " pod="openstack/ovsdbserver-sb-0" Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.270475 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.342156 5184 generic.go:358] "Generic (PLEG): container finished" podID="ae6b6e98-f5e4-4418-b2c1-105a52da746b" containerID="86cff2284fbea040567b46470ff9b9b785af0fa47b30e6fd9a9ef95780cdd4de" exitCode=0 Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.342262 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6686bbb8b9-4nmdf" event={"ID":"ae6b6e98-f5e4-4418-b2c1-105a52da746b","Type":"ContainerDied","Data":"86cff2284fbea040567b46470ff9b9b785af0fa47b30e6fd9a9ef95780cdd4de"} Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.345020 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"37dd5ca0-dd94-458b-93c2-393f9c4db4b7","Type":"ContainerStarted","Data":"2d2706396f0780be9dfe3bf59d0670ad011709be1e6f4de1b9254ef23bc223e5"} Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.346700 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"01a0600d-d61f-4822-a177-fbe86d075f38","Type":"ContainerStarted","Data":"c4729b7b60320aa4c4376f312564afa02e1f21c34e2f4aa9e68cc67f384483ed"} Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.348041 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dq7bv" event={"ID":"d5a0c031-5c42-4559-96f2-82b75e70b804","Type":"ContainerStarted","Data":"2e9aa3179391d36ddb5856392c065823309eaf6acac77daa7564e40dd27577ae"} Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.349656 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f8cccd557-57kzz" event={"ID":"d9b38e02-6553-49b5-a966-f266cae0c098","Type":"ContainerDied","Data":"a5c60b3dc5b0f97a3c8a36852a35fd067e5029cf9daaf06c42dcecd35afca434"} Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.349690 5184 scope.go:117] "RemoveContainer" 
containerID="97b8d909065f4cbf48b8e37d1de4800137e991bac260ff92a5534e2c115a337b" Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.349820 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f8cccd557-57kzz" Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.354013 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dbf879849-dqv7z" Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.354178 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dbf879849-dqv7z" event={"ID":"0859b4b6-893b-4163-980f-79c27966ed84","Type":"ContainerDied","Data":"75fa6fde71f4cd21d4357cfb3369d0360219a73808b8fa5c172c4f8fac758728"} Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.355634 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"7755768b-45c3-4cab-be56-d9be437d70d1","Type":"ContainerStarted","Data":"1803991ec308dba96ddf183469890886db16542bd91db6aa2db34603b89e1083"} Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.365353 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-588bd8c8c5-67jsf" event={"ID":"a6cce4a5-d42c-4599-9319-30e850b844f5","Type":"ContainerStarted","Data":"23c501bb01a1d9530c9bebbac8cdf1029b7f9e84ac59e4f2c52258f70e1429d4"} Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.366593 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/dnsmasq-dns-588bd8c8c5-67jsf" Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.388330 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-588bd8c8c5-67jsf" podStartSLOduration=13.874584042 podStartE2EDuration="14.388312285s" podCreationTimestamp="2026-03-12 17:06:52 +0000 UTC" firstStartedPulling="2026-03-12 17:07:04.450900604 +0000 UTC m=+966.992211943" lastFinishedPulling="2026-03-12 
17:07:04.964628847 +0000 UTC m=+967.505940186" observedRunningTime="2026-03-12 17:07:06.387066086 +0000 UTC m=+968.928377425" watchObservedRunningTime="2026-03-12 17:07:06.388312285 +0000 UTC m=+968.929623624" Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.432374 5184 scope.go:117] "RemoveContainer" containerID="57c2c0ec579426ab25488d628beacfdbdde822545e35d09ac59d1ca1d158fd52" Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.458409 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f8cccd557-57kzz"] Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.474037 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f8cccd557-57kzz"] Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.506043 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dbf879849-dqv7z"] Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.520801 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5dbf879849-dqv7z"] Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.560231 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-vp7v2"] Mar 12 17:07:06 crc kubenswrapper[5184]: I0312 17:07:06.847276 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 12 17:07:06 crc kubenswrapper[5184]: W0312 17:07:06.904055 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod303ab57c_305d_48c2_a789_7a124144968d.slice/crio-a7a8087ec034f4851b2da9e7ac558e9f87d9a3b4077c990f5ae1a0bc7f53c55c WatchSource:0}: Error finding container a7a8087ec034f4851b2da9e7ac558e9f87d9a3b4077c990f5ae1a0bc7f53c55c: Status 404 returned error can't find the container with id a7a8087ec034f4851b2da9e7ac558e9f87d9a3b4077c990f5ae1a0bc7f53c55c Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.106274 5184 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["openstack/ovn-controller-metrics-nx6zg"] Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.119896 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-nx6zg" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.123101 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"ovncontroller-metrics-config\"" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.158523 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/afd5d8ed-916e-4ba1-bbe8-bcc7989cdbe8-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nx6zg\" (UID: \"afd5d8ed-916e-4ba1-bbe8-bcc7989cdbe8\") " pod="openstack/ovn-controller-metrics-nx6zg" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.158641 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afd5d8ed-916e-4ba1-bbe8-bcc7989cdbe8-combined-ca-bundle\") pod \"ovn-controller-metrics-nx6zg\" (UID: \"afd5d8ed-916e-4ba1-bbe8-bcc7989cdbe8\") " pod="openstack/ovn-controller-metrics-nx6zg" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.158682 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/afd5d8ed-916e-4ba1-bbe8-bcc7989cdbe8-ovs-rundir\") pod \"ovn-controller-metrics-nx6zg\" (UID: \"afd5d8ed-916e-4ba1-bbe8-bcc7989cdbe8\") " pod="openstack/ovn-controller-metrics-nx6zg" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.158706 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/afd5d8ed-916e-4ba1-bbe8-bcc7989cdbe8-ovn-rundir\") pod \"ovn-controller-metrics-nx6zg\" (UID: 
\"afd5d8ed-916e-4ba1-bbe8-bcc7989cdbe8\") " pod="openstack/ovn-controller-metrics-nx6zg" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.158775 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afd5d8ed-916e-4ba1-bbe8-bcc7989cdbe8-config\") pod \"ovn-controller-metrics-nx6zg\" (UID: \"afd5d8ed-916e-4ba1-bbe8-bcc7989cdbe8\") " pod="openstack/ovn-controller-metrics-nx6zg" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.158848 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqn6c\" (UniqueName: \"kubernetes.io/projected/afd5d8ed-916e-4ba1-bbe8-bcc7989cdbe8-kube-api-access-lqn6c\") pod \"ovn-controller-metrics-nx6zg\" (UID: \"afd5d8ed-916e-4ba1-bbe8-bcc7989cdbe8\") " pod="openstack/ovn-controller-metrics-nx6zg" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.191428 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-nx6zg"] Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.262563 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/afd5d8ed-916e-4ba1-bbe8-bcc7989cdbe8-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nx6zg\" (UID: \"afd5d8ed-916e-4ba1-bbe8-bcc7989cdbe8\") " pod="openstack/ovn-controller-metrics-nx6zg" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.262825 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afd5d8ed-916e-4ba1-bbe8-bcc7989cdbe8-combined-ca-bundle\") pod \"ovn-controller-metrics-nx6zg\" (UID: \"afd5d8ed-916e-4ba1-bbe8-bcc7989cdbe8\") " pod="openstack/ovn-controller-metrics-nx6zg" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.262879 5184 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/afd5d8ed-916e-4ba1-bbe8-bcc7989cdbe8-ovs-rundir\") pod \"ovn-controller-metrics-nx6zg\" (UID: \"afd5d8ed-916e-4ba1-bbe8-bcc7989cdbe8\") " pod="openstack/ovn-controller-metrics-nx6zg" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.262900 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/afd5d8ed-916e-4ba1-bbe8-bcc7989cdbe8-ovn-rundir\") pod \"ovn-controller-metrics-nx6zg\" (UID: \"afd5d8ed-916e-4ba1-bbe8-bcc7989cdbe8\") " pod="openstack/ovn-controller-metrics-nx6zg" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.262937 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afd5d8ed-916e-4ba1-bbe8-bcc7989cdbe8-config\") pod \"ovn-controller-metrics-nx6zg\" (UID: \"afd5d8ed-916e-4ba1-bbe8-bcc7989cdbe8\") " pod="openstack/ovn-controller-metrics-nx6zg" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.263073 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lqn6c\" (UniqueName: \"kubernetes.io/projected/afd5d8ed-916e-4ba1-bbe8-bcc7989cdbe8-kube-api-access-lqn6c\") pod \"ovn-controller-metrics-nx6zg\" (UID: \"afd5d8ed-916e-4ba1-bbe8-bcc7989cdbe8\") " pod="openstack/ovn-controller-metrics-nx6zg" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.263575 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/afd5d8ed-916e-4ba1-bbe8-bcc7989cdbe8-ovs-rundir\") pod \"ovn-controller-metrics-nx6zg\" (UID: \"afd5d8ed-916e-4ba1-bbe8-bcc7989cdbe8\") " pod="openstack/ovn-controller-metrics-nx6zg" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.263585 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/host-path/afd5d8ed-916e-4ba1-bbe8-bcc7989cdbe8-ovn-rundir\") pod \"ovn-controller-metrics-nx6zg\" (UID: \"afd5d8ed-916e-4ba1-bbe8-bcc7989cdbe8\") " pod="openstack/ovn-controller-metrics-nx6zg" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.264307 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afd5d8ed-916e-4ba1-bbe8-bcc7989cdbe8-config\") pod \"ovn-controller-metrics-nx6zg\" (UID: \"afd5d8ed-916e-4ba1-bbe8-bcc7989cdbe8\") " pod="openstack/ovn-controller-metrics-nx6zg" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.272024 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/afd5d8ed-916e-4ba1-bbe8-bcc7989cdbe8-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nx6zg\" (UID: \"afd5d8ed-916e-4ba1-bbe8-bcc7989cdbe8\") " pod="openstack/ovn-controller-metrics-nx6zg" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.274196 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afd5d8ed-916e-4ba1-bbe8-bcc7989cdbe8-combined-ca-bundle\") pod \"ovn-controller-metrics-nx6zg\" (UID: \"afd5d8ed-916e-4ba1-bbe8-bcc7989cdbe8\") " pod="openstack/ovn-controller-metrics-nx6zg" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.285708 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqn6c\" (UniqueName: \"kubernetes.io/projected/afd5d8ed-916e-4ba1-bbe8-bcc7989cdbe8-kube-api-access-lqn6c\") pod \"ovn-controller-metrics-nx6zg\" (UID: \"afd5d8ed-916e-4ba1-bbe8-bcc7989cdbe8\") " pod="openstack/ovn-controller-metrics-nx6zg" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.311660 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-588bd8c8c5-67jsf"] Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.361356 5184 kubelet.go:2537] 
"SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-db5dbf9b5-rw74w"] Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.367666 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-db5dbf9b5-rw74w" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.369860 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"ovsdbserver-nb\"" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.394433 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-db5dbf9b5-rw74w"] Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.412788 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-vp7v2" event={"ID":"303ab57c-305d-48c2-a789-7a124144968d","Type":"ContainerStarted","Data":"a7a8087ec034f4851b2da9e7ac558e9f87d9a3b4077c990f5ae1a0bc7f53c55c"} Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.465840 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/717b8c1b-0b4b-41e1-98a4-826a25eb2ab8-ovsdbserver-nb\") pod \"dnsmasq-dns-db5dbf9b5-rw74w\" (UID: \"717b8c1b-0b4b-41e1-98a4-826a25eb2ab8\") " pod="openstack/dnsmasq-dns-db5dbf9b5-rw74w" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.465942 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/717b8c1b-0b4b-41e1-98a4-826a25eb2ab8-dns-svc\") pod \"dnsmasq-dns-db5dbf9b5-rw74w\" (UID: \"717b8c1b-0b4b-41e1-98a4-826a25eb2ab8\") " pod="openstack/dnsmasq-dns-db5dbf9b5-rw74w" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.465965 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/717b8c1b-0b4b-41e1-98a4-826a25eb2ab8-config\") pod 
\"dnsmasq-dns-db5dbf9b5-rw74w\" (UID: \"717b8c1b-0b4b-41e1-98a4-826a25eb2ab8\") " pod="openstack/dnsmasq-dns-db5dbf9b5-rw74w" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.466034 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl6mr\" (UniqueName: \"kubernetes.io/projected/717b8c1b-0b4b-41e1-98a4-826a25eb2ab8-kube-api-access-gl6mr\") pod \"dnsmasq-dns-db5dbf9b5-rw74w\" (UID: \"717b8c1b-0b4b-41e1-98a4-826a25eb2ab8\") " pod="openstack/dnsmasq-dns-db5dbf9b5-rw74w" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.466256 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6686bbb8b9-4nmdf" event={"ID":"ae6b6e98-f5e4-4418-b2c1-105a52da746b","Type":"ContainerStarted","Data":"8c544215b03b132d6159b289112aadbbc64788abf01bdb4479b75f56d6d085b8"} Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.466617 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/dnsmasq-dns-6686bbb8b9-4nmdf" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.480243 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-nx6zg" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.508736 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6686bbb8b9-4nmdf" podStartSLOduration=15.508715266 podStartE2EDuration="15.508715266s" podCreationTimestamp="2026-03-12 17:06:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:07:07.507128516 +0000 UTC m=+970.048439865" watchObservedRunningTime="2026-03-12 17:07:07.508715266 +0000 UTC m=+970.050026605" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.570643 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gl6mr\" (UniqueName: \"kubernetes.io/projected/717b8c1b-0b4b-41e1-98a4-826a25eb2ab8-kube-api-access-gl6mr\") pod \"dnsmasq-dns-db5dbf9b5-rw74w\" (UID: \"717b8c1b-0b4b-41e1-98a4-826a25eb2ab8\") " pod="openstack/dnsmasq-dns-db5dbf9b5-rw74w" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.571266 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/717b8c1b-0b4b-41e1-98a4-826a25eb2ab8-ovsdbserver-nb\") pod \"dnsmasq-dns-db5dbf9b5-rw74w\" (UID: \"717b8c1b-0b4b-41e1-98a4-826a25eb2ab8\") " pod="openstack/dnsmasq-dns-db5dbf9b5-rw74w" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.571458 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/717b8c1b-0b4b-41e1-98a4-826a25eb2ab8-dns-svc\") pod \"dnsmasq-dns-db5dbf9b5-rw74w\" (UID: \"717b8c1b-0b4b-41e1-98a4-826a25eb2ab8\") " pod="openstack/dnsmasq-dns-db5dbf9b5-rw74w" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.572329 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/717b8c1b-0b4b-41e1-98a4-826a25eb2ab8-config\") pod \"dnsmasq-dns-db5dbf9b5-rw74w\" (UID: \"717b8c1b-0b4b-41e1-98a4-826a25eb2ab8\") " pod="openstack/dnsmasq-dns-db5dbf9b5-rw74w" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.572331 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/717b8c1b-0b4b-41e1-98a4-826a25eb2ab8-dns-svc\") pod \"dnsmasq-dns-db5dbf9b5-rw74w\" (UID: \"717b8c1b-0b4b-41e1-98a4-826a25eb2ab8\") " pod="openstack/dnsmasq-dns-db5dbf9b5-rw74w" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.573915 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/717b8c1b-0b4b-41e1-98a4-826a25eb2ab8-config\") pod \"dnsmasq-dns-db5dbf9b5-rw74w\" (UID: \"717b8c1b-0b4b-41e1-98a4-826a25eb2ab8\") " pod="openstack/dnsmasq-dns-db5dbf9b5-rw74w" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.576100 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/717b8c1b-0b4b-41e1-98a4-826a25eb2ab8-ovsdbserver-nb\") pod \"dnsmasq-dns-db5dbf9b5-rw74w\" (UID: \"717b8c1b-0b4b-41e1-98a4-826a25eb2ab8\") " pod="openstack/dnsmasq-dns-db5dbf9b5-rw74w" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.612308 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl6mr\" (UniqueName: \"kubernetes.io/projected/717b8c1b-0b4b-41e1-98a4-826a25eb2ab8-kube-api-access-gl6mr\") pod \"dnsmasq-dns-db5dbf9b5-rw74w\" (UID: \"717b8c1b-0b4b-41e1-98a4-826a25eb2ab8\") " pod="openstack/dnsmasq-dns-db5dbf9b5-rw74w" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.663436 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6686bbb8b9-4nmdf"] Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.699004 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-db5dbf9b5-rw74w" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.711587 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59bc98f85f-g4qsm"] Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.728251 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59bc98f85f-g4qsm"] Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.728360 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59bc98f85f-g4qsm" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.730329 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"ovsdbserver-sb\"" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.776160 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36632b13-d3b1-4c31-864c-c6f0d31cb057-dns-svc\") pod \"dnsmasq-dns-59bc98f85f-g4qsm\" (UID: \"36632b13-d3b1-4c31-864c-c6f0d31cb057\") " pod="openstack/dnsmasq-dns-59bc98f85f-g4qsm" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.776216 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36632b13-d3b1-4c31-864c-c6f0d31cb057-ovsdbserver-nb\") pod \"dnsmasq-dns-59bc98f85f-g4qsm\" (UID: \"36632b13-d3b1-4c31-864c-c6f0d31cb057\") " pod="openstack/dnsmasq-dns-59bc98f85f-g4qsm" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.776254 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/36632b13-d3b1-4c31-864c-c6f0d31cb057-ovsdbserver-sb\") pod \"dnsmasq-dns-59bc98f85f-g4qsm\" (UID: \"36632b13-d3b1-4c31-864c-c6f0d31cb057\") " pod="openstack/dnsmasq-dns-59bc98f85f-g4qsm" Mar 12 17:07:07 crc 
kubenswrapper[5184]: I0312 17:07:07.776284 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzk77\" (UniqueName: \"kubernetes.io/projected/36632b13-d3b1-4c31-864c-c6f0d31cb057-kube-api-access-rzk77\") pod \"dnsmasq-dns-59bc98f85f-g4qsm\" (UID: \"36632b13-d3b1-4c31-864c-c6f0d31cb057\") " pod="openstack/dnsmasq-dns-59bc98f85f-g4qsm" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.776364 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36632b13-d3b1-4c31-864c-c6f0d31cb057-config\") pod \"dnsmasq-dns-59bc98f85f-g4qsm\" (UID: \"36632b13-d3b1-4c31-864c-c6f0d31cb057\") " pod="openstack/dnsmasq-dns-59bc98f85f-g4qsm" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.878141 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36632b13-d3b1-4c31-864c-c6f0d31cb057-config\") pod \"dnsmasq-dns-59bc98f85f-g4qsm\" (UID: \"36632b13-d3b1-4c31-864c-c6f0d31cb057\") " pod="openstack/dnsmasq-dns-59bc98f85f-g4qsm" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.878241 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36632b13-d3b1-4c31-864c-c6f0d31cb057-dns-svc\") pod \"dnsmasq-dns-59bc98f85f-g4qsm\" (UID: \"36632b13-d3b1-4c31-864c-c6f0d31cb057\") " pod="openstack/dnsmasq-dns-59bc98f85f-g4qsm" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.878282 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36632b13-d3b1-4c31-864c-c6f0d31cb057-ovsdbserver-nb\") pod \"dnsmasq-dns-59bc98f85f-g4qsm\" (UID: \"36632b13-d3b1-4c31-864c-c6f0d31cb057\") " pod="openstack/dnsmasq-dns-59bc98f85f-g4qsm" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.878325 5184 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/36632b13-d3b1-4c31-864c-c6f0d31cb057-ovsdbserver-sb\") pod \"dnsmasq-dns-59bc98f85f-g4qsm\" (UID: \"36632b13-d3b1-4c31-864c-c6f0d31cb057\") " pod="openstack/dnsmasq-dns-59bc98f85f-g4qsm" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.878356 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rzk77\" (UniqueName: \"kubernetes.io/projected/36632b13-d3b1-4c31-864c-c6f0d31cb057-kube-api-access-rzk77\") pod \"dnsmasq-dns-59bc98f85f-g4qsm\" (UID: \"36632b13-d3b1-4c31-864c-c6f0d31cb057\") " pod="openstack/dnsmasq-dns-59bc98f85f-g4qsm" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.880782 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36632b13-d3b1-4c31-864c-c6f0d31cb057-dns-svc\") pod \"dnsmasq-dns-59bc98f85f-g4qsm\" (UID: \"36632b13-d3b1-4c31-864c-c6f0d31cb057\") " pod="openstack/dnsmasq-dns-59bc98f85f-g4qsm" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.880780 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36632b13-d3b1-4c31-864c-c6f0d31cb057-config\") pod \"dnsmasq-dns-59bc98f85f-g4qsm\" (UID: \"36632b13-d3b1-4c31-864c-c6f0d31cb057\") " pod="openstack/dnsmasq-dns-59bc98f85f-g4qsm" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.881489 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36632b13-d3b1-4c31-864c-c6f0d31cb057-ovsdbserver-nb\") pod \"dnsmasq-dns-59bc98f85f-g4qsm\" (UID: \"36632b13-d3b1-4c31-864c-c6f0d31cb057\") " pod="openstack/dnsmasq-dns-59bc98f85f-g4qsm" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.882927 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/36632b13-d3b1-4c31-864c-c6f0d31cb057-ovsdbserver-sb\") pod \"dnsmasq-dns-59bc98f85f-g4qsm\" (UID: \"36632b13-d3b1-4c31-864c-c6f0d31cb057\") " pod="openstack/dnsmasq-dns-59bc98f85f-g4qsm" Mar 12 17:07:07 crc kubenswrapper[5184]: I0312 17:07:07.900615 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzk77\" (UniqueName: \"kubernetes.io/projected/36632b13-d3b1-4c31-864c-c6f0d31cb057-kube-api-access-rzk77\") pod \"dnsmasq-dns-59bc98f85f-g4qsm\" (UID: \"36632b13-d3b1-4c31-864c-c6f0d31cb057\") " pod="openstack/dnsmasq-dns-59bc98f85f-g4qsm" Mar 12 17:07:08 crc kubenswrapper[5184]: I0312 17:07:08.056020 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59bc98f85f-g4qsm" Mar 12 17:07:08 crc kubenswrapper[5184]: I0312 17:07:08.418473 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0859b4b6-893b-4163-980f-79c27966ed84" path="/var/lib/kubelet/pods/0859b4b6-893b-4163-980f-79c27966ed84/volumes" Mar 12 17:07:08 crc kubenswrapper[5184]: I0312 17:07:08.419099 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9b38e02-6553-49b5-a966-f266cae0c098" path="/var/lib/kubelet/pods/d9b38e02-6553-49b5-a966-f266cae0c098/volumes" Mar 12 17:07:08 crc kubenswrapper[5184]: I0312 17:07:08.480765 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/dnsmasq-dns-588bd8c8c5-67jsf" podUID="a6cce4a5-d42c-4599-9319-30e850b844f5" containerName="dnsmasq-dns" containerID="cri-o://23c501bb01a1d9530c9bebbac8cdf1029b7f9e84ac59e4f2c52258f70e1429d4" gracePeriod=10 Mar 12 17:07:09 crc kubenswrapper[5184]: W0312 17:07:09.075098 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22558944_a035_4296_855e_53505b918f08.slice/crio-d2b1f023be69a311cdad3424cc84dbe80936f12043857d8c14afd3534a1027a5 WatchSource:0}: Error 
finding container d2b1f023be69a311cdad3424cc84dbe80936f12043857d8c14afd3534a1027a5: Status 404 returned error can't find the container with id d2b1f023be69a311cdad3424cc84dbe80936f12043857d8c14afd3534a1027a5 Mar 12 17:07:09 crc kubenswrapper[5184]: I0312 17:07:09.490185 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"22558944-a035-4296-855e-53505b918f08","Type":"ContainerStarted","Data":"d2b1f023be69a311cdad3424cc84dbe80936f12043857d8c14afd3534a1027a5"} Mar 12 17:07:09 crc kubenswrapper[5184]: I0312 17:07:09.491980 5184 generic.go:358] "Generic (PLEG): container finished" podID="a6cce4a5-d42c-4599-9319-30e850b844f5" containerID="23c501bb01a1d9530c9bebbac8cdf1029b7f9e84ac59e4f2c52258f70e1429d4" exitCode=0 Mar 12 17:07:09 crc kubenswrapper[5184]: I0312 17:07:09.492068 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-588bd8c8c5-67jsf" event={"ID":"a6cce4a5-d42c-4599-9319-30e850b844f5","Type":"ContainerDied","Data":"23c501bb01a1d9530c9bebbac8cdf1029b7f9e84ac59e4f2c52258f70e1429d4"} Mar 12 17:07:09 crc kubenswrapper[5184]: I0312 17:07:09.492367 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6686bbb8b9-4nmdf" podUID="ae6b6e98-f5e4-4418-b2c1-105a52da746b" containerName="dnsmasq-dns" containerID="cri-o://8c544215b03b132d6159b289112aadbbc64788abf01bdb4479b75f56d6d085b8" gracePeriod=10 Mar 12 17:07:10 crc kubenswrapper[5184]: I0312 17:07:10.499910 5184 generic.go:358] "Generic (PLEG): container finished" podID="ae6b6e98-f5e4-4418-b2c1-105a52da746b" containerID="8c544215b03b132d6159b289112aadbbc64788abf01bdb4479b75f56d6d085b8" exitCode=0 Mar 12 17:07:10 crc kubenswrapper[5184]: I0312 17:07:10.500096 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6686bbb8b9-4nmdf" 
event={"ID":"ae6b6e98-f5e4-4418-b2c1-105a52da746b","Type":"ContainerDied","Data":"8c544215b03b132d6159b289112aadbbc64788abf01bdb4479b75f56d6d085b8"} Mar 12 17:07:12 crc kubenswrapper[5184]: I0312 17:07:12.523182 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-588bd8c8c5-67jsf" event={"ID":"a6cce4a5-d42c-4599-9319-30e850b844f5","Type":"ContainerDied","Data":"f57efa67c4449e962e273e0c64283dd360c0d66431bf7b7c50f1e42cd503c9b4"} Mar 12 17:07:12 crc kubenswrapper[5184]: I0312 17:07:12.523529 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f57efa67c4449e962e273e0c64283dd360c0d66431bf7b7c50f1e42cd503c9b4" Mar 12 17:07:12 crc kubenswrapper[5184]: I0312 17:07:12.612804 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-588bd8c8c5-67jsf" Mar 12 17:07:12 crc kubenswrapper[5184]: I0312 17:07:12.769151 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt6hg\" (UniqueName: \"kubernetes.io/projected/a6cce4a5-d42c-4599-9319-30e850b844f5-kube-api-access-xt6hg\") pod \"a6cce4a5-d42c-4599-9319-30e850b844f5\" (UID: \"a6cce4a5-d42c-4599-9319-30e850b844f5\") " Mar 12 17:07:12 crc kubenswrapper[5184]: I0312 17:07:12.769628 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6cce4a5-d42c-4599-9319-30e850b844f5-config\") pod \"a6cce4a5-d42c-4599-9319-30e850b844f5\" (UID: \"a6cce4a5-d42c-4599-9319-30e850b844f5\") " Mar 12 17:07:12 crc kubenswrapper[5184]: I0312 17:07:12.769664 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6cce4a5-d42c-4599-9319-30e850b844f5-dns-svc\") pod \"a6cce4a5-d42c-4599-9319-30e850b844f5\" (UID: \"a6cce4a5-d42c-4599-9319-30e850b844f5\") " Mar 12 17:07:12 crc kubenswrapper[5184]: I0312 17:07:12.775640 5184 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6cce4a5-d42c-4599-9319-30e850b844f5-kube-api-access-xt6hg" (OuterVolumeSpecName: "kube-api-access-xt6hg") pod "a6cce4a5-d42c-4599-9319-30e850b844f5" (UID: "a6cce4a5-d42c-4599-9319-30e850b844f5"). InnerVolumeSpecName "kube-api-access-xt6hg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:07:12 crc kubenswrapper[5184]: E0312 17:07:12.823030 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a6cce4a5-d42c-4599-9319-30e850b844f5-dns-svc podName:a6cce4a5-d42c-4599-9319-30e850b844f5 nodeName:}" failed. No retries permitted until 2026-03-12 17:07:13.322991784 +0000 UTC m=+975.864303133 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "dns-svc" (UniqueName: "kubernetes.io/configmap/a6cce4a5-d42c-4599-9319-30e850b844f5-dns-svc") pod "a6cce4a5-d42c-4599-9319-30e850b844f5" (UID: "a6cce4a5-d42c-4599-9319-30e850b844f5") : error deleting /var/lib/kubelet/pods/a6cce4a5-d42c-4599-9319-30e850b844f5/volume-subpaths: remove /var/lib/kubelet/pods/a6cce4a5-d42c-4599-9319-30e850b844f5/volume-subpaths: no such file or directory Mar 12 17:07:12 crc kubenswrapper[5184]: I0312 17:07:12.823082 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6cce4a5-d42c-4599-9319-30e850b844f5-config" (OuterVolumeSpecName: "config") pod "a6cce4a5-d42c-4599-9319-30e850b844f5" (UID: "a6cce4a5-d42c-4599-9319-30e850b844f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:07:12 crc kubenswrapper[5184]: I0312 17:07:12.871796 5184 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6cce4a5-d42c-4599-9319-30e850b844f5-config\") on node \"crc\" DevicePath \"\"" Mar 12 17:07:12 crc kubenswrapper[5184]: I0312 17:07:12.871832 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xt6hg\" (UniqueName: \"kubernetes.io/projected/a6cce4a5-d42c-4599-9319-30e850b844f5-kube-api-access-xt6hg\") on node \"crc\" DevicePath \"\"" Mar 12 17:07:13 crc kubenswrapper[5184]: I0312 17:07:13.266022 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6686bbb8b9-4nmdf" Mar 12 17:07:13 crc kubenswrapper[5184]: I0312 17:07:13.378421 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlvsr\" (UniqueName: \"kubernetes.io/projected/ae6b6e98-f5e4-4418-b2c1-105a52da746b-kube-api-access-hlvsr\") pod \"ae6b6e98-f5e4-4418-b2c1-105a52da746b\" (UID: \"ae6b6e98-f5e4-4418-b2c1-105a52da746b\") " Mar 12 17:07:13 crc kubenswrapper[5184]: I0312 17:07:13.378462 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae6b6e98-f5e4-4418-b2c1-105a52da746b-config\") pod \"ae6b6e98-f5e4-4418-b2c1-105a52da746b\" (UID: \"ae6b6e98-f5e4-4418-b2c1-105a52da746b\") " Mar 12 17:07:13 crc kubenswrapper[5184]: I0312 17:07:13.378492 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae6b6e98-f5e4-4418-b2c1-105a52da746b-dns-svc\") pod \"ae6b6e98-f5e4-4418-b2c1-105a52da746b\" (UID: \"ae6b6e98-f5e4-4418-b2c1-105a52da746b\") " Mar 12 17:07:13 crc kubenswrapper[5184]: I0312 17:07:13.378676 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/a6cce4a5-d42c-4599-9319-30e850b844f5-dns-svc\") pod \"a6cce4a5-d42c-4599-9319-30e850b844f5\" (UID: \"a6cce4a5-d42c-4599-9319-30e850b844f5\") " Mar 12 17:07:13 crc kubenswrapper[5184]: I0312 17:07:13.379533 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6cce4a5-d42c-4599-9319-30e850b844f5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a6cce4a5-d42c-4599-9319-30e850b844f5" (UID: "a6cce4a5-d42c-4599-9319-30e850b844f5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:07:13 crc kubenswrapper[5184]: I0312 17:07:13.385321 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae6b6e98-f5e4-4418-b2c1-105a52da746b-kube-api-access-hlvsr" (OuterVolumeSpecName: "kube-api-access-hlvsr") pod "ae6b6e98-f5e4-4418-b2c1-105a52da746b" (UID: "ae6b6e98-f5e4-4418-b2c1-105a52da746b"). InnerVolumeSpecName "kube-api-access-hlvsr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:07:13 crc kubenswrapper[5184]: I0312 17:07:13.426240 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae6b6e98-f5e4-4418-b2c1-105a52da746b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ae6b6e98-f5e4-4418-b2c1-105a52da746b" (UID: "ae6b6e98-f5e4-4418-b2c1-105a52da746b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:07:13 crc kubenswrapper[5184]: I0312 17:07:13.433892 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae6b6e98-f5e4-4418-b2c1-105a52da746b-config" (OuterVolumeSpecName: "config") pod "ae6b6e98-f5e4-4418-b2c1-105a52da746b" (UID: "ae6b6e98-f5e4-4418-b2c1-105a52da746b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:07:13 crc kubenswrapper[5184]: I0312 17:07:13.483177 5184 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6cce4a5-d42c-4599-9319-30e850b844f5-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 17:07:13 crc kubenswrapper[5184]: I0312 17:07:13.483205 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hlvsr\" (UniqueName: \"kubernetes.io/projected/ae6b6e98-f5e4-4418-b2c1-105a52da746b-kube-api-access-hlvsr\") on node \"crc\" DevicePath \"\"" Mar 12 17:07:13 crc kubenswrapper[5184]: I0312 17:07:13.483215 5184 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae6b6e98-f5e4-4418-b2c1-105a52da746b-config\") on node \"crc\" DevicePath \"\"" Mar 12 17:07:13 crc kubenswrapper[5184]: I0312 17:07:13.483223 5184 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae6b6e98-f5e4-4418-b2c1-105a52da746b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 17:07:13 crc kubenswrapper[5184]: I0312 17:07:13.537223 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6686bbb8b9-4nmdf" Mar 12 17:07:13 crc kubenswrapper[5184]: I0312 17:07:13.537267 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6686bbb8b9-4nmdf" event={"ID":"ae6b6e98-f5e4-4418-b2c1-105a52da746b","Type":"ContainerDied","Data":"43944b0ad5fcaa8b4f82bff763f805b25ed4873ab805e4828de57368f7e4ee04"} Mar 12 17:07:13 crc kubenswrapper[5184]: I0312 17:07:13.537342 5184 scope.go:117] "RemoveContainer" containerID="8c544215b03b132d6159b289112aadbbc64788abf01bdb4479b75f56d6d085b8" Mar 12 17:07:13 crc kubenswrapper[5184]: I0312 17:07:13.537461 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-588bd8c8c5-67jsf" Mar 12 17:07:13 crc kubenswrapper[5184]: I0312 17:07:13.588294 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6686bbb8b9-4nmdf"] Mar 12 17:07:13 crc kubenswrapper[5184]: I0312 17:07:13.598680 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6686bbb8b9-4nmdf"] Mar 12 17:07:13 crc kubenswrapper[5184]: I0312 17:07:13.604457 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-588bd8c8c5-67jsf"] Mar 12 17:07:13 crc kubenswrapper[5184]: I0312 17:07:13.625030 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-588bd8c8c5-67jsf"] Mar 12 17:07:13 crc kubenswrapper[5184]: I0312 17:07:13.635771 5184 scope.go:117] "RemoveContainer" containerID="86cff2284fbea040567b46470ff9b9b785af0fa47b30e6fd9a9ef95780cdd4de" Mar 12 17:07:13 crc kubenswrapper[5184]: I0312 17:07:13.766887 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-nx6zg"] Mar 12 17:07:13 crc kubenswrapper[5184]: W0312 17:07:13.831447 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafd5d8ed_916e_4ba1_bbe8_bcc7989cdbe8.slice/crio-5e688a688daeea02d5a7d2e81758655bc672d0f7bfa3c2b867c8ce862d7408fa WatchSource:0}: Error finding container 5e688a688daeea02d5a7d2e81758655bc672d0f7bfa3c2b867c8ce862d7408fa: Status 404 returned error can't find the container with id 5e688a688daeea02d5a7d2e81758655bc672d0f7bfa3c2b867c8ce862d7408fa Mar 12 17:07:13 crc kubenswrapper[5184]: I0312 17:07:13.849808 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59bc98f85f-g4qsm"] Mar 12 17:07:13 crc kubenswrapper[5184]: I0312 17:07:13.992005 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-db5dbf9b5-rw74w"] Mar 12 17:07:14 crc kubenswrapper[5184]: W0312 17:07:14.020666 
5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod717b8c1b_0b4b_41e1_98a4_826a25eb2ab8.slice/crio-9ea8dc2757f6787b65597c4400a13ad44bfe16b8f78a96684cf48633287eccb8 WatchSource:0}: Error finding container 9ea8dc2757f6787b65597c4400a13ad44bfe16b8f78a96684cf48633287eccb8: Status 404 returned error can't find the container with id 9ea8dc2757f6787b65597c4400a13ad44bfe16b8f78a96684cf48633287eccb8 Mar 12 17:07:14 crc kubenswrapper[5184]: I0312 17:07:14.412866 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6cce4a5-d42c-4599-9319-30e850b844f5" path="/var/lib/kubelet/pods/a6cce4a5-d42c-4599-9319-30e850b844f5/volumes" Mar 12 17:07:14 crc kubenswrapper[5184]: I0312 17:07:14.414081 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae6b6e98-f5e4-4418-b2c1-105a52da746b" path="/var/lib/kubelet/pods/ae6b6e98-f5e4-4418-b2c1-105a52da746b/volumes" Mar 12 17:07:14 crc kubenswrapper[5184]: I0312 17:07:14.549585 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db5dbf9b5-rw74w" event={"ID":"717b8c1b-0b4b-41e1-98a4-826a25eb2ab8","Type":"ContainerStarted","Data":"9ea8dc2757f6787b65597c4400a13ad44bfe16b8f78a96684cf48633287eccb8"} Mar 12 17:07:14 crc kubenswrapper[5184]: I0312 17:07:14.552267 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-vp7v2" event={"ID":"303ab57c-305d-48c2-a789-7a124144968d","Type":"ContainerStarted","Data":"e576ca830e4881a1ef7ba33e7a5d1a80325532cf4a2602f3e44130eddc2e78af"} Mar 12 17:07:14 crc kubenswrapper[5184]: I0312 17:07:14.557236 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59bc98f85f-g4qsm" event={"ID":"36632b13-d3b1-4c31-864c-c6f0d31cb057","Type":"ContainerStarted","Data":"4e1bbee7ea00f2c249b02abd4029ae9ec8fb5be4a223422fdccfe243eb0bdec8"} Mar 12 17:07:14 crc kubenswrapper[5184]: I0312 17:07:14.560357 5184 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-nx6zg" event={"ID":"afd5d8ed-916e-4ba1-bbe8-bcc7989cdbe8","Type":"ContainerStarted","Data":"5e688a688daeea02d5a7d2e81758655bc672d0f7bfa3c2b867c8ce862d7408fa"} Mar 12 17:07:15 crc kubenswrapper[5184]: I0312 17:07:15.568743 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dq7bv" event={"ID":"d5a0c031-5c42-4559-96f2-82b75e70b804","Type":"ContainerStarted","Data":"20477feb5dea7c21ac927a892474ecc027ed0c91ee3558bbd7075522ee12c568"} Mar 12 17:07:15 crc kubenswrapper[5184]: I0312 17:07:15.569061 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/ovn-controller-dq7bv" Mar 12 17:07:15 crc kubenswrapper[5184]: I0312 17:07:15.571888 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"043ee884-91ea-43b8-8b26-c8e85e3df303","Type":"ContainerStarted","Data":"488afa1048fd67c3ff8e476361b1821c18434cc58febb4e21314551628514ae5"} Mar 12 17:07:15 crc kubenswrapper[5184]: I0312 17:07:15.573471 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"7755768b-45c3-4cab-be56-d9be437d70d1","Type":"ContainerStarted","Data":"19ef36f0f6b0bc8f376010507e8fa705ebb700dc5fc6aa6d071c43c2e24a3f9f"} Mar 12 17:07:15 crc kubenswrapper[5184]: I0312 17:07:15.576011 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"22558944-a035-4296-855e-53505b918f08","Type":"ContainerStarted","Data":"2af3c0cc42d19d6eebf80583b30a7c220bcd0746f0fd630dbad68d69c3a4b61d"} Mar 12 17:07:15 crc kubenswrapper[5184]: I0312 17:07:15.577827 5184 generic.go:358] "Generic (PLEG): container finished" podID="303ab57c-305d-48c2-a789-7a124144968d" containerID="e576ca830e4881a1ef7ba33e7a5d1a80325532cf4a2602f3e44130eddc2e78af" exitCode=0 Mar 12 17:07:15 crc kubenswrapper[5184]: I0312 17:07:15.578021 5184 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openstack/ovn-controller-ovs-vp7v2" event={"ID":"303ab57c-305d-48c2-a789-7a124144968d","Type":"ContainerDied","Data":"e576ca830e4881a1ef7ba33e7a5d1a80325532cf4a2602f3e44130eddc2e78af"}
Mar 12 17:07:15 crc kubenswrapper[5184]: I0312 17:07:15.579869 5184 generic.go:358] "Generic (PLEG): container finished" podID="36632b13-d3b1-4c31-864c-c6f0d31cb057" containerID="84e23f44a7e6a34d784b835173fc27b8f3b21ed3f32e23ed869f65d599a912d6" exitCode=0
Mar 12 17:07:15 crc kubenswrapper[5184]: I0312 17:07:15.579973 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59bc98f85f-g4qsm" event={"ID":"36632b13-d3b1-4c31-864c-c6f0d31cb057","Type":"ContainerDied","Data":"84e23f44a7e6a34d784b835173fc27b8f3b21ed3f32e23ed869f65d599a912d6"}
Mar 12 17:07:15 crc kubenswrapper[5184]: I0312 17:07:15.587747 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9c743985-027b-46df-8a0d-5a246406a2d3","Type":"ContainerStarted","Data":"045305afce74f51e367cbe374258c2e61f89ba3b2d1bf5cc2bd897779a7ee4c1"}
Mar 12 17:07:15 crc kubenswrapper[5184]: I0312 17:07:15.597687 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db5dbf9b5-rw74w" event={"ID":"717b8c1b-0b4b-41e1-98a4-826a25eb2ab8","Type":"ContainerStarted","Data":"2aed4412ae6f9ee9957c4afe752721872b3d53747cb49a7218ed1edbc84526eb"}
Mar 12 17:07:15 crc kubenswrapper[5184]: I0312 17:07:15.597698 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-dq7bv" podStartSLOduration=6.551372129 podStartE2EDuration="14.597675113s" podCreationTimestamp="2026-03-12 17:07:01 +0000 UTC" firstStartedPulling="2026-03-12 17:07:05.590194349 +0000 UTC m=+968.131505688" lastFinishedPulling="2026-03-12 17:07:13.636497333 +0000 UTC m=+976.177808672" observedRunningTime="2026-03-12 17:07:15.591755967 +0000 UTC m=+978.133067326" watchObservedRunningTime="2026-03-12 17:07:15.597675113 +0000 UTC m=+978.138986452"
Mar 12 17:07:15 crc kubenswrapper[5184]: I0312 17:07:15.600324 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"37dd5ca0-dd94-458b-93c2-393f9c4db4b7","Type":"ContainerStarted","Data":"8c920ffc50cc10a5d5ef5722f55a325a8cdb6bda7229ef5be374ce4cbb95f882"}
Mar 12 17:07:15 crc kubenswrapper[5184]: I0312 17:07:15.602544 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"01a0600d-d61f-4822-a177-fbe86d075f38","Type":"ContainerStarted","Data":"1a5a0ee3092bf81f4071c5ef3664da4150e8655c7fb6ac6491b706bf6bfaa68a"}
Mar 12 17:07:15 crc kubenswrapper[5184]: I0312 17:07:15.603110 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/memcached-0"
Mar 12 17:07:15 crc kubenswrapper[5184]: I0312 17:07:15.718922 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=11.94562178 podStartE2EDuration="19.718905139s" podCreationTimestamp="2026-03-12 17:06:56 +0000 UTC" firstStartedPulling="2026-03-12 17:07:05.620991279 +0000 UTC m=+968.162302618" lastFinishedPulling="2026-03-12 17:07:13.394274638 +0000 UTC m=+975.935585977" observedRunningTime="2026-03-12 17:07:15.718715613 +0000 UTC m=+978.260026972" watchObservedRunningTime="2026-03-12 17:07:15.718905139 +0000 UTC m=+978.260216478"
Mar 12 17:07:15 crc kubenswrapper[5184]: I0312 17:07:15.739276 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=8.779249152 podStartE2EDuration="17.73925185s" podCreationTimestamp="2026-03-12 17:06:58 +0000 UTC" firstStartedPulling="2026-03-12 17:07:05.587205165 +0000 UTC m=+968.128516494" lastFinishedPulling="2026-03-12 17:07:14.547207843 +0000 UTC m=+977.088519192" observedRunningTime="2026-03-12 17:07:15.732163857 +0000 UTC m=+978.273475206" watchObservedRunningTime="2026-03-12 17:07:15.73925185 +0000 UTC m=+978.280563179"
Mar 12 17:07:16 crc kubenswrapper[5184]: I0312 17:07:16.617401 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-vp7v2" event={"ID":"303ab57c-305d-48c2-a789-7a124144968d","Type":"ContainerStarted","Data":"d75cd22abc889b154d62700d45196d5f1287c5e57471cdadcc2f01d9909ce76a"}
Mar 12 17:07:16 crc kubenswrapper[5184]: I0312 17:07:16.619866 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59bc98f85f-g4qsm" event={"ID":"36632b13-d3b1-4c31-864c-c6f0d31cb057","Type":"ContainerStarted","Data":"aef45cb057edf0fa8580f8416f9c96968ac3aabae1d01ef2630e1f91c9cff8bd"}
Mar 12 17:07:16 crc kubenswrapper[5184]: I0312 17:07:16.620446 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/dnsmasq-dns-59bc98f85f-g4qsm"
Mar 12 17:07:16 crc kubenswrapper[5184]: I0312 17:07:16.622285 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"56b9c26f-b490-4262-9c35-63ee5734c634","Type":"ContainerStarted","Data":"5946e7c77319bc4303becfa27068470c096cf79bfacc2db1f23178951578ef22"}
Mar 12 17:07:16 crc kubenswrapper[5184]: I0312 17:07:16.624920 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"53e57ab8-13e6-4505-a905-412d3ef88083","Type":"ContainerStarted","Data":"1c500261ee65047d0f5f54d5420bdfc27224456b856c0006d4d1e69acb2ed464"}
Mar 12 17:07:16 crc kubenswrapper[5184]: I0312 17:07:16.626309 5184 generic.go:358] "Generic (PLEG): container finished" podID="717b8c1b-0b4b-41e1-98a4-826a25eb2ab8" containerID="2aed4412ae6f9ee9957c4afe752721872b3d53747cb49a7218ed1edbc84526eb" exitCode=0
Mar 12 17:07:16 crc kubenswrapper[5184]: I0312 17:07:16.626477 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db5dbf9b5-rw74w" event={"ID":"717b8c1b-0b4b-41e1-98a4-826a25eb2ab8","Type":"ContainerDied","Data":"2aed4412ae6f9ee9957c4afe752721872b3d53747cb49a7218ed1edbc84526eb"}
Mar 12 17:07:16 crc kubenswrapper[5184]: I0312 17:07:16.627082 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/kube-state-metrics-0"
Mar 12 17:07:16 crc kubenswrapper[5184]: I0312 17:07:16.676584 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59bc98f85f-g4qsm" podStartSLOduration=9.676565867 podStartE2EDuration="9.676565867s" podCreationTimestamp="2026-03-12 17:07:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:07:16.640727319 +0000 UTC m=+979.182038658" watchObservedRunningTime="2026-03-12 17:07:16.676565867 +0000 UTC m=+979.217877206"
Mar 12 17:07:18 crc kubenswrapper[5184]: I0312 17:07:18.645919 5184 generic.go:358] "Generic (PLEG): container finished" podID="043ee884-91ea-43b8-8b26-c8e85e3df303" containerID="488afa1048fd67c3ff8e476361b1821c18434cc58febb4e21314551628514ae5" exitCode=0
Mar 12 17:07:18 crc kubenswrapper[5184]: I0312 17:07:18.645991 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"043ee884-91ea-43b8-8b26-c8e85e3df303","Type":"ContainerDied","Data":"488afa1048fd67c3ff8e476361b1821c18434cc58febb4e21314551628514ae5"}
Mar 12 17:07:20 crc kubenswrapper[5184]: I0312 17:07:20.672921 5184 generic.go:358] "Generic (PLEG): container finished" podID="9c743985-027b-46df-8a0d-5a246406a2d3" containerID="045305afce74f51e367cbe374258c2e61f89ba3b2d1bf5cc2bd897779a7ee4c1" exitCode=0
Mar 12 17:07:20 crc kubenswrapper[5184]: I0312 17:07:20.674309 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9c743985-027b-46df-8a0d-5a246406a2d3","Type":"ContainerDied","Data":"045305afce74f51e367cbe374258c2e61f89ba3b2d1bf5cc2bd897779a7ee4c1"}
Mar 12 17:07:22 crc kubenswrapper[5184]: I0312 17:07:22.637697 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Mar 12 17:07:22 crc kubenswrapper[5184]: I0312 17:07:22.638762 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59bc98f85f-g4qsm"
Mar 12 17:07:22 crc kubenswrapper[5184]: I0312 17:07:22.697868 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9c743985-027b-46df-8a0d-5a246406a2d3","Type":"ContainerStarted","Data":"b0f4878fbf3e84206464935257be824ba7be7b3bb407c5d99abf79a1db6be43e"}
Mar 12 17:07:22 crc kubenswrapper[5184]: I0312 17:07:22.705756 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db5dbf9b5-rw74w" event={"ID":"717b8c1b-0b4b-41e1-98a4-826a25eb2ab8","Type":"ContainerStarted","Data":"e4c1ef784b11150a30d96a9fe67ec1606452a4bccfa41b082732d632bbc99d4d"}
Mar 12 17:07:22 crc kubenswrapper[5184]: I0312 17:07:22.705794 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/dnsmasq-dns-db5dbf9b5-rw74w"
Mar 12 17:07:22 crc kubenswrapper[5184]: I0312 17:07:22.707825 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"043ee884-91ea-43b8-8b26-c8e85e3df303","Type":"ContainerStarted","Data":"3d02aeaca8035cc2d0031bbb45bfe6e46f3cdf9467fee80c16f9593acaf7c169"}
Mar 12 17:07:22 crc kubenswrapper[5184]: I0312 17:07:22.710623 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"7755768b-45c3-4cab-be56-d9be437d70d1","Type":"ContainerStarted","Data":"e1de0a7322fc4efbee4438de508633e320ba0c69e080708ae9400fc6bc05d012"}
Mar 12 17:07:22 crc kubenswrapper[5184]: I0312 17:07:22.712836 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"22558944-a035-4296-855e-53505b918f08","Type":"ContainerStarted","Data":"7bc938ae75f29ef609a05e9a461e5c339864d5b848c8f5bcc5d507327d5f6026"}
Mar 12 17:07:22 crc kubenswrapper[5184]: I0312 17:07:22.715210 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-vp7v2" event={"ID":"303ab57c-305d-48c2-a789-7a124144968d","Type":"ContainerStarted","Data":"414974facf42ea4419e243a98bd959a63e46915fcecd0cf91c2441cb6870bf87"}
Mar 12 17:07:22 crc kubenswrapper[5184]: I0312 17:07:22.715650 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/ovn-controller-ovs-vp7v2"
Mar 12 17:07:22 crc kubenswrapper[5184]: I0312 17:07:22.715677 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/ovn-controller-ovs-vp7v2"
Mar 12 17:07:22 crc kubenswrapper[5184]: I0312 17:07:22.729329 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-db5dbf9b5-rw74w"]
Mar 12 17:07:22 crc kubenswrapper[5184]: I0312 17:07:22.753568 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=20.211548156 podStartE2EDuration="28.753543224s" podCreationTimestamp="2026-03-12 17:06:54 +0000 UTC" firstStartedPulling="2026-03-12 17:07:05.083797808 +0000 UTC m=+967.625109137" lastFinishedPulling="2026-03-12 17:07:13.625792866 +0000 UTC m=+976.167104205" observedRunningTime="2026-03-12 17:07:22.747806854 +0000 UTC m=+985.289118193" watchObservedRunningTime="2026-03-12 17:07:22.753543224 +0000 UTC m=+985.294854563"
Mar 12 17:07:22 crc kubenswrapper[5184]: I0312 17:07:22.782982 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=5.091992227 podStartE2EDuration="21.782965561s" podCreationTimestamp="2026-03-12 17:07:01 +0000 UTC" firstStartedPulling="2026-03-12 17:07:05.698681864 +0000 UTC m=+968.239993203" lastFinishedPulling="2026-03-12 17:07:22.389655188 +0000 UTC m=+984.930966537" observedRunningTime="2026-03-12 17:07:22.781256607 +0000 UTC m=+985.322567976" watchObservedRunningTime="2026-03-12 17:07:22.782965561 +0000 UTC m=+985.324276900"
Mar 12 17:07:22 crc kubenswrapper[5184]: I0312 17:07:22.808110 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-vp7v2" podStartSLOduration=15.204126925 podStartE2EDuration="21.808089192s" podCreationTimestamp="2026-03-12 17:07:01 +0000 UTC" firstStartedPulling="2026-03-12 17:07:06.906493748 +0000 UTC m=+969.447805087" lastFinishedPulling="2026-03-12 17:07:13.510456015 +0000 UTC m=+976.051767354" observedRunningTime="2026-03-12 17:07:22.804479848 +0000 UTC m=+985.345791197" watchObservedRunningTime="2026-03-12 17:07:22.808089192 +0000 UTC m=+985.349400531"
Mar 12 17:07:22 crc kubenswrapper[5184]: I0312 17:07:22.830047 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=21.567411858 podStartE2EDuration="29.830023272s" podCreationTimestamp="2026-03-12 17:06:53 +0000 UTC" firstStartedPulling="2026-03-12 17:07:05.084004764 +0000 UTC m=+967.625316103" lastFinishedPulling="2026-03-12 17:07:13.346616178 +0000 UTC m=+975.887927517" observedRunningTime="2026-03-12 17:07:22.82773066 +0000 UTC m=+985.369041999" watchObservedRunningTime="2026-03-12 17:07:22.830023272 +0000 UTC m=+985.371334611"
Mar 12 17:07:22 crc kubenswrapper[5184]: I0312 17:07:22.855349 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=5.571976988 podStartE2EDuration="18.855323888s" podCreationTimestamp="2026-03-12 17:07:04 +0000 UTC" firstStartedPulling="2026-03-12 17:07:09.087215948 +0000 UTC m=+971.628527287" lastFinishedPulling="2026-03-12 17:07:22.370562848 +0000 UTC m=+984.911874187" observedRunningTime="2026-03-12 17:07:22.848661139 +0000 UTC m=+985.389972478" watchObservedRunningTime="2026-03-12 17:07:22.855323888 +0000 UTC m=+985.396635227"
Mar 12 17:07:22 crc kubenswrapper[5184]: I0312 17:07:22.885003 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-db5dbf9b5-rw74w" podStartSLOduration=15.884955792 podStartE2EDuration="15.884955792s" podCreationTimestamp="2026-03-12 17:07:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:07:22.873029096 +0000 UTC m=+985.414340445" watchObservedRunningTime="2026-03-12 17:07:22.884955792 +0000 UTC m=+985.426267141"
Mar 12 17:07:23 crc kubenswrapper[5184]: I0312 17:07:23.575626 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Mar 12 17:07:23 crc kubenswrapper[5184]: I0312 17:07:23.617749 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Mar 12 17:07:23 crc kubenswrapper[5184]: I0312 17:07:23.728795 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-nx6zg" event={"ID":"afd5d8ed-916e-4ba1-bbe8-bcc7989cdbe8","Type":"ContainerStarted","Data":"52de14c0bc8330574d0d781986ee989d5e5f7f664be2c29d34bf042863d9966d"}
Mar 12 17:07:23 crc kubenswrapper[5184]: I0312 17:07:23.729915 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/ovsdbserver-nb-0"
Mar 12 17:07:23 crc kubenswrapper[5184]: I0312 17:07:23.753296 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-nx6zg" podStartSLOduration=8.197698831 podStartE2EDuration="16.753273626s" podCreationTimestamp="2026-03-12 17:07:07 +0000 UTC" firstStartedPulling="2026-03-12 17:07:13.834095434 +0000 UTC m=+976.375406773" lastFinishedPulling="2026-03-12 17:07:22.389670229 +0000 UTC m=+984.930981568" observedRunningTime="2026-03-12 17:07:23.747423992 +0000 UTC m=+986.288735341" watchObservedRunningTime="2026-03-12 17:07:23.753273626 +0000 UTC m=+986.294584965"
Mar 12 17:07:23 crc kubenswrapper[5184]: I0312 17:07:23.778893 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Mar 12 17:07:24 crc kubenswrapper[5184]: I0312 17:07:24.270683 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Mar 12 17:07:24 crc kubenswrapper[5184]: I0312 17:07:24.307536 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Mar 12 17:07:24 crc kubenswrapper[5184]: I0312 17:07:24.738947 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/dnsmasq-dns-db5dbf9b5-rw74w" podUID="717b8c1b-0b4b-41e1-98a4-826a25eb2ab8" containerName="dnsmasq-dns" containerID="cri-o://e4c1ef784b11150a30d96a9fe67ec1606452a4bccfa41b082732d632bbc99d4d" gracePeriod=10
Mar 12 17:07:24 crc kubenswrapper[5184]: I0312 17:07:24.738956 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/ovsdbserver-sb-0"
Mar 12 17:07:24 crc kubenswrapper[5184]: I0312 17:07:24.809221 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Mar 12 17:07:24 crc kubenswrapper[5184]: I0312 17:07:24.990438 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Mar 12 17:07:24 crc kubenswrapper[5184]: I0312 17:07:24.998542 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ae6b6e98-f5e4-4418-b2c1-105a52da746b" containerName="init"
Mar 12 17:07:24 crc kubenswrapper[5184]: I0312 17:07:24.998575 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae6b6e98-f5e4-4418-b2c1-105a52da746b" containerName="init"
Mar 12 17:07:24 crc kubenswrapper[5184]: I0312 17:07:24.998587 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ae6b6e98-f5e4-4418-b2c1-105a52da746b" containerName="dnsmasq-dns"
Mar 12 17:07:24 crc kubenswrapper[5184]: I0312 17:07:24.998593 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae6b6e98-f5e4-4418-b2c1-105a52da746b" containerName="dnsmasq-dns"
Mar 12 17:07:24 crc kubenswrapper[5184]: I0312 17:07:24.998607 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a6cce4a5-d42c-4599-9319-30e850b844f5" containerName="init"
Mar 12 17:07:24 crc kubenswrapper[5184]: I0312 17:07:24.998613 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6cce4a5-d42c-4599-9319-30e850b844f5" containerName="init"
Mar 12 17:07:24 crc kubenswrapper[5184]: I0312 17:07:24.998622 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a6cce4a5-d42c-4599-9319-30e850b844f5" containerName="dnsmasq-dns"
Mar 12 17:07:24 crc kubenswrapper[5184]: I0312 17:07:24.998628 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6cce4a5-d42c-4599-9319-30e850b844f5" containerName="dnsmasq-dns"
Mar 12 17:07:24 crc kubenswrapper[5184]: I0312 17:07:24.998781 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="a6cce4a5-d42c-4599-9319-30e850b844f5" containerName="dnsmasq-dns"
Mar 12 17:07:24 crc kubenswrapper[5184]: I0312 17:07:24.998802 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="ae6b6e98-f5e4-4418-b2c1-105a52da746b" containerName="dnsmasq-dns"
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.003623 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.003742 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.008427 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"ovnnorthd-config\""
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.008914 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-ovnnorthd-ovndbs\""
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.009080 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"ovnnorthd-scripts\""
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.009272 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ovnnorthd-ovnnorthd-dockercfg-2vfzh\""
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.027517 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/openstack-galera-0"
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.028583 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.117325 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a455b874-8c77-4293-be4f-4379f4fecf49-scripts\") pod \"ovn-northd-0\" (UID: \"a455b874-8c77-4293-be4f-4379f4fecf49\") " pod="openstack/ovn-northd-0"
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.117469 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a455b874-8c77-4293-be4f-4379f4fecf49-config\") pod \"ovn-northd-0\" (UID: \"a455b874-8c77-4293-be4f-4379f4fecf49\") " pod="openstack/ovn-northd-0"
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.117517 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/a455b874-8c77-4293-be4f-4379f4fecf49-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"a455b874-8c77-4293-be4f-4379f4fecf49\") " pod="openstack/ovn-northd-0"
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.117607 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a455b874-8c77-4293-be4f-4379f4fecf49-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"a455b874-8c77-4293-be4f-4379f4fecf49\") " pod="openstack/ovn-northd-0"
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.117690 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a455b874-8c77-4293-be4f-4379f4fecf49-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"a455b874-8c77-4293-be4f-4379f4fecf49\") " pod="openstack/ovn-northd-0"
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.117826 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a455b874-8c77-4293-be4f-4379f4fecf49-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"a455b874-8c77-4293-be4f-4379f4fecf49\") " pod="openstack/ovn-northd-0"
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.117991 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrxdp\" (UniqueName: \"kubernetes.io/projected/a455b874-8c77-4293-be4f-4379f4fecf49-kube-api-access-zrxdp\") pod \"ovn-northd-0\" (UID: \"a455b874-8c77-4293-be4f-4379f4fecf49\") " pod="openstack/ovn-northd-0"
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.219458 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zrxdp\" (UniqueName: \"kubernetes.io/projected/a455b874-8c77-4293-be4f-4379f4fecf49-kube-api-access-zrxdp\") pod \"ovn-northd-0\" (UID: \"a455b874-8c77-4293-be4f-4379f4fecf49\") " pod="openstack/ovn-northd-0"
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.219522 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a455b874-8c77-4293-be4f-4379f4fecf49-scripts\") pod \"ovn-northd-0\" (UID: \"a455b874-8c77-4293-be4f-4379f4fecf49\") " pod="openstack/ovn-northd-0"
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.219557 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a455b874-8c77-4293-be4f-4379f4fecf49-config\") pod \"ovn-northd-0\" (UID: \"a455b874-8c77-4293-be4f-4379f4fecf49\") " pod="openstack/ovn-northd-0"
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.219574 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/a455b874-8c77-4293-be4f-4379f4fecf49-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"a455b874-8c77-4293-be4f-4379f4fecf49\") " pod="openstack/ovn-northd-0"
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.219605 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a455b874-8c77-4293-be4f-4379f4fecf49-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"a455b874-8c77-4293-be4f-4379f4fecf49\") " pod="openstack/ovn-northd-0"
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.219641 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a455b874-8c77-4293-be4f-4379f4fecf49-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"a455b874-8c77-4293-be4f-4379f4fecf49\") " pod="openstack/ovn-northd-0"
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.219675 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a455b874-8c77-4293-be4f-4379f4fecf49-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"a455b874-8c77-4293-be4f-4379f4fecf49\") " pod="openstack/ovn-northd-0"
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.221241 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a455b874-8c77-4293-be4f-4379f4fecf49-config\") pod \"ovn-northd-0\" (UID: \"a455b874-8c77-4293-be4f-4379f4fecf49\") " pod="openstack/ovn-northd-0"
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.221773 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a455b874-8c77-4293-be4f-4379f4fecf49-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"a455b874-8c77-4293-be4f-4379f4fecf49\") " pod="openstack/ovn-northd-0"
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.222602 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a455b874-8c77-4293-be4f-4379f4fecf49-scripts\") pod \"ovn-northd-0\" (UID: \"a455b874-8c77-4293-be4f-4379f4fecf49\") " pod="openstack/ovn-northd-0"
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.226462 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a455b874-8c77-4293-be4f-4379f4fecf49-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"a455b874-8c77-4293-be4f-4379f4fecf49\") " pod="openstack/ovn-northd-0"
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.228650 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a455b874-8c77-4293-be4f-4379f4fecf49-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"a455b874-8c77-4293-be4f-4379f4fecf49\") " pod="openstack/ovn-northd-0"
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.229311 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/a455b874-8c77-4293-be4f-4379f4fecf49-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"a455b874-8c77-4293-be4f-4379f4fecf49\") " pod="openstack/ovn-northd-0"
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.240484 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrxdp\" (UniqueName: \"kubernetes.io/projected/a455b874-8c77-4293-be4f-4379f4fecf49-kube-api-access-zrxdp\") pod \"ovn-northd-0\" (UID: \"a455b874-8c77-4293-be4f-4379f4fecf49\") " pod="openstack/ovn-northd-0"
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.329004 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.336165 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-db5dbf9b5-rw74w"
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.421200 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/717b8c1b-0b4b-41e1-98a4-826a25eb2ab8-ovsdbserver-nb\") pod \"717b8c1b-0b4b-41e1-98a4-826a25eb2ab8\" (UID: \"717b8c1b-0b4b-41e1-98a4-826a25eb2ab8\") "
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.421241 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/717b8c1b-0b4b-41e1-98a4-826a25eb2ab8-dns-svc\") pod \"717b8c1b-0b4b-41e1-98a4-826a25eb2ab8\" (UID: \"717b8c1b-0b4b-41e1-98a4-826a25eb2ab8\") "
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.421326 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/717b8c1b-0b4b-41e1-98a4-826a25eb2ab8-config\") pod \"717b8c1b-0b4b-41e1-98a4-826a25eb2ab8\" (UID: \"717b8c1b-0b4b-41e1-98a4-826a25eb2ab8\") "
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.421435 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gl6mr\" (UniqueName: \"kubernetes.io/projected/717b8c1b-0b4b-41e1-98a4-826a25eb2ab8-kube-api-access-gl6mr\") pod \"717b8c1b-0b4b-41e1-98a4-826a25eb2ab8\" (UID: \"717b8c1b-0b4b-41e1-98a4-826a25eb2ab8\") "
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.427312 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/717b8c1b-0b4b-41e1-98a4-826a25eb2ab8-kube-api-access-gl6mr" (OuterVolumeSpecName: "kube-api-access-gl6mr") pod "717b8c1b-0b4b-41e1-98a4-826a25eb2ab8" (UID: "717b8c1b-0b4b-41e1-98a4-826a25eb2ab8"). InnerVolumeSpecName "kube-api-access-gl6mr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.464917 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/717b8c1b-0b4b-41e1-98a4-826a25eb2ab8-config" (OuterVolumeSpecName: "config") pod "717b8c1b-0b4b-41e1-98a4-826a25eb2ab8" (UID: "717b8c1b-0b4b-41e1-98a4-826a25eb2ab8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.477449 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/717b8c1b-0b4b-41e1-98a4-826a25eb2ab8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "717b8c1b-0b4b-41e1-98a4-826a25eb2ab8" (UID: "717b8c1b-0b4b-41e1-98a4-826a25eb2ab8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.481765 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/717b8c1b-0b4b-41e1-98a4-826a25eb2ab8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "717b8c1b-0b4b-41e1-98a4-826a25eb2ab8" (UID: "717b8c1b-0b4b-41e1-98a4-826a25eb2ab8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.523271 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gl6mr\" (UniqueName: \"kubernetes.io/projected/717b8c1b-0b4b-41e1-98a4-826a25eb2ab8-kube-api-access-gl6mr\") on node \"crc\" DevicePath \"\""
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.523309 5184 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/717b8c1b-0b4b-41e1-98a4-826a25eb2ab8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.523318 5184 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/717b8c1b-0b4b-41e1-98a4-826a25eb2ab8-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.523327 5184 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/717b8c1b-0b4b-41e1-98a4-826a25eb2ab8-config\") on node \"crc\" DevicePath \"\""
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.748535 5184 generic.go:358] "Generic (PLEG): container finished" podID="717b8c1b-0b4b-41e1-98a4-826a25eb2ab8" containerID="e4c1ef784b11150a30d96a9fe67ec1606452a4bccfa41b082732d632bbc99d4d" exitCode=0
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.748591 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db5dbf9b5-rw74w" event={"ID":"717b8c1b-0b4b-41e1-98a4-826a25eb2ab8","Type":"ContainerDied","Data":"e4c1ef784b11150a30d96a9fe67ec1606452a4bccfa41b082732d632bbc99d4d"}
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.748650 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-db5dbf9b5-rw74w"
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.748674 5184 scope.go:117] "RemoveContainer" containerID="e4c1ef784b11150a30d96a9fe67ec1606452a4bccfa41b082732d632bbc99d4d"
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.748660 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db5dbf9b5-rw74w" event={"ID":"717b8c1b-0b4b-41e1-98a4-826a25eb2ab8","Type":"ContainerDied","Data":"9ea8dc2757f6787b65597c4400a13ad44bfe16b8f78a96684cf48633287eccb8"}
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.770334 5184 scope.go:117] "RemoveContainer" containerID="2aed4412ae6f9ee9957c4afe752721872b3d53747cb49a7218ed1edbc84526eb"
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.778929 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.794925 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-db5dbf9b5-rw74w"]
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.801115 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-db5dbf9b5-rw74w"]
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.805526 5184 scope.go:117] "RemoveContainer" containerID="e4c1ef784b11150a30d96a9fe67ec1606452a4bccfa41b082732d632bbc99d4d"
Mar 12 17:07:25 crc kubenswrapper[5184]: E0312 17:07:25.806021 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4c1ef784b11150a30d96a9fe67ec1606452a4bccfa41b082732d632bbc99d4d\": container with ID starting with e4c1ef784b11150a30d96a9fe67ec1606452a4bccfa41b082732d632bbc99d4d not found: ID does not exist" containerID="e4c1ef784b11150a30d96a9fe67ec1606452a4bccfa41b082732d632bbc99d4d"
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.806076 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4c1ef784b11150a30d96a9fe67ec1606452a4bccfa41b082732d632bbc99d4d"} err="failed to get container status \"e4c1ef784b11150a30d96a9fe67ec1606452a4bccfa41b082732d632bbc99d4d\": rpc error: code = NotFound desc = could not find container \"e4c1ef784b11150a30d96a9fe67ec1606452a4bccfa41b082732d632bbc99d4d\": container with ID starting with e4c1ef784b11150a30d96a9fe67ec1606452a4bccfa41b082732d632bbc99d4d not found: ID does not exist"
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.806107 5184 scope.go:117] "RemoveContainer" containerID="2aed4412ae6f9ee9957c4afe752721872b3d53747cb49a7218ed1edbc84526eb"
Mar 12 17:07:25 crc kubenswrapper[5184]: E0312 17:07:25.806483 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2aed4412ae6f9ee9957c4afe752721872b3d53747cb49a7218ed1edbc84526eb\": container with ID starting with 2aed4412ae6f9ee9957c4afe752721872b3d53747cb49a7218ed1edbc84526eb not found: ID does not exist" containerID="2aed4412ae6f9ee9957c4afe752721872b3d53747cb49a7218ed1edbc84526eb"
Mar 12 17:07:25 crc kubenswrapper[5184]: I0312 17:07:25.806517 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2aed4412ae6f9ee9957c4afe752721872b3d53747cb49a7218ed1edbc84526eb"} err="failed to get container status \"2aed4412ae6f9ee9957c4afe752721872b3d53747cb49a7218ed1edbc84526eb\": rpc error: code = NotFound desc = could not find container \"2aed4412ae6f9ee9957c4afe752721872b3d53747cb49a7218ed1edbc84526eb\": container with ID starting with 2aed4412ae6f9ee9957c4afe752721872b3d53747cb49a7218ed1edbc84526eb not found: ID does not exist"
Mar 12 17:07:26 crc kubenswrapper[5184]: I0312 17:07:26.359429 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Mar 12 17:07:26 crc kubenswrapper[5184]: I0312 17:07:26.360475 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/openstack-cell1-galera-0"
Mar 12 17:07:26 crc kubenswrapper[5184]: I0312 17:07:26.411175 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="717b8c1b-0b4b-41e1-98a4-826a25eb2ab8" path="/var/lib/kubelet/pods/717b8c1b-0b4b-41e1-98a4-826a25eb2ab8/volumes"
Mar 12 17:07:26 crc kubenswrapper[5184]: I0312 17:07:26.772182 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"a455b874-8c77-4293-be4f-4379f4fecf49","Type":"ContainerStarted","Data":"5f31004b7ff975de02d4418d054999316260e3276973eadc77b6c73d619ecb7d"}
Mar 12 17:07:27 crc kubenswrapper[5184]: I0312 17:07:27.642109 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Mar 12 17:07:27 crc kubenswrapper[5184]: I0312 17:07:27.644727 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Mar 12 17:07:27 crc kubenswrapper[5184]: I0312 17:07:27.773481 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Mar 12 17:07:27 crc kubenswrapper[5184]: I0312 17:07:27.784414 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"a455b874-8c77-4293-be4f-4379f4fecf49","Type":"ContainerStarted","Data":"1b756a943beb7e4b58869e3f8a4d1ff2b5cea171ec5949ea7a8e234076ba7f02"}
Mar 12 17:07:27 crc kubenswrapper[5184]: I0312 17:07:27.784466 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"a455b874-8c77-4293-be4f-4379f4fecf49","Type":"ContainerStarted","Data":"658b468283c336dd3921ac1882cb73890481fa1941c88822e063e534fc9d07b7"}
Mar 12 17:07:27 crc kubenswrapper[5184]: I0312 17:07:27.784574 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/ovn-northd-0"
Mar 12 17:07:27 crc kubenswrapper[5184]: I0312 17:07:27.829666 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.404853731 podStartE2EDuration="3.829639604s" podCreationTimestamp="2026-03-12 17:07:24 +0000 UTC" firstStartedPulling="2026-03-12 17:07:25.790099799 +0000 UTC m=+988.331411138" lastFinishedPulling="2026-03-12 17:07:27.214885672 +0000 UTC m=+989.756197011" observedRunningTime="2026-03-12 17:07:27.824347818 +0000 UTC m=+990.365659167" watchObservedRunningTime="2026-03-12 17:07:27.829639604 +0000 UTC m=+990.370950943"
Mar 12 17:07:28 crc kubenswrapper[5184]: I0312 17:07:28.830246 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6565fc964f-vn8ss"]
Mar 12 17:07:28 crc kubenswrapper[5184]: I0312 17:07:28.831750 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="717b8c1b-0b4b-41e1-98a4-826a25eb2ab8" containerName="dnsmasq-dns"
Mar 12 17:07:28 crc kubenswrapper[5184]: I0312 17:07:28.831773 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="717b8c1b-0b4b-41e1-98a4-826a25eb2ab8" containerName="dnsmasq-dns"
Mar 12 17:07:28 crc kubenswrapper[5184]: I0312 17:07:28.831798 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="717b8c1b-0b4b-41e1-98a4-826a25eb2ab8" containerName="init"
Mar 12 17:07:28 crc kubenswrapper[5184]: I0312 17:07:28.831806 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="717b8c1b-0b4b-41e1-98a4-826a25eb2ab8" containerName="init"
Mar 12 17:07:28 crc kubenswrapper[5184]: I0312 17:07:28.831990 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="717b8c1b-0b4b-41e1-98a4-826a25eb2ab8" containerName="dnsmasq-dns"
Mar 12 17:07:28 crc kubenswrapper[5184]: I0312 17:07:28.841430 5184 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-6565fc964f-vn8ss" Mar 12 17:07:28 crc kubenswrapper[5184]: I0312 17:07:28.847013 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6565fc964f-vn8ss"] Mar 12 17:07:28 crc kubenswrapper[5184]: I0312 17:07:28.882122 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6be46ed7-a6b6-4b6e-9934-3540b1867032-config\") pod \"dnsmasq-dns-6565fc964f-vn8ss\" (UID: \"6be46ed7-a6b6-4b6e-9934-3540b1867032\") " pod="openstack/dnsmasq-dns-6565fc964f-vn8ss" Mar 12 17:07:28 crc kubenswrapper[5184]: I0312 17:07:28.882271 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6be46ed7-a6b6-4b6e-9934-3540b1867032-ovsdbserver-nb\") pod \"dnsmasq-dns-6565fc964f-vn8ss\" (UID: \"6be46ed7-a6b6-4b6e-9934-3540b1867032\") " pod="openstack/dnsmasq-dns-6565fc964f-vn8ss" Mar 12 17:07:28 crc kubenswrapper[5184]: I0312 17:07:28.882303 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kfqx\" (UniqueName: \"kubernetes.io/projected/6be46ed7-a6b6-4b6e-9934-3540b1867032-kube-api-access-2kfqx\") pod \"dnsmasq-dns-6565fc964f-vn8ss\" (UID: \"6be46ed7-a6b6-4b6e-9934-3540b1867032\") " pod="openstack/dnsmasq-dns-6565fc964f-vn8ss" Mar 12 17:07:28 crc kubenswrapper[5184]: I0312 17:07:28.882413 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6be46ed7-a6b6-4b6e-9934-3540b1867032-dns-svc\") pod \"dnsmasq-dns-6565fc964f-vn8ss\" (UID: \"6be46ed7-a6b6-4b6e-9934-3540b1867032\") " pod="openstack/dnsmasq-dns-6565fc964f-vn8ss" Mar 12 17:07:28 crc kubenswrapper[5184]: I0312 17:07:28.882461 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6be46ed7-a6b6-4b6e-9934-3540b1867032-ovsdbserver-sb\") pod \"dnsmasq-dns-6565fc964f-vn8ss\" (UID: \"6be46ed7-a6b6-4b6e-9934-3540b1867032\") " pod="openstack/dnsmasq-dns-6565fc964f-vn8ss" Mar 12 17:07:28 crc kubenswrapper[5184]: I0312 17:07:28.984217 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6be46ed7-a6b6-4b6e-9934-3540b1867032-config\") pod \"dnsmasq-dns-6565fc964f-vn8ss\" (UID: \"6be46ed7-a6b6-4b6e-9934-3540b1867032\") " pod="openstack/dnsmasq-dns-6565fc964f-vn8ss" Mar 12 17:07:28 crc kubenswrapper[5184]: I0312 17:07:28.984333 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6be46ed7-a6b6-4b6e-9934-3540b1867032-ovsdbserver-nb\") pod \"dnsmasq-dns-6565fc964f-vn8ss\" (UID: \"6be46ed7-a6b6-4b6e-9934-3540b1867032\") " pod="openstack/dnsmasq-dns-6565fc964f-vn8ss" Mar 12 17:07:28 crc kubenswrapper[5184]: I0312 17:07:28.984404 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2kfqx\" (UniqueName: \"kubernetes.io/projected/6be46ed7-a6b6-4b6e-9934-3540b1867032-kube-api-access-2kfqx\") pod \"dnsmasq-dns-6565fc964f-vn8ss\" (UID: \"6be46ed7-a6b6-4b6e-9934-3540b1867032\") " pod="openstack/dnsmasq-dns-6565fc964f-vn8ss" Mar 12 17:07:28 crc kubenswrapper[5184]: I0312 17:07:28.984482 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6be46ed7-a6b6-4b6e-9934-3540b1867032-dns-svc\") pod \"dnsmasq-dns-6565fc964f-vn8ss\" (UID: \"6be46ed7-a6b6-4b6e-9934-3540b1867032\") " pod="openstack/dnsmasq-dns-6565fc964f-vn8ss" Mar 12 17:07:28 crc kubenswrapper[5184]: I0312 17:07:28.984519 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/6be46ed7-a6b6-4b6e-9934-3540b1867032-ovsdbserver-sb\") pod \"dnsmasq-dns-6565fc964f-vn8ss\" (UID: \"6be46ed7-a6b6-4b6e-9934-3540b1867032\") " pod="openstack/dnsmasq-dns-6565fc964f-vn8ss" Mar 12 17:07:28 crc kubenswrapper[5184]: I0312 17:07:28.985133 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6be46ed7-a6b6-4b6e-9934-3540b1867032-config\") pod \"dnsmasq-dns-6565fc964f-vn8ss\" (UID: \"6be46ed7-a6b6-4b6e-9934-3540b1867032\") " pod="openstack/dnsmasq-dns-6565fc964f-vn8ss" Mar 12 17:07:28 crc kubenswrapper[5184]: I0312 17:07:28.985750 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6be46ed7-a6b6-4b6e-9934-3540b1867032-ovsdbserver-sb\") pod \"dnsmasq-dns-6565fc964f-vn8ss\" (UID: \"6be46ed7-a6b6-4b6e-9934-3540b1867032\") " pod="openstack/dnsmasq-dns-6565fc964f-vn8ss" Mar 12 17:07:28 crc kubenswrapper[5184]: I0312 17:07:28.986017 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6be46ed7-a6b6-4b6e-9934-3540b1867032-ovsdbserver-nb\") pod \"dnsmasq-dns-6565fc964f-vn8ss\" (UID: \"6be46ed7-a6b6-4b6e-9934-3540b1867032\") " pod="openstack/dnsmasq-dns-6565fc964f-vn8ss" Mar 12 17:07:28 crc kubenswrapper[5184]: I0312 17:07:28.986643 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6be46ed7-a6b6-4b6e-9934-3540b1867032-dns-svc\") pod \"dnsmasq-dns-6565fc964f-vn8ss\" (UID: \"6be46ed7-a6b6-4b6e-9934-3540b1867032\") " pod="openstack/dnsmasq-dns-6565fc964f-vn8ss" Mar 12 17:07:29 crc kubenswrapper[5184]: I0312 17:07:29.005953 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kfqx\" (UniqueName: \"kubernetes.io/projected/6be46ed7-a6b6-4b6e-9934-3540b1867032-kube-api-access-2kfqx\") pod 
\"dnsmasq-dns-6565fc964f-vn8ss\" (UID: \"6be46ed7-a6b6-4b6e-9934-3540b1867032\") " pod="openstack/dnsmasq-dns-6565fc964f-vn8ss" Mar 12 17:07:29 crc kubenswrapper[5184]: I0312 17:07:29.158952 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6565fc964f-vn8ss" Mar 12 17:07:29 crc kubenswrapper[5184]: I0312 17:07:29.502524 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 12 17:07:29 crc kubenswrapper[5184]: I0312 17:07:29.569813 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6565fc964f-vn8ss"] Mar 12 17:07:29 crc kubenswrapper[5184]: I0312 17:07:29.591425 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 12 17:07:29 crc kubenswrapper[5184]: I0312 17:07:29.805531 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6565fc964f-vn8ss" event={"ID":"6be46ed7-a6b6-4b6e-9934-3540b1867032","Type":"ContainerStarted","Data":"08890ab80e9826f22128ea18c42d0bf35edc4e124dc38eb581b94dab4860777f"} Mar 12 17:07:29 crc kubenswrapper[5184]: I0312 17:07:29.959572 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 12 17:07:29 crc kubenswrapper[5184]: I0312 17:07:29.974051 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 12 17:07:29 crc kubenswrapper[5184]: I0312 17:07:29.976030 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"swift-swift-dockercfg-fs7b7\"" Mar 12 17:07:29 crc kubenswrapper[5184]: I0312 17:07:29.976109 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"swift-conf\"" Mar 12 17:07:29 crc kubenswrapper[5184]: I0312 17:07:29.976248 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"swift-ring-files\"" Mar 12 17:07:29 crc kubenswrapper[5184]: I0312 17:07:29.976346 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"swift-storage-config-data\"" Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.002985 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2ffae81b-589d-4502-a0a6-777b8d6f98b1-etc-swift\") pod \"swift-storage-0\" (UID: \"2ffae81b-589d-4502-a0a6-777b8d6f98b1\") " pod="openstack/swift-storage-0" Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.003039 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/2ffae81b-589d-4502-a0a6-777b8d6f98b1-cache\") pod \"swift-storage-0\" (UID: \"2ffae81b-589d-4502-a0a6-777b8d6f98b1\") " pod="openstack/swift-storage-0" Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.003053 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.003206 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ffae81b-589d-4502-a0a6-777b8d6f98b1-combined-ca-bundle\") pod \"swift-storage-0\" (UID: 
\"2ffae81b-589d-4502-a0a6-777b8d6f98b1\") " pod="openstack/swift-storage-0" Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.003281 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7lz2\" (UniqueName: \"kubernetes.io/projected/2ffae81b-589d-4502-a0a6-777b8d6f98b1-kube-api-access-w7lz2\") pod \"swift-storage-0\" (UID: \"2ffae81b-589d-4502-a0a6-777b8d6f98b1\") " pod="openstack/swift-storage-0" Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.003432 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/2ffae81b-589d-4502-a0a6-777b8d6f98b1-lock\") pod \"swift-storage-0\" (UID: \"2ffae81b-589d-4502-a0a6-777b8d6f98b1\") " pod="openstack/swift-storage-0" Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.003472 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"2ffae81b-589d-4502-a0a6-777b8d6f98b1\") " pod="openstack/swift-storage-0" Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.106212 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2ffae81b-589d-4502-a0a6-777b8d6f98b1-etc-swift\") pod \"swift-storage-0\" (UID: \"2ffae81b-589d-4502-a0a6-777b8d6f98b1\") " pod="openstack/swift-storage-0" Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.106259 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/2ffae81b-589d-4502-a0a6-777b8d6f98b1-cache\") pod \"swift-storage-0\" (UID: \"2ffae81b-589d-4502-a0a6-777b8d6f98b1\") " pod="openstack/swift-storage-0" Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.106339 5184 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ffae81b-589d-4502-a0a6-777b8d6f98b1-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"2ffae81b-589d-4502-a0a6-777b8d6f98b1\") " pod="openstack/swift-storage-0" Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.106405 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w7lz2\" (UniqueName: \"kubernetes.io/projected/2ffae81b-589d-4502-a0a6-777b8d6f98b1-kube-api-access-w7lz2\") pod \"swift-storage-0\" (UID: \"2ffae81b-589d-4502-a0a6-777b8d6f98b1\") " pod="openstack/swift-storage-0" Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.106466 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/2ffae81b-589d-4502-a0a6-777b8d6f98b1-lock\") pod \"swift-storage-0\" (UID: \"2ffae81b-589d-4502-a0a6-777b8d6f98b1\") " pod="openstack/swift-storage-0" Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.106505 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"2ffae81b-589d-4502-a0a6-777b8d6f98b1\") " pod="openstack/swift-storage-0" Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.106854 5184 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"2ffae81b-589d-4502-a0a6-777b8d6f98b1\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/swift-storage-0" Mar 12 17:07:30 crc kubenswrapper[5184]: E0312 17:07:30.116080 5184 projected.go:289] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 12 17:07:30 crc kubenswrapper[5184]: E0312 17:07:30.116105 5184 projected.go:194] Error preparing data for 
projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 12 17:07:30 crc kubenswrapper[5184]: E0312 17:07:30.116175 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2ffae81b-589d-4502-a0a6-777b8d6f98b1-etc-swift podName:2ffae81b-589d-4502-a0a6-777b8d6f98b1 nodeName:}" failed. No retries permitted until 2026-03-12 17:07:30.616150426 +0000 UTC m=+993.157461775 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2ffae81b-589d-4502-a0a6-777b8d6f98b1-etc-swift") pod "swift-storage-0" (UID: "2ffae81b-589d-4502-a0a6-777b8d6f98b1") : configmap "swift-ring-files" not found Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.117199 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/2ffae81b-589d-4502-a0a6-777b8d6f98b1-lock\") pod \"swift-storage-0\" (UID: \"2ffae81b-589d-4502-a0a6-777b8d6f98b1\") " pod="openstack/swift-storage-0" Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.117544 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/2ffae81b-589d-4502-a0a6-777b8d6f98b1-cache\") pod \"swift-storage-0\" (UID: \"2ffae81b-589d-4502-a0a6-777b8d6f98b1\") " pod="openstack/swift-storage-0" Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.125826 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ffae81b-589d-4502-a0a6-777b8d6f98b1-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"2ffae81b-589d-4502-a0a6-777b8d6f98b1\") " pod="openstack/swift-storage-0" Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.132632 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7lz2\" (UniqueName: 
\"kubernetes.io/projected/2ffae81b-589d-4502-a0a6-777b8d6f98b1-kube-api-access-w7lz2\") pod \"swift-storage-0\" (UID: \"2ffae81b-589d-4502-a0a6-777b8d6f98b1\") " pod="openstack/swift-storage-0" Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.135004 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"2ffae81b-589d-4502-a0a6-777b8d6f98b1\") " pod="openstack/swift-storage-0" Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.477786 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-nwctk"] Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.685625 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2ffae81b-589d-4502-a0a6-777b8d6f98b1-etc-swift\") pod \"swift-storage-0\" (UID: \"2ffae81b-589d-4502-a0a6-777b8d6f98b1\") " pod="openstack/swift-storage-0" Mar 12 17:07:30 crc kubenswrapper[5184]: E0312 17:07:30.685858 5184 projected.go:289] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 12 17:07:30 crc kubenswrapper[5184]: E0312 17:07:30.685870 5184 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 12 17:07:30 crc kubenswrapper[5184]: E0312 17:07:30.685922 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2ffae81b-589d-4502-a0a6-777b8d6f98b1-etc-swift podName:2ffae81b-589d-4502-a0a6-777b8d6f98b1 nodeName:}" failed. No retries permitted until 2026-03-12 17:07:31.685909022 +0000 UTC m=+994.227220361 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2ffae81b-589d-4502-a0a6-777b8d6f98b1-etc-swift") pod "swift-storage-0" (UID: "2ffae81b-589d-4502-a0a6-777b8d6f98b1") : configmap "swift-ring-files" not found Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.696353 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-nwctk" Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.698612 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"swift-proxy-config-data\"" Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.700534 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"swift-ring-scripts\"" Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.700567 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"swift-ring-config-data\"" Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.718970 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-nwctk"] Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.797942 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0353bd4c-727d-4c46-8954-29b25872ba5a-etc-swift\") pod \"swift-ring-rebalance-nwctk\" (UID: \"0353bd4c-727d-4c46-8954-29b25872ba5a\") " pod="openstack/swift-ring-rebalance-nwctk" Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.798050 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njk6f\" (UniqueName: \"kubernetes.io/projected/0353bd4c-727d-4c46-8954-29b25872ba5a-kube-api-access-njk6f\") pod \"swift-ring-rebalance-nwctk\" (UID: \"0353bd4c-727d-4c46-8954-29b25872ba5a\") " pod="openstack/swift-ring-rebalance-nwctk" Mar 12 17:07:30 crc kubenswrapper[5184]: 
I0312 17:07:30.798140 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0353bd4c-727d-4c46-8954-29b25872ba5a-combined-ca-bundle\") pod \"swift-ring-rebalance-nwctk\" (UID: \"0353bd4c-727d-4c46-8954-29b25872ba5a\") " pod="openstack/swift-ring-rebalance-nwctk" Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.798161 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0353bd4c-727d-4c46-8954-29b25872ba5a-scripts\") pod \"swift-ring-rebalance-nwctk\" (UID: \"0353bd4c-727d-4c46-8954-29b25872ba5a\") " pod="openstack/swift-ring-rebalance-nwctk" Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.798288 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0353bd4c-727d-4c46-8954-29b25872ba5a-swiftconf\") pod \"swift-ring-rebalance-nwctk\" (UID: \"0353bd4c-727d-4c46-8954-29b25872ba5a\") " pod="openstack/swift-ring-rebalance-nwctk" Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.798501 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0353bd4c-727d-4c46-8954-29b25872ba5a-dispersionconf\") pod \"swift-ring-rebalance-nwctk\" (UID: \"0353bd4c-727d-4c46-8954-29b25872ba5a\") " pod="openstack/swift-ring-rebalance-nwctk" Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.798551 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0353bd4c-727d-4c46-8954-29b25872ba5a-ring-data-devices\") pod \"swift-ring-rebalance-nwctk\" (UID: \"0353bd4c-727d-4c46-8954-29b25872ba5a\") " pod="openstack/swift-ring-rebalance-nwctk" Mar 12 17:07:30 crc 
kubenswrapper[5184]: I0312 17:07:30.818769 5184 generic.go:358] "Generic (PLEG): container finished" podID="6be46ed7-a6b6-4b6e-9934-3540b1867032" containerID="b99149c5397d7c2860370daf5a6ba8792284074fb31c5d02e3ebf1998450131b" exitCode=0 Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.818821 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6565fc964f-vn8ss" event={"ID":"6be46ed7-a6b6-4b6e-9934-3540b1867032","Type":"ContainerDied","Data":"b99149c5397d7c2860370daf5a6ba8792284074fb31c5d02e3ebf1998450131b"} Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.900185 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-njk6f\" (UniqueName: \"kubernetes.io/projected/0353bd4c-727d-4c46-8954-29b25872ba5a-kube-api-access-njk6f\") pod \"swift-ring-rebalance-nwctk\" (UID: \"0353bd4c-727d-4c46-8954-29b25872ba5a\") " pod="openstack/swift-ring-rebalance-nwctk" Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.900240 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0353bd4c-727d-4c46-8954-29b25872ba5a-combined-ca-bundle\") pod \"swift-ring-rebalance-nwctk\" (UID: \"0353bd4c-727d-4c46-8954-29b25872ba5a\") " pod="openstack/swift-ring-rebalance-nwctk" Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.900260 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0353bd4c-727d-4c46-8954-29b25872ba5a-scripts\") pod \"swift-ring-rebalance-nwctk\" (UID: \"0353bd4c-727d-4c46-8954-29b25872ba5a\") " pod="openstack/swift-ring-rebalance-nwctk" Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.900694 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0353bd4c-727d-4c46-8954-29b25872ba5a-swiftconf\") pod \"swift-ring-rebalance-nwctk\" (UID: 
\"0353bd4c-727d-4c46-8954-29b25872ba5a\") " pod="openstack/swift-ring-rebalance-nwctk" Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.900903 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0353bd4c-727d-4c46-8954-29b25872ba5a-dispersionconf\") pod \"swift-ring-rebalance-nwctk\" (UID: \"0353bd4c-727d-4c46-8954-29b25872ba5a\") " pod="openstack/swift-ring-rebalance-nwctk" Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.900953 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0353bd4c-727d-4c46-8954-29b25872ba5a-ring-data-devices\") pod \"swift-ring-rebalance-nwctk\" (UID: \"0353bd4c-727d-4c46-8954-29b25872ba5a\") " pod="openstack/swift-ring-rebalance-nwctk" Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.901126 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0353bd4c-727d-4c46-8954-29b25872ba5a-etc-swift\") pod \"swift-ring-rebalance-nwctk\" (UID: \"0353bd4c-727d-4c46-8954-29b25872ba5a\") " pod="openstack/swift-ring-rebalance-nwctk" Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.901563 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0353bd4c-727d-4c46-8954-29b25872ba5a-etc-swift\") pod \"swift-ring-rebalance-nwctk\" (UID: \"0353bd4c-727d-4c46-8954-29b25872ba5a\") " pod="openstack/swift-ring-rebalance-nwctk" Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.901606 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0353bd4c-727d-4c46-8954-29b25872ba5a-scripts\") pod \"swift-ring-rebalance-nwctk\" (UID: \"0353bd4c-727d-4c46-8954-29b25872ba5a\") " pod="openstack/swift-ring-rebalance-nwctk" Mar 12 17:07:30 crc 
kubenswrapper[5184]: I0312 17:07:30.901967 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0353bd4c-727d-4c46-8954-29b25872ba5a-ring-data-devices\") pod \"swift-ring-rebalance-nwctk\" (UID: \"0353bd4c-727d-4c46-8954-29b25872ba5a\") " pod="openstack/swift-ring-rebalance-nwctk" Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.907668 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0353bd4c-727d-4c46-8954-29b25872ba5a-combined-ca-bundle\") pod \"swift-ring-rebalance-nwctk\" (UID: \"0353bd4c-727d-4c46-8954-29b25872ba5a\") " pod="openstack/swift-ring-rebalance-nwctk" Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.911504 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0353bd4c-727d-4c46-8954-29b25872ba5a-swiftconf\") pod \"swift-ring-rebalance-nwctk\" (UID: \"0353bd4c-727d-4c46-8954-29b25872ba5a\") " pod="openstack/swift-ring-rebalance-nwctk" Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.911549 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0353bd4c-727d-4c46-8954-29b25872ba5a-dispersionconf\") pod \"swift-ring-rebalance-nwctk\" (UID: \"0353bd4c-727d-4c46-8954-29b25872ba5a\") " pod="openstack/swift-ring-rebalance-nwctk" Mar 12 17:07:30 crc kubenswrapper[5184]: I0312 17:07:30.921779 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-njk6f\" (UniqueName: \"kubernetes.io/projected/0353bd4c-727d-4c46-8954-29b25872ba5a-kube-api-access-njk6f\") pod \"swift-ring-rebalance-nwctk\" (UID: \"0353bd4c-727d-4c46-8954-29b25872ba5a\") " pod="openstack/swift-ring-rebalance-nwctk" Mar 12 17:07:31 crc kubenswrapper[5184]: I0312 17:07:31.028645 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-nwctk" Mar 12 17:07:31 crc kubenswrapper[5184]: I0312 17:07:31.442333 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-nwctk"] Mar 12 17:07:31 crc kubenswrapper[5184]: W0312 17:07:31.449118 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0353bd4c_727d_4c46_8954_29b25872ba5a.slice/crio-d00f481b6a403aa2e080e729d3852859fa546414df454941765f92068faef8ee WatchSource:0}: Error finding container d00f481b6a403aa2e080e729d3852859fa546414df454941765f92068faef8ee: Status 404 returned error can't find the container with id d00f481b6a403aa2e080e729d3852859fa546414df454941765f92068faef8ee Mar 12 17:07:31 crc kubenswrapper[5184]: I0312 17:07:31.712865 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2ffae81b-589d-4502-a0a6-777b8d6f98b1-etc-swift\") pod \"swift-storage-0\" (UID: \"2ffae81b-589d-4502-a0a6-777b8d6f98b1\") " pod="openstack/swift-storage-0" Mar 12 17:07:31 crc kubenswrapper[5184]: E0312 17:07:31.713167 5184 projected.go:289] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 12 17:07:31 crc kubenswrapper[5184]: E0312 17:07:31.713437 5184 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 12 17:07:31 crc kubenswrapper[5184]: E0312 17:07:31.713708 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2ffae81b-589d-4502-a0a6-777b8d6f98b1-etc-swift podName:2ffae81b-589d-4502-a0a6-777b8d6f98b1 nodeName:}" failed. No retries permitted until 2026-03-12 17:07:33.713669956 +0000 UTC m=+996.254981335 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2ffae81b-589d-4502-a0a6-777b8d6f98b1-etc-swift") pod "swift-storage-0" (UID: "2ffae81b-589d-4502-a0a6-777b8d6f98b1") : configmap "swift-ring-files" not found Mar 12 17:07:31 crc kubenswrapper[5184]: I0312 17:07:31.830923 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6565fc964f-vn8ss" event={"ID":"6be46ed7-a6b6-4b6e-9934-3540b1867032","Type":"ContainerStarted","Data":"83b2b1df399f5f8cf5a808c8135835d11fe2ce766d9e130f13711dea7a917a36"} Mar 12 17:07:31 crc kubenswrapper[5184]: I0312 17:07:31.831050 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/dnsmasq-dns-6565fc964f-vn8ss" Mar 12 17:07:31 crc kubenswrapper[5184]: I0312 17:07:31.833159 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-nwctk" event={"ID":"0353bd4c-727d-4c46-8954-29b25872ba5a","Type":"ContainerStarted","Data":"d00f481b6a403aa2e080e729d3852859fa546414df454941765f92068faef8ee"} Mar 12 17:07:31 crc kubenswrapper[5184]: I0312 17:07:31.852257 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6565fc964f-vn8ss" podStartSLOduration=3.852239818 podStartE2EDuration="3.852239818s" podCreationTimestamp="2026-03-12 17:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:07:31.849954946 +0000 UTC m=+994.391266305" watchObservedRunningTime="2026-03-12 17:07:31.852239818 +0000 UTC m=+994.393551147" Mar 12 17:07:32 crc kubenswrapper[5184]: I0312 17:07:32.907102 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-rxnwh"] Mar 12 17:07:32 crc kubenswrapper[5184]: I0312 17:07:32.962086 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/glance-f7e4-account-create-update-jbqtf"] Mar 12 17:07:32 crc 
kubenswrapper[5184]: I0312 17:07:32.962326 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-rxnwh" Mar 12 17:07:32 crc kubenswrapper[5184]: I0312 17:07:32.970833 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f7e4-account-create-update-jbqtf"] Mar 12 17:07:32 crc kubenswrapper[5184]: I0312 17:07:32.970905 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-rxnwh"] Mar 12 17:07:32 crc kubenswrapper[5184]: I0312 17:07:32.971169 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f7e4-account-create-update-jbqtf" Mar 12 17:07:32 crc kubenswrapper[5184]: I0312 17:07:32.973308 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"glance-db-secret\"" Mar 12 17:07:33 crc kubenswrapper[5184]: I0312 17:07:33.041795 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwsz2\" (UniqueName: \"kubernetes.io/projected/1b0585b6-5451-4f29-a11c-8d84143e3589-kube-api-access-qwsz2\") pod \"glance-f7e4-account-create-update-jbqtf\" (UID: \"1b0585b6-5451-4f29-a11c-8d84143e3589\") " pod="openstack/glance-f7e4-account-create-update-jbqtf" Mar 12 17:07:33 crc kubenswrapper[5184]: I0312 17:07:33.041871 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rjq2\" (UniqueName: \"kubernetes.io/projected/94bbecf2-f5e8-4513-a4c6-559d752aae55-kube-api-access-8rjq2\") pod \"glance-db-create-rxnwh\" (UID: \"94bbecf2-f5e8-4513-a4c6-559d752aae55\") " pod="openstack/glance-db-create-rxnwh" Mar 12 17:07:33 crc kubenswrapper[5184]: I0312 17:07:33.041932 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94bbecf2-f5e8-4513-a4c6-559d752aae55-operator-scripts\") 
pod \"glance-db-create-rxnwh\" (UID: \"94bbecf2-f5e8-4513-a4c6-559d752aae55\") " pod="openstack/glance-db-create-rxnwh" Mar 12 17:07:33 crc kubenswrapper[5184]: I0312 17:07:33.042365 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b0585b6-5451-4f29-a11c-8d84143e3589-operator-scripts\") pod \"glance-f7e4-account-create-update-jbqtf\" (UID: \"1b0585b6-5451-4f29-a11c-8d84143e3589\") " pod="openstack/glance-f7e4-account-create-update-jbqtf" Mar 12 17:07:33 crc kubenswrapper[5184]: I0312 17:07:33.143777 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qwsz2\" (UniqueName: \"kubernetes.io/projected/1b0585b6-5451-4f29-a11c-8d84143e3589-kube-api-access-qwsz2\") pod \"glance-f7e4-account-create-update-jbqtf\" (UID: \"1b0585b6-5451-4f29-a11c-8d84143e3589\") " pod="openstack/glance-f7e4-account-create-update-jbqtf" Mar 12 17:07:33 crc kubenswrapper[5184]: I0312 17:07:33.143845 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8rjq2\" (UniqueName: \"kubernetes.io/projected/94bbecf2-f5e8-4513-a4c6-559d752aae55-kube-api-access-8rjq2\") pod \"glance-db-create-rxnwh\" (UID: \"94bbecf2-f5e8-4513-a4c6-559d752aae55\") " pod="openstack/glance-db-create-rxnwh" Mar 12 17:07:33 crc kubenswrapper[5184]: I0312 17:07:33.144195 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94bbecf2-f5e8-4513-a4c6-559d752aae55-operator-scripts\") pod \"glance-db-create-rxnwh\" (UID: \"94bbecf2-f5e8-4513-a4c6-559d752aae55\") " pod="openstack/glance-db-create-rxnwh" Mar 12 17:07:33 crc kubenswrapper[5184]: I0312 17:07:33.144464 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/1b0585b6-5451-4f29-a11c-8d84143e3589-operator-scripts\") pod \"glance-f7e4-account-create-update-jbqtf\" (UID: \"1b0585b6-5451-4f29-a11c-8d84143e3589\") " pod="openstack/glance-f7e4-account-create-update-jbqtf" Mar 12 17:07:33 crc kubenswrapper[5184]: I0312 17:07:33.145512 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94bbecf2-f5e8-4513-a4c6-559d752aae55-operator-scripts\") pod \"glance-db-create-rxnwh\" (UID: \"94bbecf2-f5e8-4513-a4c6-559d752aae55\") " pod="openstack/glance-db-create-rxnwh" Mar 12 17:07:33 crc kubenswrapper[5184]: I0312 17:07:33.145588 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b0585b6-5451-4f29-a11c-8d84143e3589-operator-scripts\") pod \"glance-f7e4-account-create-update-jbqtf\" (UID: \"1b0585b6-5451-4f29-a11c-8d84143e3589\") " pod="openstack/glance-f7e4-account-create-update-jbqtf" Mar 12 17:07:33 crc kubenswrapper[5184]: I0312 17:07:33.163269 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rjq2\" (UniqueName: \"kubernetes.io/projected/94bbecf2-f5e8-4513-a4c6-559d752aae55-kube-api-access-8rjq2\") pod \"glance-db-create-rxnwh\" (UID: \"94bbecf2-f5e8-4513-a4c6-559d752aae55\") " pod="openstack/glance-db-create-rxnwh" Mar 12 17:07:33 crc kubenswrapper[5184]: I0312 17:07:33.168441 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwsz2\" (UniqueName: \"kubernetes.io/projected/1b0585b6-5451-4f29-a11c-8d84143e3589-kube-api-access-qwsz2\") pod \"glance-f7e4-account-create-update-jbqtf\" (UID: \"1b0585b6-5451-4f29-a11c-8d84143e3589\") " pod="openstack/glance-f7e4-account-create-update-jbqtf" Mar 12 17:07:33 crc kubenswrapper[5184]: I0312 17:07:33.295659 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-rxnwh" Mar 12 17:07:33 crc kubenswrapper[5184]: I0312 17:07:33.304316 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f7e4-account-create-update-jbqtf" Mar 12 17:07:33 crc kubenswrapper[5184]: I0312 17:07:33.639553 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-wn6cv"] Mar 12 17:07:33 crc kubenswrapper[5184]: I0312 17:07:33.660623 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-wn6cv"] Mar 12 17:07:33 crc kubenswrapper[5184]: I0312 17:07:33.660749 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-wn6cv" Mar 12 17:07:33 crc kubenswrapper[5184]: I0312 17:07:33.662952 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"openstack-mariadb-root-db-secret\"" Mar 12 17:07:33 crc kubenswrapper[5184]: I0312 17:07:33.757590 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb057e73-a8a2-44b3-b709-54e14d97daef-operator-scripts\") pod \"root-account-create-update-wn6cv\" (UID: \"fb057e73-a8a2-44b3-b709-54e14d97daef\") " pod="openstack/root-account-create-update-wn6cv" Mar 12 17:07:33 crc kubenswrapper[5184]: I0312 17:07:33.758040 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqbxp\" (UniqueName: \"kubernetes.io/projected/fb057e73-a8a2-44b3-b709-54e14d97daef-kube-api-access-wqbxp\") pod \"root-account-create-update-wn6cv\" (UID: \"fb057e73-a8a2-44b3-b709-54e14d97daef\") " pod="openstack/root-account-create-update-wn6cv" Mar 12 17:07:33 crc kubenswrapper[5184]: I0312 17:07:33.758175 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/2ffae81b-589d-4502-a0a6-777b8d6f98b1-etc-swift\") pod \"swift-storage-0\" (UID: \"2ffae81b-589d-4502-a0a6-777b8d6f98b1\") " pod="openstack/swift-storage-0" Mar 12 17:07:33 crc kubenswrapper[5184]: E0312 17:07:33.758342 5184 projected.go:289] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 12 17:07:33 crc kubenswrapper[5184]: E0312 17:07:33.758365 5184 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 12 17:07:33 crc kubenswrapper[5184]: E0312 17:07:33.758466 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2ffae81b-589d-4502-a0a6-777b8d6f98b1-etc-swift podName:2ffae81b-589d-4502-a0a6-777b8d6f98b1 nodeName:}" failed. No retries permitted until 2026-03-12 17:07:37.758447278 +0000 UTC m=+1000.299758617 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2ffae81b-589d-4502-a0a6-777b8d6f98b1-etc-swift") pod "swift-storage-0" (UID: "2ffae81b-589d-4502-a0a6-777b8d6f98b1") : configmap "swift-ring-files" not found Mar 12 17:07:33 crc kubenswrapper[5184]: I0312 17:07:33.859278 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb057e73-a8a2-44b3-b709-54e14d97daef-operator-scripts\") pod \"root-account-create-update-wn6cv\" (UID: \"fb057e73-a8a2-44b3-b709-54e14d97daef\") " pod="openstack/root-account-create-update-wn6cv" Mar 12 17:07:33 crc kubenswrapper[5184]: I0312 17:07:33.859403 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wqbxp\" (UniqueName: \"kubernetes.io/projected/fb057e73-a8a2-44b3-b709-54e14d97daef-kube-api-access-wqbxp\") pod \"root-account-create-update-wn6cv\" (UID: \"fb057e73-a8a2-44b3-b709-54e14d97daef\") " 
pod="openstack/root-account-create-update-wn6cv" Mar 12 17:07:33 crc kubenswrapper[5184]: I0312 17:07:33.860829 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb057e73-a8a2-44b3-b709-54e14d97daef-operator-scripts\") pod \"root-account-create-update-wn6cv\" (UID: \"fb057e73-a8a2-44b3-b709-54e14d97daef\") " pod="openstack/root-account-create-update-wn6cv" Mar 12 17:07:33 crc kubenswrapper[5184]: I0312 17:07:33.878119 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqbxp\" (UniqueName: \"kubernetes.io/projected/fb057e73-a8a2-44b3-b709-54e14d97daef-kube-api-access-wqbxp\") pod \"root-account-create-update-wn6cv\" (UID: \"fb057e73-a8a2-44b3-b709-54e14d97daef\") " pod="openstack/root-account-create-update-wn6cv" Mar 12 17:07:33 crc kubenswrapper[5184]: I0312 17:07:33.977525 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-wn6cv" Mar 12 17:07:34 crc kubenswrapper[5184]: I0312 17:07:34.856925 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-nwctk" event={"ID":"0353bd4c-727d-4c46-8954-29b25872ba5a","Type":"ContainerStarted","Data":"2799d7016e0cf865a8dea115fe19bd83b26ddae4191f174bc063251d7e9cdb7c"} Mar 12 17:07:34 crc kubenswrapper[5184]: I0312 17:07:34.876651 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-nwctk" podStartSLOduration=1.755352508 podStartE2EDuration="4.876626988s" podCreationTimestamp="2026-03-12 17:07:30 +0000 UTC" firstStartedPulling="2026-03-12 17:07:31.452290097 +0000 UTC m=+993.993601436" lastFinishedPulling="2026-03-12 17:07:34.573564547 +0000 UTC m=+997.114875916" observedRunningTime="2026-03-12 17:07:34.873116828 +0000 UTC m=+997.414428167" watchObservedRunningTime="2026-03-12 17:07:34.876626988 +0000 UTC m=+997.417938327" Mar 12 17:07:35 crc 
kubenswrapper[5184]: I0312 17:07:35.017487 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-rxnwh"] Mar 12 17:07:35 crc kubenswrapper[5184]: W0312 17:07:35.022743 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94bbecf2_f5e8_4513_a4c6_559d752aae55.slice/crio-58e2b0fcf485ca097777cc574cd7b08a6d3006c4e91106a5f90def82ba38a550 WatchSource:0}: Error finding container 58e2b0fcf485ca097777cc574cd7b08a6d3006c4e91106a5f90def82ba38a550: Status 404 returned error can't find the container with id 58e2b0fcf485ca097777cc574cd7b08a6d3006c4e91106a5f90def82ba38a550 Mar 12 17:07:35 crc kubenswrapper[5184]: I0312 17:07:35.098098 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-wn6cv"] Mar 12 17:07:35 crc kubenswrapper[5184]: W0312 17:07:35.111878 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb057e73_a8a2_44b3_b709_54e14d97daef.slice/crio-255d729147664da47150ee98c6012f8d668753a3cf9804b125b689562cb66e51 WatchSource:0}: Error finding container 255d729147664da47150ee98c6012f8d668753a3cf9804b125b689562cb66e51: Status 404 returned error can't find the container with id 255d729147664da47150ee98c6012f8d668753a3cf9804b125b689562cb66e51 Mar 12 17:07:35 crc kubenswrapper[5184]: I0312 17:07:35.186518 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f7e4-account-create-update-jbqtf"] Mar 12 17:07:35 crc kubenswrapper[5184]: W0312 17:07:35.206724 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b0585b6_5451_4f29_a11c_8d84143e3589.slice/crio-0e9f878968e5b87aa311562dd7fe1763d89ea5c91f3d7af167f1e9cf4b3441de WatchSource:0}: Error finding container 0e9f878968e5b87aa311562dd7fe1763d89ea5c91f3d7af167f1e9cf4b3441de: Status 404 returned 
error can't find the container with id 0e9f878968e5b87aa311562dd7fe1763d89ea5c91f3d7af167f1e9cf4b3441de Mar 12 17:07:35 crc kubenswrapper[5184]: I0312 17:07:35.865744 5184 generic.go:358] "Generic (PLEG): container finished" podID="1b0585b6-5451-4f29-a11c-8d84143e3589" containerID="592c57124a6c246a80513dfee9774043582d86990464af951cfc7d318b698869" exitCode=0 Mar 12 17:07:35 crc kubenswrapper[5184]: I0312 17:07:35.865869 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f7e4-account-create-update-jbqtf" event={"ID":"1b0585b6-5451-4f29-a11c-8d84143e3589","Type":"ContainerDied","Data":"592c57124a6c246a80513dfee9774043582d86990464af951cfc7d318b698869"} Mar 12 17:07:35 crc kubenswrapper[5184]: I0312 17:07:35.865919 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f7e4-account-create-update-jbqtf" event={"ID":"1b0585b6-5451-4f29-a11c-8d84143e3589","Type":"ContainerStarted","Data":"0e9f878968e5b87aa311562dd7fe1763d89ea5c91f3d7af167f1e9cf4b3441de"} Mar 12 17:07:35 crc kubenswrapper[5184]: I0312 17:07:35.868014 5184 generic.go:358] "Generic (PLEG): container finished" podID="94bbecf2-f5e8-4513-a4c6-559d752aae55" containerID="a3c4cea2a60bc9ef16549c18cb1981decd4a633a723252ed8e1e69562d803b6c" exitCode=0 Mar 12 17:07:35 crc kubenswrapper[5184]: I0312 17:07:35.868137 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-rxnwh" event={"ID":"94bbecf2-f5e8-4513-a4c6-559d752aae55","Type":"ContainerDied","Data":"a3c4cea2a60bc9ef16549c18cb1981decd4a633a723252ed8e1e69562d803b6c"} Mar 12 17:07:35 crc kubenswrapper[5184]: I0312 17:07:35.868207 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-rxnwh" event={"ID":"94bbecf2-f5e8-4513-a4c6-559d752aae55","Type":"ContainerStarted","Data":"58e2b0fcf485ca097777cc574cd7b08a6d3006c4e91106a5f90def82ba38a550"} Mar 12 17:07:35 crc kubenswrapper[5184]: I0312 17:07:35.870329 5184 generic.go:358] "Generic (PLEG): container 
finished" podID="fb057e73-a8a2-44b3-b709-54e14d97daef" containerID="f41b5c6efce43c3a2aadec9482b88331de4149618247f6d48203d9b65bdccfad" exitCode=0 Mar 12 17:07:35 crc kubenswrapper[5184]: I0312 17:07:35.870419 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-wn6cv" event={"ID":"fb057e73-a8a2-44b3-b709-54e14d97daef","Type":"ContainerDied","Data":"f41b5c6efce43c3a2aadec9482b88331de4149618247f6d48203d9b65bdccfad"} Mar 12 17:07:35 crc kubenswrapper[5184]: I0312 17:07:35.870447 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-wn6cv" event={"ID":"fb057e73-a8a2-44b3-b709-54e14d97daef","Type":"ContainerStarted","Data":"255d729147664da47150ee98c6012f8d668753a3cf9804b125b689562cb66e51"} Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.379061 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f7e4-account-create-update-jbqtf" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.435859 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b0585b6-5451-4f29-a11c-8d84143e3589-operator-scripts\") pod \"1b0585b6-5451-4f29-a11c-8d84143e3589\" (UID: \"1b0585b6-5451-4f29-a11c-8d84143e3589\") " Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.436098 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwsz2\" (UniqueName: \"kubernetes.io/projected/1b0585b6-5451-4f29-a11c-8d84143e3589-kube-api-access-qwsz2\") pod \"1b0585b6-5451-4f29-a11c-8d84143e3589\" (UID: \"1b0585b6-5451-4f29-a11c-8d84143e3589\") " Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.437432 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b0585b6-5451-4f29-a11c-8d84143e3589-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"1b0585b6-5451-4f29-a11c-8d84143e3589" (UID: "1b0585b6-5451-4f29-a11c-8d84143e3589"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.442270 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b0585b6-5451-4f29-a11c-8d84143e3589-kube-api-access-qwsz2" (OuterVolumeSpecName: "kube-api-access-qwsz2") pod "1b0585b6-5451-4f29-a11c-8d84143e3589" (UID: "1b0585b6-5451-4f29-a11c-8d84143e3589"). InnerVolumeSpecName "kube-api-access-qwsz2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.484369 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-rxnwh" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.490830 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-wn6cv" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.537996 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94bbecf2-f5e8-4513-a4c6-559d752aae55-operator-scripts\") pod \"94bbecf2-f5e8-4513-a4c6-559d752aae55\" (UID: \"94bbecf2-f5e8-4513-a4c6-559d752aae55\") " Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.538403 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94bbecf2-f5e8-4513-a4c6-559d752aae55-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "94bbecf2-f5e8-4513-a4c6-559d752aae55" (UID: "94bbecf2-f5e8-4513-a4c6-559d752aae55"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.538505 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rjq2\" (UniqueName: \"kubernetes.io/projected/94bbecf2-f5e8-4513-a4c6-559d752aae55-kube-api-access-8rjq2\") pod \"94bbecf2-f5e8-4513-a4c6-559d752aae55\" (UID: \"94bbecf2-f5e8-4513-a4c6-559d752aae55\") " Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.539137 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qwsz2\" (UniqueName: \"kubernetes.io/projected/1b0585b6-5451-4f29-a11c-8d84143e3589-kube-api-access-qwsz2\") on node \"crc\" DevicePath \"\"" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.539221 5184 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94bbecf2-f5e8-4513-a4c6-559d752aae55-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.539314 5184 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b0585b6-5451-4f29-a11c-8d84143e3589-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.545678 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94bbecf2-f5e8-4513-a4c6-559d752aae55-kube-api-access-8rjq2" (OuterVolumeSpecName: "kube-api-access-8rjq2") pod "94bbecf2-f5e8-4513-a4c6-559d752aae55" (UID: "94bbecf2-f5e8-4513-a4c6-559d752aae55"). InnerVolumeSpecName "kube-api-access-8rjq2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.639652 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-lf25b"] Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.640008 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb057e73-a8a2-44b3-b709-54e14d97daef-operator-scripts\") pod \"fb057e73-a8a2-44b3-b709-54e14d97daef\" (UID: \"fb057e73-a8a2-44b3-b709-54e14d97daef\") " Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.640477 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqbxp\" (UniqueName: \"kubernetes.io/projected/fb057e73-a8a2-44b3-b709-54e14d97daef-kube-api-access-wqbxp\") pod \"fb057e73-a8a2-44b3-b709-54e14d97daef\" (UID: \"fb057e73-a8a2-44b3-b709-54e14d97daef\") " Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.640528 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb057e73-a8a2-44b3-b709-54e14d97daef-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fb057e73-a8a2-44b3-b709-54e14d97daef" (UID: "fb057e73-a8a2-44b3-b709-54e14d97daef"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.640494 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b0585b6-5451-4f29-a11c-8d84143e3589" containerName="mariadb-account-create-update" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.640647 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b0585b6-5451-4f29-a11c-8d84143e3589" containerName="mariadb-account-create-update" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.640730 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94bbecf2-f5e8-4513-a4c6-559d752aae55" containerName="mariadb-database-create" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.640740 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="94bbecf2-f5e8-4513-a4c6-559d752aae55" containerName="mariadb-database-create" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.640754 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb057e73-a8a2-44b3-b709-54e14d97daef" containerName="mariadb-account-create-update" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.640759 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb057e73-a8a2-44b3-b709-54e14d97daef" containerName="mariadb-account-create-update" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.641016 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="94bbecf2-f5e8-4513-a4c6-559d752aae55" containerName="mariadb-database-create" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.641025 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8rjq2\" (UniqueName: \"kubernetes.io/projected/94bbecf2-f5e8-4513-a4c6-559d752aae55-kube-api-access-8rjq2\") on node \"crc\" DevicePath \"\"" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.641032 5184 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="1b0585b6-5451-4f29-a11c-8d84143e3589" containerName="mariadb-account-create-update" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.641050 5184 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb057e73-a8a2-44b3-b709-54e14d97daef-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.641062 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb057e73-a8a2-44b3-b709-54e14d97daef" containerName="mariadb-account-create-update" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.648446 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb057e73-a8a2-44b3-b709-54e14d97daef-kube-api-access-wqbxp" (OuterVolumeSpecName: "kube-api-access-wqbxp") pod "fb057e73-a8a2-44b3-b709-54e14d97daef" (UID: "fb057e73-a8a2-44b3-b709-54e14d97daef"). InnerVolumeSpecName "kube-api-access-wqbxp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.651161 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-lf25b" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.656615 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-lf25b"] Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.738161 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/keystone-2da3-account-create-update-jbb94"] Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.742493 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fzf9\" (UniqueName: \"kubernetes.io/projected/041f7579-fdb5-43db-9291-318597c8c028-kube-api-access-4fzf9\") pod \"keystone-db-create-lf25b\" (UID: \"041f7579-fdb5-43db-9291-318597c8c028\") " pod="openstack/keystone-db-create-lf25b" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.742697 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/041f7579-fdb5-43db-9291-318597c8c028-operator-scripts\") pod \"keystone-db-create-lf25b\" (UID: \"041f7579-fdb5-43db-9291-318597c8c028\") " pod="openstack/keystone-db-create-lf25b" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.743014 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wqbxp\" (UniqueName: \"kubernetes.io/projected/fb057e73-a8a2-44b3-b709-54e14d97daef-kube-api-access-wqbxp\") on node \"crc\" DevicePath \"\"" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.743274 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-2da3-account-create-update-jbb94" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.746402 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone-db-secret\"" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.748093 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-2da3-account-create-update-jbb94"] Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.842487 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6565fc964f-vn8ss" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.844595 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4fzf9\" (UniqueName: \"kubernetes.io/projected/041f7579-fdb5-43db-9291-318597c8c028-kube-api-access-4fzf9\") pod \"keystone-db-create-lf25b\" (UID: \"041f7579-fdb5-43db-9291-318597c8c028\") " pod="openstack/keystone-db-create-lf25b" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.844742 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/041f7579-fdb5-43db-9291-318597c8c028-operator-scripts\") pod \"keystone-db-create-lf25b\" (UID: \"041f7579-fdb5-43db-9291-318597c8c028\") " pod="openstack/keystone-db-create-lf25b" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.844871 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2ffae81b-589d-4502-a0a6-777b8d6f98b1-etc-swift\") pod \"swift-storage-0\" (UID: \"2ffae81b-589d-4502-a0a6-777b8d6f98b1\") " pod="openstack/swift-storage-0" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.844990 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/bae2737b-02b8-46f4-9762-842b44b6b506-operator-scripts\") pod \"keystone-2da3-account-create-update-jbb94\" (UID: \"bae2737b-02b8-46f4-9762-842b44b6b506\") " pod="openstack/keystone-2da3-account-create-update-jbb94" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.845029 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2hb4\" (UniqueName: \"kubernetes.io/projected/bae2737b-02b8-46f4-9762-842b44b6b506-kube-api-access-g2hb4\") pod \"keystone-2da3-account-create-update-jbb94\" (UID: \"bae2737b-02b8-46f4-9762-842b44b6b506\") " pod="openstack/keystone-2da3-account-create-update-jbb94" Mar 12 17:07:37 crc kubenswrapper[5184]: E0312 17:07:37.845149 5184 projected.go:289] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 12 17:07:37 crc kubenswrapper[5184]: E0312 17:07:37.845188 5184 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 12 17:07:37 crc kubenswrapper[5184]: E0312 17:07:37.845286 5184 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2ffae81b-589d-4502-a0a6-777b8d6f98b1-etc-swift podName:2ffae81b-589d-4502-a0a6-777b8d6f98b1 nodeName:}" failed. No retries permitted until 2026-03-12 17:07:45.845262534 +0000 UTC m=+1008.386573873 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2ffae81b-589d-4502-a0a6-777b8d6f98b1-etc-swift") pod "swift-storage-0" (UID: "2ffae81b-589d-4502-a0a6-777b8d6f98b1") : configmap "swift-ring-files" not found Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.845505 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/041f7579-fdb5-43db-9291-318597c8c028-operator-scripts\") pod \"keystone-db-create-lf25b\" (UID: \"041f7579-fdb5-43db-9291-318597c8c028\") " pod="openstack/keystone-db-create-lf25b" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.866158 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fzf9\" (UniqueName: \"kubernetes.io/projected/041f7579-fdb5-43db-9291-318597c8c028-kube-api-access-4fzf9\") pod \"keystone-db-create-lf25b\" (UID: \"041f7579-fdb5-43db-9291-318597c8c028\") " pod="openstack/keystone-db-create-lf25b" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.893024 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-rxnwh" event={"ID":"94bbecf2-f5e8-4513-a4c6-559d752aae55","Type":"ContainerDied","Data":"58e2b0fcf485ca097777cc574cd7b08a6d3006c4e91106a5f90def82ba38a550"} Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.893092 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58e2b0fcf485ca097777cc574cd7b08a6d3006c4e91106a5f90def82ba38a550" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.893040 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-rxnwh" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.895269 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-wn6cv" event={"ID":"fb057e73-a8a2-44b3-b709-54e14d97daef","Type":"ContainerDied","Data":"255d729147664da47150ee98c6012f8d668753a3cf9804b125b689562cb66e51"} Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.895302 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-wn6cv" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.895317 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="255d729147664da47150ee98c6012f8d668753a3cf9804b125b689562cb66e51" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.899237 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59bc98f85f-g4qsm"] Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.899631 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59bc98f85f-g4qsm" podUID="36632b13-d3b1-4c31-864c-c6f0d31cb057" containerName="dnsmasq-dns" containerID="cri-o://aef45cb057edf0fa8580f8416f9c96968ac3aabae1d01ef2630e1f91c9cff8bd" gracePeriod=10 Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.903578 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f7e4-account-create-update-jbqtf" event={"ID":"1b0585b6-5451-4f29-a11c-8d84143e3589","Type":"ContainerDied","Data":"0e9f878968e5b87aa311562dd7fe1763d89ea5c91f3d7af167f1e9cf4b3441de"} Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.903624 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e9f878968e5b87aa311562dd7fe1763d89ea5c91f3d7af167f1e9cf4b3441de" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.903592 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f7e4-account-create-update-jbqtf" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.946828 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bae2737b-02b8-46f4-9762-842b44b6b506-operator-scripts\") pod \"keystone-2da3-account-create-update-jbb94\" (UID: \"bae2737b-02b8-46f4-9762-842b44b6b506\") " pod="openstack/keystone-2da3-account-create-update-jbb94" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.946862 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g2hb4\" (UniqueName: \"kubernetes.io/projected/bae2737b-02b8-46f4-9762-842b44b6b506-kube-api-access-g2hb4\") pod \"keystone-2da3-account-create-update-jbb94\" (UID: \"bae2737b-02b8-46f4-9762-842b44b6b506\") " pod="openstack/keystone-2da3-account-create-update-jbb94" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.947632 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bae2737b-02b8-46f4-9762-842b44b6b506-operator-scripts\") pod \"keystone-2da3-account-create-update-jbb94\" (UID: \"bae2737b-02b8-46f4-9762-842b44b6b506\") " pod="openstack/keystone-2da3-account-create-update-jbb94" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.961075 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-sqldm"] Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.972963 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2hb4\" (UniqueName: \"kubernetes.io/projected/bae2737b-02b8-46f4-9762-842b44b6b506-kube-api-access-g2hb4\") pod \"keystone-2da3-account-create-update-jbb94\" (UID: \"bae2737b-02b8-46f4-9762-842b44b6b506\") " pod="openstack/keystone-2da3-account-create-update-jbb94" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.980609 5184 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-lf25b" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.981208 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-sqldm"] Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.981317 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-sqldm" Mar 12 17:07:37 crc kubenswrapper[5184]: I0312 17:07:37.992265 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/placement-c88c-account-create-update-zlssx"] Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.005870 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c88c-account-create-update-zlssx" Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.008973 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c88c-account-create-update-zlssx"] Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.009209 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"placement-db-secret\"" Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.049199 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdjp6\" (UniqueName: \"kubernetes.io/projected/afe07d76-6af2-408e-a77d-45434eaa4eb3-kube-api-access-fdjp6\") pod \"placement-db-create-sqldm\" (UID: \"afe07d76-6af2-408e-a77d-45434eaa4eb3\") " pod="openstack/placement-db-create-sqldm" Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.049351 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afe07d76-6af2-408e-a77d-45434eaa4eb3-operator-scripts\") pod \"placement-db-create-sqldm\" (UID: \"afe07d76-6af2-408e-a77d-45434eaa4eb3\") " pod="openstack/placement-db-create-sqldm" Mar 
12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.068745 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2da3-account-create-update-jbb94" Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.150888 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fdjp6\" (UniqueName: \"kubernetes.io/projected/afe07d76-6af2-408e-a77d-45434eaa4eb3-kube-api-access-fdjp6\") pod \"placement-db-create-sqldm\" (UID: \"afe07d76-6af2-408e-a77d-45434eaa4eb3\") " pod="openstack/placement-db-create-sqldm" Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.151319 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmwfw\" (UniqueName: \"kubernetes.io/projected/bd5d12b8-0fdd-4998-8dbd-8df30df4af5b-kube-api-access-tmwfw\") pod \"placement-c88c-account-create-update-zlssx\" (UID: \"bd5d12b8-0fdd-4998-8dbd-8df30df4af5b\") " pod="openstack/placement-c88c-account-create-update-zlssx" Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.151486 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd5d12b8-0fdd-4998-8dbd-8df30df4af5b-operator-scripts\") pod \"placement-c88c-account-create-update-zlssx\" (UID: \"bd5d12b8-0fdd-4998-8dbd-8df30df4af5b\") " pod="openstack/placement-c88c-account-create-update-zlssx" Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.151538 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afe07d76-6af2-408e-a77d-45434eaa4eb3-operator-scripts\") pod \"placement-db-create-sqldm\" (UID: \"afe07d76-6af2-408e-a77d-45434eaa4eb3\") " pod="openstack/placement-db-create-sqldm" Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.152388 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afe07d76-6af2-408e-a77d-45434eaa4eb3-operator-scripts\") pod \"placement-db-create-sqldm\" (UID: \"afe07d76-6af2-408e-a77d-45434eaa4eb3\") " pod="openstack/placement-db-create-sqldm" Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.172039 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdjp6\" (UniqueName: \"kubernetes.io/projected/afe07d76-6af2-408e-a77d-45434eaa4eb3-kube-api-access-fdjp6\") pod \"placement-db-create-sqldm\" (UID: \"afe07d76-6af2-408e-a77d-45434eaa4eb3\") " pod="openstack/placement-db-create-sqldm" Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.252929 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd5d12b8-0fdd-4998-8dbd-8df30df4af5b-operator-scripts\") pod \"placement-c88c-account-create-update-zlssx\" (UID: \"bd5d12b8-0fdd-4998-8dbd-8df30df4af5b\") " pod="openstack/placement-c88c-account-create-update-zlssx" Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.253160 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tmwfw\" (UniqueName: \"kubernetes.io/projected/bd5d12b8-0fdd-4998-8dbd-8df30df4af5b-kube-api-access-tmwfw\") pod \"placement-c88c-account-create-update-zlssx\" (UID: \"bd5d12b8-0fdd-4998-8dbd-8df30df4af5b\") " pod="openstack/placement-c88c-account-create-update-zlssx" Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.253804 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd5d12b8-0fdd-4998-8dbd-8df30df4af5b-operator-scripts\") pod \"placement-c88c-account-create-update-zlssx\" (UID: \"bd5d12b8-0fdd-4998-8dbd-8df30df4af5b\") " pod="openstack/placement-c88c-account-create-update-zlssx" Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.272616 5184 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tmwfw\" (UniqueName: \"kubernetes.io/projected/bd5d12b8-0fdd-4998-8dbd-8df30df4af5b-kube-api-access-tmwfw\") pod \"placement-c88c-account-create-update-zlssx\" (UID: \"bd5d12b8-0fdd-4998-8dbd-8df30df4af5b\") " pod="openstack/placement-c88c-account-create-update-zlssx" Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.340745 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-sqldm" Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.350703 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c88c-account-create-update-zlssx" Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.602707 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-lf25b"] Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.620211 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59bc98f85f-g4qsm" Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.702114 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-2da3-account-create-update-jbb94"] Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.751421 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzk77\" (UniqueName: \"kubernetes.io/projected/36632b13-d3b1-4c31-864c-c6f0d31cb057-kube-api-access-rzk77\") pod \"36632b13-d3b1-4c31-864c-c6f0d31cb057\" (UID: \"36632b13-d3b1-4c31-864c-c6f0d31cb057\") " Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.751487 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36632b13-d3b1-4c31-864c-c6f0d31cb057-config\") pod \"36632b13-d3b1-4c31-864c-c6f0d31cb057\" (UID: \"36632b13-d3b1-4c31-864c-c6f0d31cb057\") " Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 
17:07:38.751560 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36632b13-d3b1-4c31-864c-c6f0d31cb057-ovsdbserver-nb\") pod \"36632b13-d3b1-4c31-864c-c6f0d31cb057\" (UID: \"36632b13-d3b1-4c31-864c-c6f0d31cb057\") " Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.751631 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36632b13-d3b1-4c31-864c-c6f0d31cb057-dns-svc\") pod \"36632b13-d3b1-4c31-864c-c6f0d31cb057\" (UID: \"36632b13-d3b1-4c31-864c-c6f0d31cb057\") " Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.751716 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/36632b13-d3b1-4c31-864c-c6f0d31cb057-ovsdbserver-sb\") pod \"36632b13-d3b1-4c31-864c-c6f0d31cb057\" (UID: \"36632b13-d3b1-4c31-864c-c6f0d31cb057\") " Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.762023 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36632b13-d3b1-4c31-864c-c6f0d31cb057-kube-api-access-rzk77" (OuterVolumeSpecName: "kube-api-access-rzk77") pod "36632b13-d3b1-4c31-864c-c6f0d31cb057" (UID: "36632b13-d3b1-4c31-864c-c6f0d31cb057"). InnerVolumeSpecName "kube-api-access-rzk77". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.808782 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36632b13-d3b1-4c31-864c-c6f0d31cb057-config" (OuterVolumeSpecName: "config") pod "36632b13-d3b1-4c31-864c-c6f0d31cb057" (UID: "36632b13-d3b1-4c31-864c-c6f0d31cb057"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.810022 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36632b13-d3b1-4c31-864c-c6f0d31cb057-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "36632b13-d3b1-4c31-864c-c6f0d31cb057" (UID: "36632b13-d3b1-4c31-864c-c6f0d31cb057"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.820369 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36632b13-d3b1-4c31-864c-c6f0d31cb057-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "36632b13-d3b1-4c31-864c-c6f0d31cb057" (UID: "36632b13-d3b1-4c31-864c-c6f0d31cb057"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.844414 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36632b13-d3b1-4c31-864c-c6f0d31cb057-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "36632b13-d3b1-4c31-864c-c6f0d31cb057" (UID: "36632b13-d3b1-4c31-864c-c6f0d31cb057"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.854339 5184 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/36632b13-d3b1-4c31-864c-c6f0d31cb057-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.854369 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rzk77\" (UniqueName: \"kubernetes.io/projected/36632b13-d3b1-4c31-864c-c6f0d31cb057-kube-api-access-rzk77\") on node \"crc\" DevicePath \"\"" Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.854394 5184 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36632b13-d3b1-4c31-864c-c6f0d31cb057-config\") on node \"crc\" DevicePath \"\"" Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.854402 5184 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36632b13-d3b1-4c31-864c-c6f0d31cb057-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.854411 5184 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36632b13-d3b1-4c31-864c-c6f0d31cb057-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.855119 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.915890 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2da3-account-create-update-jbb94" event={"ID":"bae2737b-02b8-46f4-9762-842b44b6b506","Type":"ContainerStarted","Data":"7f3ba97631c02718650f117fbe9363093763eab313e7e0abec7245b2f9967333"} Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.915933 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-2da3-account-create-update-jbb94" event={"ID":"bae2737b-02b8-46f4-9762-842b44b6b506","Type":"ContainerStarted","Data":"6a1f7e0234e8b0352e945c1e2832ebdd584262f503722add25c607d1b2657114"} Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.920187 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-lf25b" event={"ID":"041f7579-fdb5-43db-9291-318597c8c028","Type":"ContainerStarted","Data":"f9dc3723272109ce6bcc336607537943677053bf8a8bf3eea7f134b8376919f0"} Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.920217 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-lf25b" event={"ID":"041f7579-fdb5-43db-9291-318597c8c028","Type":"ContainerStarted","Data":"898500b638f2a4199d7a84c50f4dc676e1f0c833d9eeb87adae1e8f7c43b8064"} Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.921390 5184 generic.go:358] "Generic (PLEG): container finished" podID="36632b13-d3b1-4c31-864c-c6f0d31cb057" containerID="aef45cb057edf0fa8580f8416f9c96968ac3aabae1d01ef2630e1f91c9cff8bd" exitCode=0 Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.921512 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59bc98f85f-g4qsm" event={"ID":"36632b13-d3b1-4c31-864c-c6f0d31cb057","Type":"ContainerDied","Data":"aef45cb057edf0fa8580f8416f9c96968ac3aabae1d01ef2630e1f91c9cff8bd"} Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.921536 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59bc98f85f-g4qsm" event={"ID":"36632b13-d3b1-4c31-864c-c6f0d31cb057","Type":"ContainerDied","Data":"4e1bbee7ea00f2c249b02abd4029ae9ec8fb5be4a223422fdccfe243eb0bdec8"} Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.921553 5184 scope.go:117] "RemoveContainer" containerID="aef45cb057edf0fa8580f8416f9c96968ac3aabae1d01ef2630e1f91c9cff8bd" Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.921655 5184 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59bc98f85f-g4qsm" Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.930195 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-2da3-account-create-update-jbb94" podStartSLOduration=1.930181537 podStartE2EDuration="1.930181537s" podCreationTimestamp="2026-03-12 17:07:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:07:38.927223844 +0000 UTC m=+1001.468535183" watchObservedRunningTime="2026-03-12 17:07:38.930181537 +0000 UTC m=+1001.471492866" Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.957844 5184 scope.go:117] "RemoveContainer" containerID="84e23f44a7e6a34d784b835173fc27b8f3b21ed3f32e23ed869f65d599a912d6" Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.958248 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-lf25b" podStartSLOduration=1.95823115 podStartE2EDuration="1.95823115s" podCreationTimestamp="2026-03-12 17:07:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:07:38.946343586 +0000 UTC m=+1001.487654945" watchObservedRunningTime="2026-03-12 17:07:38.95823115 +0000 UTC m=+1001.499542489" Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.973570 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59bc98f85f-g4qsm"] Mar 12 17:07:38 crc kubenswrapper[5184]: I0312 17:07:38.990537 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59bc98f85f-g4qsm"] Mar 12 17:07:39 crc kubenswrapper[5184]: I0312 17:07:39.000043 5184 scope.go:117] "RemoveContainer" containerID="aef45cb057edf0fa8580f8416f9c96968ac3aabae1d01ef2630e1f91c9cff8bd" Mar 12 17:07:39 crc kubenswrapper[5184]: E0312 17:07:39.000541 5184 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aef45cb057edf0fa8580f8416f9c96968ac3aabae1d01ef2630e1f91c9cff8bd\": container with ID starting with aef45cb057edf0fa8580f8416f9c96968ac3aabae1d01ef2630e1f91c9cff8bd not found: ID does not exist" containerID="aef45cb057edf0fa8580f8416f9c96968ac3aabae1d01ef2630e1f91c9cff8bd" Mar 12 17:07:39 crc kubenswrapper[5184]: I0312 17:07:39.000586 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aef45cb057edf0fa8580f8416f9c96968ac3aabae1d01ef2630e1f91c9cff8bd"} err="failed to get container status \"aef45cb057edf0fa8580f8416f9c96968ac3aabae1d01ef2630e1f91c9cff8bd\": rpc error: code = NotFound desc = could not find container \"aef45cb057edf0fa8580f8416f9c96968ac3aabae1d01ef2630e1f91c9cff8bd\": container with ID starting with aef45cb057edf0fa8580f8416f9c96968ac3aabae1d01ef2630e1f91c9cff8bd not found: ID does not exist" Mar 12 17:07:39 crc kubenswrapper[5184]: I0312 17:07:39.000615 5184 scope.go:117] "RemoveContainer" containerID="84e23f44a7e6a34d784b835173fc27b8f3b21ed3f32e23ed869f65d599a912d6" Mar 12 17:07:39 crc kubenswrapper[5184]: E0312 17:07:39.001090 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84e23f44a7e6a34d784b835173fc27b8f3b21ed3f32e23ed869f65d599a912d6\": container with ID starting with 84e23f44a7e6a34d784b835173fc27b8f3b21ed3f32e23ed869f65d599a912d6 not found: ID does not exist" containerID="84e23f44a7e6a34d784b835173fc27b8f3b21ed3f32e23ed869f65d599a912d6" Mar 12 17:07:39 crc kubenswrapper[5184]: I0312 17:07:39.001122 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84e23f44a7e6a34d784b835173fc27b8f3b21ed3f32e23ed869f65d599a912d6"} err="failed to get container status \"84e23f44a7e6a34d784b835173fc27b8f3b21ed3f32e23ed869f65d599a912d6\": rpc error: code = NotFound desc = could 
not find container \"84e23f44a7e6a34d784b835173fc27b8f3b21ed3f32e23ed869f65d599a912d6\": container with ID starting with 84e23f44a7e6a34d784b835173fc27b8f3b21ed3f32e23ed869f65d599a912d6 not found: ID does not exist" Mar 12 17:07:39 crc kubenswrapper[5184]: W0312 17:07:39.004976 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafe07d76_6af2_408e_a77d_45434eaa4eb3.slice/crio-cfe8ce4f750fdff8bf2ea14dfa072cdb474445f1f35347ca3e859e3ddf352ae8 WatchSource:0}: Error finding container cfe8ce4f750fdff8bf2ea14dfa072cdb474445f1f35347ca3e859e3ddf352ae8: Status 404 returned error can't find the container with id cfe8ce4f750fdff8bf2ea14dfa072cdb474445f1f35347ca3e859e3ddf352ae8 Mar 12 17:07:39 crc kubenswrapper[5184]: I0312 17:07:39.009826 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-sqldm"] Mar 12 17:07:39 crc kubenswrapper[5184]: I0312 17:07:39.071358 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c88c-account-create-update-zlssx"] Mar 12 17:07:39 crc kubenswrapper[5184]: W0312 17:07:39.085226 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd5d12b8_0fdd_4998_8dbd_8df30df4af5b.slice/crio-3213f8be5a699690beb762e0d894855751da12abfc9d875f2a3a33a10346f0a8 WatchSource:0}: Error finding container 3213f8be5a699690beb762e0d894855751da12abfc9d875f2a3a33a10346f0a8: Status 404 returned error can't find the container with id 3213f8be5a699690beb762e0d894855751da12abfc9d875f2a3a33a10346f0a8 Mar 12 17:07:39 crc kubenswrapper[5184]: I0312 17:07:39.930204 5184 generic.go:358] "Generic (PLEG): container finished" podID="afe07d76-6af2-408e-a77d-45434eaa4eb3" containerID="cbaa56a359b46b638e5969f84cd63453e5da865d262bd0a4bdb74e47936991a9" exitCode=0 Mar 12 17:07:39 crc kubenswrapper[5184]: I0312 17:07:39.931121 5184 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openstack/placement-db-create-sqldm" event={"ID":"afe07d76-6af2-408e-a77d-45434eaa4eb3","Type":"ContainerDied","Data":"cbaa56a359b46b638e5969f84cd63453e5da865d262bd0a4bdb74e47936991a9"} Mar 12 17:07:39 crc kubenswrapper[5184]: I0312 17:07:39.931152 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-sqldm" event={"ID":"afe07d76-6af2-408e-a77d-45434eaa4eb3","Type":"ContainerStarted","Data":"cfe8ce4f750fdff8bf2ea14dfa072cdb474445f1f35347ca3e859e3ddf352ae8"} Mar 12 17:07:39 crc kubenswrapper[5184]: I0312 17:07:39.934871 5184 generic.go:358] "Generic (PLEG): container finished" podID="bae2737b-02b8-46f4-9762-842b44b6b506" containerID="7f3ba97631c02718650f117fbe9363093763eab313e7e0abec7245b2f9967333" exitCode=0 Mar 12 17:07:39 crc kubenswrapper[5184]: I0312 17:07:39.935023 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2da3-account-create-update-jbb94" event={"ID":"bae2737b-02b8-46f4-9762-842b44b6b506","Type":"ContainerDied","Data":"7f3ba97631c02718650f117fbe9363093763eab313e7e0abec7245b2f9967333"} Mar 12 17:07:39 crc kubenswrapper[5184]: I0312 17:07:39.936928 5184 generic.go:358] "Generic (PLEG): container finished" podID="041f7579-fdb5-43db-9291-318597c8c028" containerID="f9dc3723272109ce6bcc336607537943677053bf8a8bf3eea7f134b8376919f0" exitCode=0 Mar 12 17:07:39 crc kubenswrapper[5184]: I0312 17:07:39.937056 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-lf25b" event={"ID":"041f7579-fdb5-43db-9291-318597c8c028","Type":"ContainerDied","Data":"f9dc3723272109ce6bcc336607537943677053bf8a8bf3eea7f134b8376919f0"} Mar 12 17:07:39 crc kubenswrapper[5184]: I0312 17:07:39.949676 5184 generic.go:358] "Generic (PLEG): container finished" podID="bd5d12b8-0fdd-4998-8dbd-8df30df4af5b" containerID="4c88d94c3fe4a0721c9dcd52752ba963d089a7f6e91dc5443cb1da06f3d62a22" exitCode=0 Mar 12 17:07:39 crc kubenswrapper[5184]: I0312 17:07:39.949746 5184 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c88c-account-create-update-zlssx" event={"ID":"bd5d12b8-0fdd-4998-8dbd-8df30df4af5b","Type":"ContainerDied","Data":"4c88d94c3fe4a0721c9dcd52752ba963d089a7f6e91dc5443cb1da06f3d62a22"} Mar 12 17:07:39 crc kubenswrapper[5184]: I0312 17:07:39.949770 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c88c-account-create-update-zlssx" event={"ID":"bd5d12b8-0fdd-4998-8dbd-8df30df4af5b","Type":"ContainerStarted","Data":"3213f8be5a699690beb762e0d894855751da12abfc9d875f2a3a33a10346f0a8"} Mar 12 17:07:40 crc kubenswrapper[5184]: I0312 17:07:40.095324 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-wn6cv"] Mar 12 17:07:40 crc kubenswrapper[5184]: I0312 17:07:40.109035 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-wn6cv"] Mar 12 17:07:40 crc kubenswrapper[5184]: I0312 17:07:40.408494 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36632b13-d3b1-4c31-864c-c6f0d31cb057" path="/var/lib/kubelet/pods/36632b13-d3b1-4c31-864c-c6f0d31cb057/volumes" Mar 12 17:07:40 crc kubenswrapper[5184]: I0312 17:07:40.409210 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb057e73-a8a2-44b3-b709-54e14d97daef" path="/var/lib/kubelet/pods/fb057e73-a8a2-44b3-b709-54e14d97daef/volumes" Mar 12 17:07:41 crc kubenswrapper[5184]: I0312 17:07:41.461018 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-sqldm" Mar 12 17:07:41 crc kubenswrapper[5184]: I0312 17:07:41.505053 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afe07d76-6af2-408e-a77d-45434eaa4eb3-operator-scripts\") pod \"afe07d76-6af2-408e-a77d-45434eaa4eb3\" (UID: \"afe07d76-6af2-408e-a77d-45434eaa4eb3\") " Mar 12 17:07:41 crc kubenswrapper[5184]: I0312 17:07:41.505146 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdjp6\" (UniqueName: \"kubernetes.io/projected/afe07d76-6af2-408e-a77d-45434eaa4eb3-kube-api-access-fdjp6\") pod \"afe07d76-6af2-408e-a77d-45434eaa4eb3\" (UID: \"afe07d76-6af2-408e-a77d-45434eaa4eb3\") " Mar 12 17:07:41 crc kubenswrapper[5184]: I0312 17:07:41.506514 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afe07d76-6af2-408e-a77d-45434eaa4eb3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "afe07d76-6af2-408e-a77d-45434eaa4eb3" (UID: "afe07d76-6af2-408e-a77d-45434eaa4eb3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:07:41 crc kubenswrapper[5184]: I0312 17:07:41.512204 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afe07d76-6af2-408e-a77d-45434eaa4eb3-kube-api-access-fdjp6" (OuterVolumeSpecName: "kube-api-access-fdjp6") pod "afe07d76-6af2-408e-a77d-45434eaa4eb3" (UID: "afe07d76-6af2-408e-a77d-45434eaa4eb3"). InnerVolumeSpecName "kube-api-access-fdjp6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:07:41 crc kubenswrapper[5184]: I0312 17:07:41.558294 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-2da3-account-create-update-jbb94" Mar 12 17:07:41 crc kubenswrapper[5184]: I0312 17:07:41.564806 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-lf25b" Mar 12 17:07:41 crc kubenswrapper[5184]: I0312 17:07:41.576260 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c88c-account-create-update-zlssx" Mar 12 17:07:41 crc kubenswrapper[5184]: I0312 17:07:41.610206 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bae2737b-02b8-46f4-9762-842b44b6b506-operator-scripts\") pod \"bae2737b-02b8-46f4-9762-842b44b6b506\" (UID: \"bae2737b-02b8-46f4-9762-842b44b6b506\") " Mar 12 17:07:41 crc kubenswrapper[5184]: I0312 17:07:41.610559 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/041f7579-fdb5-43db-9291-318597c8c028-operator-scripts\") pod \"041f7579-fdb5-43db-9291-318597c8c028\" (UID: \"041f7579-fdb5-43db-9291-318597c8c028\") " Mar 12 17:07:41 crc kubenswrapper[5184]: I0312 17:07:41.610825 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2hb4\" (UniqueName: \"kubernetes.io/projected/bae2737b-02b8-46f4-9762-842b44b6b506-kube-api-access-g2hb4\") pod \"bae2737b-02b8-46f4-9762-842b44b6b506\" (UID: \"bae2737b-02b8-46f4-9762-842b44b6b506\") " Mar 12 17:07:41 crc kubenswrapper[5184]: I0312 17:07:41.610856 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmwfw\" (UniqueName: \"kubernetes.io/projected/bd5d12b8-0fdd-4998-8dbd-8df30df4af5b-kube-api-access-tmwfw\") pod \"bd5d12b8-0fdd-4998-8dbd-8df30df4af5b\" (UID: \"bd5d12b8-0fdd-4998-8dbd-8df30df4af5b\") " Mar 12 17:07:41 crc kubenswrapper[5184]: I0312 17:07:41.610953 5184 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd5d12b8-0fdd-4998-8dbd-8df30df4af5b-operator-scripts\") pod \"bd5d12b8-0fdd-4998-8dbd-8df30df4af5b\" (UID: \"bd5d12b8-0fdd-4998-8dbd-8df30df4af5b\") " Mar 12 17:07:41 crc kubenswrapper[5184]: I0312 17:07:41.610981 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fzf9\" (UniqueName: \"kubernetes.io/projected/041f7579-fdb5-43db-9291-318597c8c028-kube-api-access-4fzf9\") pod \"041f7579-fdb5-43db-9291-318597c8c028\" (UID: \"041f7579-fdb5-43db-9291-318597c8c028\") " Mar 12 17:07:41 crc kubenswrapper[5184]: I0312 17:07:41.611087 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bae2737b-02b8-46f4-9762-842b44b6b506-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bae2737b-02b8-46f4-9762-842b44b6b506" (UID: "bae2737b-02b8-46f4-9762-842b44b6b506"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:07:41 crc kubenswrapper[5184]: I0312 17:07:41.611196 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/041f7579-fdb5-43db-9291-318597c8c028-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "041f7579-fdb5-43db-9291-318597c8c028" (UID: "041f7579-fdb5-43db-9291-318597c8c028"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:07:41 crc kubenswrapper[5184]: I0312 17:07:41.611816 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fdjp6\" (UniqueName: \"kubernetes.io/projected/afe07d76-6af2-408e-a77d-45434eaa4eb3-kube-api-access-fdjp6\") on node \"crc\" DevicePath \"\"" Mar 12 17:07:41 crc kubenswrapper[5184]: I0312 17:07:41.611846 5184 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bae2737b-02b8-46f4-9762-842b44b6b506-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 17:07:41 crc kubenswrapper[5184]: I0312 17:07:41.611860 5184 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/041f7579-fdb5-43db-9291-318597c8c028-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 17:07:41 crc kubenswrapper[5184]: I0312 17:07:41.611872 5184 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afe07d76-6af2-408e-a77d-45434eaa4eb3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 17:07:41 crc kubenswrapper[5184]: I0312 17:07:41.615886 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bae2737b-02b8-46f4-9762-842b44b6b506-kube-api-access-g2hb4" (OuterVolumeSpecName: "kube-api-access-g2hb4") pod "bae2737b-02b8-46f4-9762-842b44b6b506" (UID: "bae2737b-02b8-46f4-9762-842b44b6b506"). InnerVolumeSpecName "kube-api-access-g2hb4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:07:41 crc kubenswrapper[5184]: I0312 17:07:41.616286 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd5d12b8-0fdd-4998-8dbd-8df30df4af5b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bd5d12b8-0fdd-4998-8dbd-8df30df4af5b" (UID: "bd5d12b8-0fdd-4998-8dbd-8df30df4af5b"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:07:41 crc kubenswrapper[5184]: I0312 17:07:41.618672 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd5d12b8-0fdd-4998-8dbd-8df30df4af5b-kube-api-access-tmwfw" (OuterVolumeSpecName: "kube-api-access-tmwfw") pod "bd5d12b8-0fdd-4998-8dbd-8df30df4af5b" (UID: "bd5d12b8-0fdd-4998-8dbd-8df30df4af5b"). InnerVolumeSpecName "kube-api-access-tmwfw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:07:41 crc kubenswrapper[5184]: I0312 17:07:41.630954 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/041f7579-fdb5-43db-9291-318597c8c028-kube-api-access-4fzf9" (OuterVolumeSpecName: "kube-api-access-4fzf9") pod "041f7579-fdb5-43db-9291-318597c8c028" (UID: "041f7579-fdb5-43db-9291-318597c8c028"). InnerVolumeSpecName "kube-api-access-4fzf9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:07:41 crc kubenswrapper[5184]: I0312 17:07:41.714853 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g2hb4\" (UniqueName: \"kubernetes.io/projected/bae2737b-02b8-46f4-9762-842b44b6b506-kube-api-access-g2hb4\") on node \"crc\" DevicePath \"\"" Mar 12 17:07:41 crc kubenswrapper[5184]: I0312 17:07:41.714900 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tmwfw\" (UniqueName: \"kubernetes.io/projected/bd5d12b8-0fdd-4998-8dbd-8df30df4af5b-kube-api-access-tmwfw\") on node \"crc\" DevicePath \"\"" Mar 12 17:07:41 crc kubenswrapper[5184]: I0312 17:07:41.714912 5184 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd5d12b8-0fdd-4998-8dbd-8df30df4af5b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 17:07:41 crc kubenswrapper[5184]: I0312 17:07:41.714924 5184 reconciler_common.go:299] "Volume detached for volume 
\"kube-api-access-4fzf9\" (UniqueName: \"kubernetes.io/projected/041f7579-fdb5-43db-9291-318597c8c028-kube-api-access-4fzf9\") on node \"crc\" DevicePath \"\"" Mar 12 17:07:41 crc kubenswrapper[5184]: I0312 17:07:41.976321 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c88c-account-create-update-zlssx" Mar 12 17:07:41 crc kubenswrapper[5184]: I0312 17:07:41.976343 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c88c-account-create-update-zlssx" event={"ID":"bd5d12b8-0fdd-4998-8dbd-8df30df4af5b","Type":"ContainerDied","Data":"3213f8be5a699690beb762e0d894855751da12abfc9d875f2a3a33a10346f0a8"} Mar 12 17:07:41 crc kubenswrapper[5184]: I0312 17:07:41.976589 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3213f8be5a699690beb762e0d894855751da12abfc9d875f2a3a33a10346f0a8" Mar 12 17:07:41 crc kubenswrapper[5184]: I0312 17:07:41.985470 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-sqldm" event={"ID":"afe07d76-6af2-408e-a77d-45434eaa4eb3","Type":"ContainerDied","Data":"cfe8ce4f750fdff8bf2ea14dfa072cdb474445f1f35347ca3e859e3ddf352ae8"} Mar 12 17:07:41 crc kubenswrapper[5184]: I0312 17:07:41.985974 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cfe8ce4f750fdff8bf2ea14dfa072cdb474445f1f35347ca3e859e3ddf352ae8" Mar 12 17:07:41 crc kubenswrapper[5184]: I0312 17:07:41.986123 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-sqldm" Mar 12 17:07:41 crc kubenswrapper[5184]: I0312 17:07:41.991038 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2da3-account-create-update-jbb94" event={"ID":"bae2737b-02b8-46f4-9762-842b44b6b506","Type":"ContainerDied","Data":"6a1f7e0234e8b0352e945c1e2832ebdd584262f503722add25c607d1b2657114"} Mar 12 17:07:41 crc kubenswrapper[5184]: I0312 17:07:41.991089 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a1f7e0234e8b0352e945c1e2832ebdd584262f503722add25c607d1b2657114" Mar 12 17:07:41 crc kubenswrapper[5184]: I0312 17:07:41.991168 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2da3-account-create-update-jbb94" Mar 12 17:07:41 crc kubenswrapper[5184]: I0312 17:07:41.995978 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-lf25b" event={"ID":"041f7579-fdb5-43db-9291-318597c8c028","Type":"ContainerDied","Data":"898500b638f2a4199d7a84c50f4dc676e1f0c833d9eeb87adae1e8f7c43b8064"} Mar 12 17:07:41 crc kubenswrapper[5184]: I0312 17:07:41.996076 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="898500b638f2a4199d7a84c50f4dc676e1f0c833d9eeb87adae1e8f7c43b8064" Mar 12 17:07:41 crc kubenswrapper[5184]: I0312 17:07:41.996013 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-lf25b" Mar 12 17:07:42 crc kubenswrapper[5184]: I0312 17:07:42.003799 5184 generic.go:358] "Generic (PLEG): container finished" podID="0353bd4c-727d-4c46-8954-29b25872ba5a" containerID="2799d7016e0cf865a8dea115fe19bd83b26ddae4191f174bc063251d7e9cdb7c" exitCode=0 Mar 12 17:07:42 crc kubenswrapper[5184]: I0312 17:07:42.003906 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-nwctk" event={"ID":"0353bd4c-727d-4c46-8954-29b25872ba5a","Type":"ContainerDied","Data":"2799d7016e0cf865a8dea115fe19bd83b26ddae4191f174bc063251d7e9cdb7c"} Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.292515 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-7fxbs"] Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.293572 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bd5d12b8-0fdd-4998-8dbd-8df30df4af5b" containerName="mariadb-account-create-update" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.293593 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd5d12b8-0fdd-4998-8dbd-8df30df4af5b" containerName="mariadb-account-create-update" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.293613 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="041f7579-fdb5-43db-9291-318597c8c028" containerName="mariadb-database-create" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.293621 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="041f7579-fdb5-43db-9291-318597c8c028" containerName="mariadb-database-create" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.293633 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="36632b13-d3b1-4c31-864c-c6f0d31cb057" containerName="dnsmasq-dns" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.293640 5184 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="36632b13-d3b1-4c31-864c-c6f0d31cb057" containerName="dnsmasq-dns" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.293661 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="36632b13-d3b1-4c31-864c-c6f0d31cb057" containerName="init" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.293668 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="36632b13-d3b1-4c31-864c-c6f0d31cb057" containerName="init" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.293687 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bae2737b-02b8-46f4-9762-842b44b6b506" containerName="mariadb-account-create-update" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.293695 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="bae2737b-02b8-46f4-9762-842b44b6b506" containerName="mariadb-account-create-update" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.293721 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="afe07d76-6af2-408e-a77d-45434eaa4eb3" containerName="mariadb-database-create" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.293728 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="afe07d76-6af2-408e-a77d-45434eaa4eb3" containerName="mariadb-database-create" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.293886 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="afe07d76-6af2-408e-a77d-45434eaa4eb3" containerName="mariadb-database-create" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.293897 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="bd5d12b8-0fdd-4998-8dbd-8df30df4af5b" containerName="mariadb-account-create-update" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.293914 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="36632b13-d3b1-4c31-864c-c6f0d31cb057" containerName="dnsmasq-dns" Mar 12 17:07:43 crc kubenswrapper[5184]: 
I0312 17:07:43.293927 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="bae2737b-02b8-46f4-9762-842b44b6b506" containerName="mariadb-account-create-update" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.293937 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="041f7579-fdb5-43db-9291-318597c8c028" containerName="mariadb-database-create" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.369257 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-nwctk" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.447961 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0353bd4c-727d-4c46-8954-29b25872ba5a-dispersionconf\") pod \"0353bd4c-727d-4c46-8954-29b25872ba5a\" (UID: \"0353bd4c-727d-4c46-8954-29b25872ba5a\") " Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.448090 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0353bd4c-727d-4c46-8954-29b25872ba5a-etc-swift\") pod \"0353bd4c-727d-4c46-8954-29b25872ba5a\" (UID: \"0353bd4c-727d-4c46-8954-29b25872ba5a\") " Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.448139 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0353bd4c-727d-4c46-8954-29b25872ba5a-ring-data-devices\") pod \"0353bd4c-727d-4c46-8954-29b25872ba5a\" (UID: \"0353bd4c-727d-4c46-8954-29b25872ba5a\") " Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.448198 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0353bd4c-727d-4c46-8954-29b25872ba5a-swiftconf\") pod \"0353bd4c-727d-4c46-8954-29b25872ba5a\" (UID: \"0353bd4c-727d-4c46-8954-29b25872ba5a\") " Mar 12 17:07:43 crc 
kubenswrapper[5184]: I0312 17:07:43.448227 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njk6f\" (UniqueName: \"kubernetes.io/projected/0353bd4c-727d-4c46-8954-29b25872ba5a-kube-api-access-njk6f\") pod \"0353bd4c-727d-4c46-8954-29b25872ba5a\" (UID: \"0353bd4c-727d-4c46-8954-29b25872ba5a\") " Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.448276 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0353bd4c-727d-4c46-8954-29b25872ba5a-scripts\") pod \"0353bd4c-727d-4c46-8954-29b25872ba5a\" (UID: \"0353bd4c-727d-4c46-8954-29b25872ba5a\") " Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.448498 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0353bd4c-727d-4c46-8954-29b25872ba5a-combined-ca-bundle\") pod \"0353bd4c-727d-4c46-8954-29b25872ba5a\" (UID: \"0353bd4c-727d-4c46-8954-29b25872ba5a\") " Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.450704 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0353bd4c-727d-4c46-8954-29b25872ba5a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "0353bd4c-727d-4c46-8954-29b25872ba5a" (UID: "0353bd4c-727d-4c46-8954-29b25872ba5a"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.451403 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0353bd4c-727d-4c46-8954-29b25872ba5a-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "0353bd4c-727d-4c46-8954-29b25872ba5a" (UID: "0353bd4c-727d-4c46-8954-29b25872ba5a"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.454283 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0353bd4c-727d-4c46-8954-29b25872ba5a-kube-api-access-njk6f" (OuterVolumeSpecName: "kube-api-access-njk6f") pod "0353bd4c-727d-4c46-8954-29b25872ba5a" (UID: "0353bd4c-727d-4c46-8954-29b25872ba5a"). InnerVolumeSpecName "kube-api-access-njk6f". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.459479 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0353bd4c-727d-4c46-8954-29b25872ba5a-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "0353bd4c-727d-4c46-8954-29b25872ba5a" (UID: "0353bd4c-727d-4c46-8954-29b25872ba5a"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.469314 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-7fxbs"] Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.469704 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-7fxbs" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.472009 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0353bd4c-727d-4c46-8954-29b25872ba5a-scripts" (OuterVolumeSpecName: "scripts") pod "0353bd4c-727d-4c46-8954-29b25872ba5a" (UID: "0353bd4c-727d-4c46-8954-29b25872ba5a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.473979 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"glance-config-data\"" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.474645 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"glance-glance-dockercfg-kvq4j\"" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.480804 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0353bd4c-727d-4c46-8954-29b25872ba5a-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "0353bd4c-727d-4c46-8954-29b25872ba5a" (UID: "0353bd4c-727d-4c46-8954-29b25872ba5a"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.492034 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0353bd4c-727d-4c46-8954-29b25872ba5a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0353bd4c-727d-4c46-8954-29b25872ba5a" (UID: "0353bd4c-727d-4c46-8954-29b25872ba5a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.550504 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2f4b74dc-78a2-4b8c-8b52-cb972e894961-db-sync-config-data\") pod \"glance-db-sync-7fxbs\" (UID: \"2f4b74dc-78a2-4b8c-8b52-cb972e894961\") " pod="openstack/glance-db-sync-7fxbs" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.550561 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f4b74dc-78a2-4b8c-8b52-cb972e894961-combined-ca-bundle\") pod \"glance-db-sync-7fxbs\" (UID: \"2f4b74dc-78a2-4b8c-8b52-cb972e894961\") " pod="openstack/glance-db-sync-7fxbs" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.550591 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f4b74dc-78a2-4b8c-8b52-cb972e894961-config-data\") pod \"glance-db-sync-7fxbs\" (UID: \"2f4b74dc-78a2-4b8c-8b52-cb972e894961\") " pod="openstack/glance-db-sync-7fxbs" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.550807 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz6ln\" (UniqueName: \"kubernetes.io/projected/2f4b74dc-78a2-4b8c-8b52-cb972e894961-kube-api-access-hz6ln\") pod \"glance-db-sync-7fxbs\" (UID: \"2f4b74dc-78a2-4b8c-8b52-cb972e894961\") " pod="openstack/glance-db-sync-7fxbs" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.551019 5184 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0353bd4c-727d-4c46-8954-29b25872ba5a-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.551033 5184 reconciler_common.go:299] "Volume 
detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0353bd4c-727d-4c46-8954-29b25872ba5a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.551043 5184 reconciler_common.go:299] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0353bd4c-727d-4c46-8954-29b25872ba5a-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.551053 5184 reconciler_common.go:299] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0353bd4c-727d-4c46-8954-29b25872ba5a-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.551061 5184 reconciler_common.go:299] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0353bd4c-727d-4c46-8954-29b25872ba5a-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.551069 5184 reconciler_common.go:299] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0353bd4c-727d-4c46-8954-29b25872ba5a-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.551076 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-njk6f\" (UniqueName: \"kubernetes.io/projected/0353bd4c-727d-4c46-8954-29b25872ba5a-kube-api-access-njk6f\") on node \"crc\" DevicePath \"\"" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.652854 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f4b74dc-78a2-4b8c-8b52-cb972e894961-config-data\") pod \"glance-db-sync-7fxbs\" (UID: \"2f4b74dc-78a2-4b8c-8b52-cb972e894961\") " pod="openstack/glance-db-sync-7fxbs" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.652942 5184 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-hz6ln\" (UniqueName: \"kubernetes.io/projected/2f4b74dc-78a2-4b8c-8b52-cb972e894961-kube-api-access-hz6ln\") pod \"glance-db-sync-7fxbs\" (UID: \"2f4b74dc-78a2-4b8c-8b52-cb972e894961\") " pod="openstack/glance-db-sync-7fxbs" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.653031 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2f4b74dc-78a2-4b8c-8b52-cb972e894961-db-sync-config-data\") pod \"glance-db-sync-7fxbs\" (UID: \"2f4b74dc-78a2-4b8c-8b52-cb972e894961\") " pod="openstack/glance-db-sync-7fxbs" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.653055 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f4b74dc-78a2-4b8c-8b52-cb972e894961-combined-ca-bundle\") pod \"glance-db-sync-7fxbs\" (UID: \"2f4b74dc-78a2-4b8c-8b52-cb972e894961\") " pod="openstack/glance-db-sync-7fxbs" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.657767 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f4b74dc-78a2-4b8c-8b52-cb972e894961-combined-ca-bundle\") pod \"glance-db-sync-7fxbs\" (UID: \"2f4b74dc-78a2-4b8c-8b52-cb972e894961\") " pod="openstack/glance-db-sync-7fxbs" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.657862 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2f4b74dc-78a2-4b8c-8b52-cb972e894961-db-sync-config-data\") pod \"glance-db-sync-7fxbs\" (UID: \"2f4b74dc-78a2-4b8c-8b52-cb972e894961\") " pod="openstack/glance-db-sync-7fxbs" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.660247 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2f4b74dc-78a2-4b8c-8b52-cb972e894961-config-data\") pod \"glance-db-sync-7fxbs\" (UID: \"2f4b74dc-78a2-4b8c-8b52-cb972e894961\") " pod="openstack/glance-db-sync-7fxbs" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.683608 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz6ln\" (UniqueName: \"kubernetes.io/projected/2f4b74dc-78a2-4b8c-8b52-cb972e894961-kube-api-access-hz6ln\") pod \"glance-db-sync-7fxbs\" (UID: \"2f4b74dc-78a2-4b8c-8b52-cb972e894961\") " pod="openstack/glance-db-sync-7fxbs" Mar 12 17:07:43 crc kubenswrapper[5184]: I0312 17:07:43.858990 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-7fxbs" Mar 12 17:07:44 crc kubenswrapper[5184]: I0312 17:07:44.037001 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-nwctk" Mar 12 17:07:44 crc kubenswrapper[5184]: I0312 17:07:44.037673 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-nwctk" event={"ID":"0353bd4c-727d-4c46-8954-29b25872ba5a","Type":"ContainerDied","Data":"d00f481b6a403aa2e080e729d3852859fa546414df454941765f92068faef8ee"} Mar 12 17:07:44 crc kubenswrapper[5184]: I0312 17:07:44.037708 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d00f481b6a403aa2e080e729d3852859fa546414df454941765f92068faef8ee" Mar 12 17:07:44 crc kubenswrapper[5184]: I0312 17:07:44.450186 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-7fxbs"] Mar 12 17:07:45 crc kubenswrapper[5184]: I0312 17:07:45.047120 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-7fxbs" event={"ID":"2f4b74dc-78a2-4b8c-8b52-cb972e894961","Type":"ContainerStarted","Data":"4514b87ab42560bca1cb8057ec6b8e6888c8365e7b8f1271c699a248bb073ae7"} Mar 12 17:07:45 crc kubenswrapper[5184]: I0312 17:07:45.120993 5184 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-nwvrf"]
Mar 12 17:07:45 crc kubenswrapper[5184]: I0312 17:07:45.122843 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0353bd4c-727d-4c46-8954-29b25872ba5a" containerName="swift-ring-rebalance"
Mar 12 17:07:45 crc kubenswrapper[5184]: I0312 17:07:45.122882 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="0353bd4c-727d-4c46-8954-29b25872ba5a" containerName="swift-ring-rebalance"
Mar 12 17:07:45 crc kubenswrapper[5184]: I0312 17:07:45.123197 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="0353bd4c-727d-4c46-8954-29b25872ba5a" containerName="swift-ring-rebalance"
Mar 12 17:07:45 crc kubenswrapper[5184]: I0312 17:07:45.134738 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-nwvrf"
Mar 12 17:07:45 crc kubenswrapper[5184]: I0312 17:07:45.135926 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-nwvrf"]
Mar 12 17:07:45 crc kubenswrapper[5184]: I0312 17:07:45.136709 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"openstack-cell1-mariadb-root-db-secret\""
Mar 12 17:07:45 crc kubenswrapper[5184]: I0312 17:07:45.184088 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw2dr\" (UniqueName: \"kubernetes.io/projected/d85d1216-e8c4-45ea-8e85-bf33cece093c-kube-api-access-jw2dr\") pod \"root-account-create-update-nwvrf\" (UID: \"d85d1216-e8c4-45ea-8e85-bf33cece093c\") " pod="openstack/root-account-create-update-nwvrf"
Mar 12 17:07:45 crc kubenswrapper[5184]: I0312 17:07:45.184167 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d85d1216-e8c4-45ea-8e85-bf33cece093c-operator-scripts\") pod \"root-account-create-update-nwvrf\" (UID: \"d85d1216-e8c4-45ea-8e85-bf33cece093c\") " pod="openstack/root-account-create-update-nwvrf"
Mar 12 17:07:45 crc kubenswrapper[5184]: I0312 17:07:45.287691 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jw2dr\" (UniqueName: \"kubernetes.io/projected/d85d1216-e8c4-45ea-8e85-bf33cece093c-kube-api-access-jw2dr\") pod \"root-account-create-update-nwvrf\" (UID: \"d85d1216-e8c4-45ea-8e85-bf33cece093c\") " pod="openstack/root-account-create-update-nwvrf"
Mar 12 17:07:45 crc kubenswrapper[5184]: I0312 17:07:45.287806 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d85d1216-e8c4-45ea-8e85-bf33cece093c-operator-scripts\") pod \"root-account-create-update-nwvrf\" (UID: \"d85d1216-e8c4-45ea-8e85-bf33cece093c\") " pod="openstack/root-account-create-update-nwvrf"
Mar 12 17:07:45 crc kubenswrapper[5184]: I0312 17:07:45.292436 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d85d1216-e8c4-45ea-8e85-bf33cece093c-operator-scripts\") pod \"root-account-create-update-nwvrf\" (UID: \"d85d1216-e8c4-45ea-8e85-bf33cece093c\") " pod="openstack/root-account-create-update-nwvrf"
Mar 12 17:07:45 crc kubenswrapper[5184]: I0312 17:07:45.321687 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw2dr\" (UniqueName: \"kubernetes.io/projected/d85d1216-e8c4-45ea-8e85-bf33cece093c-kube-api-access-jw2dr\") pod \"root-account-create-update-nwvrf\" (UID: \"d85d1216-e8c4-45ea-8e85-bf33cece093c\") " pod="openstack/root-account-create-update-nwvrf"
Mar 12 17:07:45 crc kubenswrapper[5184]: I0312 17:07:45.456215 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-nwvrf"
Mar 12 17:07:45 crc kubenswrapper[5184]: I0312 17:07:45.899581 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2ffae81b-589d-4502-a0a6-777b8d6f98b1-etc-swift\") pod \"swift-storage-0\" (UID: \"2ffae81b-589d-4502-a0a6-777b8d6f98b1\") " pod="openstack/swift-storage-0"
Mar 12 17:07:45 crc kubenswrapper[5184]: I0312 17:07:45.902028 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-nwvrf"]
Mar 12 17:07:45 crc kubenswrapper[5184]: I0312 17:07:45.914471 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2ffae81b-589d-4502-a0a6-777b8d6f98b1-etc-swift\") pod \"swift-storage-0\" (UID: \"2ffae81b-589d-4502-a0a6-777b8d6f98b1\") " pod="openstack/swift-storage-0"
Mar 12 17:07:46 crc kubenswrapper[5184]: I0312 17:07:46.056740 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-nwvrf" event={"ID":"d85d1216-e8c4-45ea-8e85-bf33cece093c","Type":"ContainerStarted","Data":"cb611d9f19a543e1eb6705897343ec844d165959f8e344321d21bf618d565c0e"}
Mar 12 17:07:46 crc kubenswrapper[5184]: I0312 17:07:46.203574 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Mar 12 17:07:46 crc kubenswrapper[5184]: I0312 17:07:46.678836 5184 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-dq7bv" podUID="d5a0c031-5c42-4559-96f2-82b75e70b804" containerName="ovn-controller" probeResult="failure" output=<
Mar 12 17:07:46 crc kubenswrapper[5184]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 12 17:07:46 crc kubenswrapper[5184]: >
Mar 12 17:07:46 crc kubenswrapper[5184]: I0312 17:07:46.754046 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Mar 12 17:07:47 crc kubenswrapper[5184]: I0312 17:07:47.067524 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2ffae81b-589d-4502-a0a6-777b8d6f98b1","Type":"ContainerStarted","Data":"3909b39cc90a09430918dce6703b7365c361ba904e284d7813e86c28bc7c11f5"}
Mar 12 17:07:47 crc kubenswrapper[5184]: I0312 17:07:47.069509 5184 generic.go:358] "Generic (PLEG): container finished" podID="d85d1216-e8c4-45ea-8e85-bf33cece093c" containerID="e444ed8a4cf30be194f3f9e7a76611008eb92112aae4c93388bf7df409a63da2" exitCode=0
Mar 12 17:07:47 crc kubenswrapper[5184]: I0312 17:07:47.069551 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-nwvrf" event={"ID":"d85d1216-e8c4-45ea-8e85-bf33cece093c","Type":"ContainerDied","Data":"e444ed8a4cf30be194f3f9e7a76611008eb92112aae4c93388bf7df409a63da2"}
Mar 12 17:07:48 crc kubenswrapper[5184]: I0312 17:07:48.082254 5184 generic.go:358] "Generic (PLEG): container finished" podID="56b9c26f-b490-4262-9c35-63ee5734c634" containerID="5946e7c77319bc4303becfa27068470c096cf79bfacc2db1f23178951578ef22" exitCode=0
Mar 12 17:07:48 crc kubenswrapper[5184]: I0312 17:07:48.082350 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0"
event={"ID":"56b9c26f-b490-4262-9c35-63ee5734c634","Type":"ContainerDied","Data":"5946e7c77319bc4303becfa27068470c096cf79bfacc2db1f23178951578ef22"}
Mar 12 17:07:48 crc kubenswrapper[5184]: I0312 17:07:48.089236 5184 generic.go:358] "Generic (PLEG): container finished" podID="53e57ab8-13e6-4505-a905-412d3ef88083" containerID="1c500261ee65047d0f5f54d5420bdfc27224456b856c0006d4d1e69acb2ed464" exitCode=0
Mar 12 17:07:48 crc kubenswrapper[5184]: I0312 17:07:48.089401 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"53e57ab8-13e6-4505-a905-412d3ef88083","Type":"ContainerDied","Data":"1c500261ee65047d0f5f54d5420bdfc27224456b856c0006d4d1e69acb2ed464"}
Mar 12 17:07:48 crc kubenswrapper[5184]: I0312 17:07:48.094595 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2ffae81b-589d-4502-a0a6-777b8d6f98b1","Type":"ContainerStarted","Data":"42596577e83f661468eceee670913eb80f4adfca6cd1f7f44ab7e8392867a914"}
Mar 12 17:07:48 crc kubenswrapper[5184]: I0312 17:07:48.514737 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-nwvrf"
Mar 12 17:07:48 crc kubenswrapper[5184]: I0312 17:07:48.661884 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d85d1216-e8c4-45ea-8e85-bf33cece093c-operator-scripts\") pod \"d85d1216-e8c4-45ea-8e85-bf33cece093c\" (UID: \"d85d1216-e8c4-45ea-8e85-bf33cece093c\") "
Mar 12 17:07:48 crc kubenswrapper[5184]: I0312 17:07:48.662055 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jw2dr\" (UniqueName: \"kubernetes.io/projected/d85d1216-e8c4-45ea-8e85-bf33cece093c-kube-api-access-jw2dr\") pod \"d85d1216-e8c4-45ea-8e85-bf33cece093c\" (UID: \"d85d1216-e8c4-45ea-8e85-bf33cece093c\") "
Mar 12 17:07:48 crc kubenswrapper[5184]: I0312 17:07:48.662331 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d85d1216-e8c4-45ea-8e85-bf33cece093c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d85d1216-e8c4-45ea-8e85-bf33cece093c" (UID: "d85d1216-e8c4-45ea-8e85-bf33cece093c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 17:07:48 crc kubenswrapper[5184]: I0312 17:07:48.672540 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d85d1216-e8c4-45ea-8e85-bf33cece093c-kube-api-access-jw2dr" (OuterVolumeSpecName: "kube-api-access-jw2dr") pod "d85d1216-e8c4-45ea-8e85-bf33cece093c" (UID: "d85d1216-e8c4-45ea-8e85-bf33cece093c"). InnerVolumeSpecName "kube-api-access-jw2dr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:07:48 crc kubenswrapper[5184]: I0312 17:07:48.764281 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jw2dr\" (UniqueName: \"kubernetes.io/projected/d85d1216-e8c4-45ea-8e85-bf33cece093c-kube-api-access-jw2dr\") on node \"crc\" DevicePath \"\""
Mar 12 17:07:48 crc kubenswrapper[5184]: I0312 17:07:48.764317 5184 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d85d1216-e8c4-45ea-8e85-bf33cece093c-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 12 17:07:48 crc kubenswrapper[5184]: I0312 17:07:48.782545 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-vp7v2"
Mar 12 17:07:49 crc kubenswrapper[5184]: I0312 17:07:49.106346 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"56b9c26f-b490-4262-9c35-63ee5734c634","Type":"ContainerStarted","Data":"1322a3a257cb0f3349d10a1e09e96651c84653799e5afe1e417245be1b848bcf"}
Mar 12 17:07:49 crc kubenswrapper[5184]: I0312 17:07:49.106847 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/rabbitmq-cell1-server-0"
Mar 12 17:07:49 crc kubenswrapper[5184]: I0312 17:07:49.110149 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"53e57ab8-13e6-4505-a905-412d3ef88083","Type":"ContainerStarted","Data":"de01d0d469b98b9178ec21d18644b15d0164e473f615c62b06003d6aa2162097"}
Mar 12 17:07:49 crc kubenswrapper[5184]: I0312 17:07:49.110523 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/rabbitmq-server-0"
Mar 12 17:07:49 crc kubenswrapper[5184]: I0312 17:07:49.114055 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2ffae81b-589d-4502-a0a6-777b8d6f98b1","Type":"ContainerStarted","Data":"0911eb2495475f66e736e8e751469eb750bf9f515787c13158c2f0b60b84018c"}
Mar 12 17:07:49 crc kubenswrapper[5184]: I0312 17:07:49.114104 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2ffae81b-589d-4502-a0a6-777b8d6f98b1","Type":"ContainerStarted","Data":"923fea102aeae5f0068683efd460c311c78eef24459a7bdf21fbe2f416e20286"}
Mar 12 17:07:49 crc kubenswrapper[5184]: I0312 17:07:49.114115 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2ffae81b-589d-4502-a0a6-777b8d6f98b1","Type":"ContainerStarted","Data":"ef16fb316f57f99ecf2861f3cd79d374f82565c51151f10fb87c6e2eea8a2100"}
Mar 12 17:07:49 crc kubenswrapper[5184]: I0312 17:07:49.116010 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-nwvrf"
Mar 12 17:07:49 crc kubenswrapper[5184]: I0312 17:07:49.116051 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-nwvrf" event={"ID":"d85d1216-e8c4-45ea-8e85-bf33cece093c","Type":"ContainerDied","Data":"cb611d9f19a543e1eb6705897343ec844d165959f8e344321d21bf618d565c0e"}
Mar 12 17:07:49 crc kubenswrapper[5184]: I0312 17:07:49.116084 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb611d9f19a543e1eb6705897343ec844d165959f8e344321d21bf618d565c0e"
Mar 12 17:07:49 crc kubenswrapper[5184]: I0312 17:07:49.131607 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=49.046058866 podStartE2EDuration="57.131586965s" podCreationTimestamp="2026-03-12 17:06:52 +0000 UTC" firstStartedPulling="2026-03-12 17:07:05.290109862 +0000 UTC m=+967.831421201" lastFinishedPulling="2026-03-12 17:07:13.375637961 +0000 UTC m=+975.916949300" observedRunningTime="2026-03-12 17:07:49.125110312 +0000
UTC m=+1011.666421651" watchObservedRunningTime="2026-03-12 17:07:49.131586965 +0000 UTC m=+1011.672898304"
Mar 12 17:07:49 crc kubenswrapper[5184]: I0312 17:07:49.149774 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=49.025309593 podStartE2EDuration="57.149754237s" podCreationTimestamp="2026-03-12 17:06:52 +0000 UTC" firstStartedPulling="2026-03-12 17:07:04.774184861 +0000 UTC m=+967.315496200" lastFinishedPulling="2026-03-12 17:07:12.898629505 +0000 UTC m=+975.439940844" observedRunningTime="2026-03-12 17:07:49.145592226 +0000 UTC m=+1011.686903585" watchObservedRunningTime="2026-03-12 17:07:49.149754237 +0000 UTC m=+1011.691065576"
Mar 12 17:07:49 crc kubenswrapper[5184]: E0312 17:07:49.318773 5184 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1431370 actualBytes=10240
Mar 12 17:07:51 crc kubenswrapper[5184]: I0312 17:07:51.661580 5184 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-dq7bv" podUID="d5a0c031-5c42-4559-96f2-82b75e70b804" containerName="ovn-controller" probeResult="failure" output=<
Mar 12 17:07:51 crc kubenswrapper[5184]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 12 17:07:51 crc kubenswrapper[5184]: >
Mar 12 17:07:53 crc kubenswrapper[5184]: I0312 17:07:53.823329 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-vp7v2"
Mar 12 17:07:54 crc kubenswrapper[5184]: I0312 17:07:54.047047 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-dq7bv-config-cbnmd"]
Mar 12 17:07:54 crc kubenswrapper[5184]: I0312 17:07:54.048236 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d85d1216-e8c4-45ea-8e85-bf33cece093c" containerName="mariadb-account-create-update"
Mar 12 17:07:54 crc kubenswrapper[5184]: I0312 17:07:54.048260 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="d85d1216-e8c4-45ea-8e85-bf33cece093c" containerName="mariadb-account-create-update"
Mar 12 17:07:54 crc kubenswrapper[5184]: I0312 17:07:54.048478 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="d85d1216-e8c4-45ea-8e85-bf33cece093c" containerName="mariadb-account-create-update"
Mar 12 17:07:54 crc kubenswrapper[5184]: I0312 17:07:54.074975 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dq7bv-config-cbnmd"]
Mar 12 17:07:54 crc kubenswrapper[5184]: I0312 17:07:54.075203 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dq7bv-config-cbnmd"
Mar 12 17:07:54 crc kubenswrapper[5184]: I0312 17:07:54.079271 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"ovncontroller-extra-scripts\""
Mar 12 17:07:54 crc kubenswrapper[5184]: I0312 17:07:54.262193 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/aef0c20a-357e-4926-bca6-4d104b8fc9c4-additional-scripts\") pod \"ovn-controller-dq7bv-config-cbnmd\" (UID: \"aef0c20a-357e-4926-bca6-4d104b8fc9c4\") " pod="openstack/ovn-controller-dq7bv-config-cbnmd"
Mar 12 17:07:54 crc kubenswrapper[5184]: I0312 17:07:54.262247 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aef0c20a-357e-4926-bca6-4d104b8fc9c4-scripts\") pod \"ovn-controller-dq7bv-config-cbnmd\" (UID: \"aef0c20a-357e-4926-bca6-4d104b8fc9c4\") " pod="openstack/ovn-controller-dq7bv-config-cbnmd"
Mar 12 17:07:54 crc kubenswrapper[5184]: I0312 17:07:54.262321 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/aef0c20a-357e-4926-bca6-4d104b8fc9c4-var-run-ovn\") pod \"ovn-controller-dq7bv-config-cbnmd\" (UID: \"aef0c20a-357e-4926-bca6-4d104b8fc9c4\") " pod="openstack/ovn-controller-dq7bv-config-cbnmd"
Mar 12 17:07:54 crc kubenswrapper[5184]: I0312 17:07:54.262344 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/aef0c20a-357e-4926-bca6-4d104b8fc9c4-var-log-ovn\") pod \"ovn-controller-dq7bv-config-cbnmd\" (UID: \"aef0c20a-357e-4926-bca6-4d104b8fc9c4\") " pod="openstack/ovn-controller-dq7bv-config-cbnmd"
Mar 12 17:07:54 crc kubenswrapper[5184]: I0312 17:07:54.262362 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/aef0c20a-357e-4926-bca6-4d104b8fc9c4-var-run\") pod \"ovn-controller-dq7bv-config-cbnmd\" (UID: \"aef0c20a-357e-4926-bca6-4d104b8fc9c4\") " pod="openstack/ovn-controller-dq7bv-config-cbnmd"
Mar 12 17:07:54 crc kubenswrapper[5184]: I0312 17:07:54.262430 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz88k\" (UniqueName: \"kubernetes.io/projected/aef0c20a-357e-4926-bca6-4d104b8fc9c4-kube-api-access-kz88k\") pod \"ovn-controller-dq7bv-config-cbnmd\" (UID: \"aef0c20a-357e-4926-bca6-4d104b8fc9c4\") " pod="openstack/ovn-controller-dq7bv-config-cbnmd"
Mar 12 17:07:54 crc kubenswrapper[5184]: I0312 17:07:54.364405 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/aef0c20a-357e-4926-bca6-4d104b8fc9c4-var-run-ovn\") pod \"ovn-controller-dq7bv-config-cbnmd\" (UID: \"aef0c20a-357e-4926-bca6-4d104b8fc9c4\") " pod="openstack/ovn-controller-dq7bv-config-cbnmd"
Mar 12 17:07:54 crc kubenswrapper[5184]: I0312 17:07:54.364474 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/aef0c20a-357e-4926-bca6-4d104b8fc9c4-var-log-ovn\") pod \"ovn-controller-dq7bv-config-cbnmd\" (UID: \"aef0c20a-357e-4926-bca6-4d104b8fc9c4\") " pod="openstack/ovn-controller-dq7bv-config-cbnmd"
Mar 12 17:07:54 crc kubenswrapper[5184]: I0312 17:07:54.364507 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/aef0c20a-357e-4926-bca6-4d104b8fc9c4-var-run\") pod \"ovn-controller-dq7bv-config-cbnmd\" (UID: \"aef0c20a-357e-4926-bca6-4d104b8fc9c4\") " pod="openstack/ovn-controller-dq7bv-config-cbnmd"
Mar 12 17:07:54 crc kubenswrapper[5184]: I0312 17:07:54.364812 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kz88k\" (UniqueName: \"kubernetes.io/projected/aef0c20a-357e-4926-bca6-4d104b8fc9c4-kube-api-access-kz88k\") pod \"ovn-controller-dq7bv-config-cbnmd\" (UID: \"aef0c20a-357e-4926-bca6-4d104b8fc9c4\") " pod="openstack/ovn-controller-dq7bv-config-cbnmd"
Mar 12 17:07:54 crc kubenswrapper[5184]: I0312 17:07:54.364861 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/aef0c20a-357e-4926-bca6-4d104b8fc9c4-var-run\") pod \"ovn-controller-dq7bv-config-cbnmd\" (UID: \"aef0c20a-357e-4926-bca6-4d104b8fc9c4\") " pod="openstack/ovn-controller-dq7bv-config-cbnmd"
Mar 12 17:07:54 crc kubenswrapper[5184]: I0312 17:07:54.364897 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/aef0c20a-357e-4926-bca6-4d104b8fc9c4-additional-scripts\") pod \"ovn-controller-dq7bv-config-cbnmd\" (UID: \"aef0c20a-357e-4926-bca6-4d104b8fc9c4\") " pod="openstack/ovn-controller-dq7bv-config-cbnmd"
Mar 12 17:07:54 crc kubenswrapper[5184]: I0312 17:07:54.365225 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\"
(UniqueName: \"kubernetes.io/configmap/aef0c20a-357e-4926-bca6-4d104b8fc9c4-scripts\") pod \"ovn-controller-dq7bv-config-cbnmd\" (UID: \"aef0c20a-357e-4926-bca6-4d104b8fc9c4\") " pod="openstack/ovn-controller-dq7bv-config-cbnmd"
Mar 12 17:07:54 crc kubenswrapper[5184]: I0312 17:07:54.365258 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/aef0c20a-357e-4926-bca6-4d104b8fc9c4-var-log-ovn\") pod \"ovn-controller-dq7bv-config-cbnmd\" (UID: \"aef0c20a-357e-4926-bca6-4d104b8fc9c4\") " pod="openstack/ovn-controller-dq7bv-config-cbnmd"
Mar 12 17:07:54 crc kubenswrapper[5184]: I0312 17:07:54.364808 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/aef0c20a-357e-4926-bca6-4d104b8fc9c4-var-run-ovn\") pod \"ovn-controller-dq7bv-config-cbnmd\" (UID: \"aef0c20a-357e-4926-bca6-4d104b8fc9c4\") " pod="openstack/ovn-controller-dq7bv-config-cbnmd"
Mar 12 17:07:54 crc kubenswrapper[5184]: I0312 17:07:54.365835 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/aef0c20a-357e-4926-bca6-4d104b8fc9c4-additional-scripts\") pod \"ovn-controller-dq7bv-config-cbnmd\" (UID: \"aef0c20a-357e-4926-bca6-4d104b8fc9c4\") " pod="openstack/ovn-controller-dq7bv-config-cbnmd"
Mar 12 17:07:54 crc kubenswrapper[5184]: I0312 17:07:54.367524 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aef0c20a-357e-4926-bca6-4d104b8fc9c4-scripts\") pod \"ovn-controller-dq7bv-config-cbnmd\" (UID: \"aef0c20a-357e-4926-bca6-4d104b8fc9c4\") " pod="openstack/ovn-controller-dq7bv-config-cbnmd"
Mar 12 17:07:54 crc kubenswrapper[5184]: I0312 17:07:54.403168 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz88k\" (UniqueName: \"kubernetes.io/projected/aef0c20a-357e-4926-bca6-4d104b8fc9c4-kube-api-access-kz88k\") pod \"ovn-controller-dq7bv-config-cbnmd\" (UID: \"aef0c20a-357e-4926-bca6-4d104b8fc9c4\") " pod="openstack/ovn-controller-dq7bv-config-cbnmd"
Mar 12 17:07:54 crc kubenswrapper[5184]: I0312 17:07:54.695104 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dq7bv-config-cbnmd"
Mar 12 17:07:56 crc kubenswrapper[5184]: I0312 17:07:56.481285 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dq7bv-config-cbnmd"]
Mar 12 17:07:56 crc kubenswrapper[5184]: W0312 17:07:56.489341 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaef0c20a_357e_4926_bca6_4d104b8fc9c4.slice/crio-77ca1c7b16b57ef5518f6da045f6a05d6b3edd38f0d0317e79507a9496e6d3ad WatchSource:0}: Error finding container 77ca1c7b16b57ef5518f6da045f6a05d6b3edd38f0d0317e79507a9496e6d3ad: Status 404 returned error can't find the container with id 77ca1c7b16b57ef5518f6da045f6a05d6b3edd38f0d0317e79507a9496e6d3ad
Mar 12 17:07:56 crc kubenswrapper[5184]: I0312 17:07:56.715703 5184 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-dq7bv" podUID="d5a0c031-5c42-4559-96f2-82b75e70b804" containerName="ovn-controller" probeResult="failure" output=<
Mar 12 17:07:56 crc kubenswrapper[5184]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 12 17:07:56 crc kubenswrapper[5184]: >
Mar 12 17:07:57 crc kubenswrapper[5184]: I0312 17:07:57.188451 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-7fxbs" event={"ID":"2f4b74dc-78a2-4b8c-8b52-cb972e894961","Type":"ContainerStarted","Data":"a57617551bbfe3d89bcb379f1587ec00ad6d87f2b8ea43063d9858e615e2d4e8"}
Mar 12 17:07:57 crc kubenswrapper[5184]: I0312 17:07:57.193494 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2ffae81b-589d-4502-a0a6-777b8d6f98b1","Type":"ContainerStarted","Data":"e331dbdb4cc9551c313bb89c0ce86645e1ae6a22a5358a7bd15d6473bb4ea89d"}
Mar 12 17:07:57 crc kubenswrapper[5184]: I0312 17:07:57.193524 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2ffae81b-589d-4502-a0a6-777b8d6f98b1","Type":"ContainerStarted","Data":"e69d4e86db89ea9c88969b57248b7f6d463dba26a0c2297e8b53a7194ecf9943"}
Mar 12 17:07:57 crc kubenswrapper[5184]: I0312 17:07:57.193534 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2ffae81b-589d-4502-a0a6-777b8d6f98b1","Type":"ContainerStarted","Data":"4745b5a8595d745b023071f554211e23b9ec7a803406818ddb45fcacd12bc373"}
Mar 12 17:07:57 crc kubenswrapper[5184]: I0312 17:07:57.193542 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2ffae81b-589d-4502-a0a6-777b8d6f98b1","Type":"ContainerStarted","Data":"97a3ddd678f8bed28c0cc607b76f98abec4e712dfbadebab8c690030011d8c7a"}
Mar 12 17:07:57 crc kubenswrapper[5184]: I0312 17:07:57.195421 5184 generic.go:358] "Generic (PLEG): container finished" podID="aef0c20a-357e-4926-bca6-4d104b8fc9c4" containerID="834b98ec3a66b698c2809a195d4a511309154a427bf6f45a2a23c132afcd0771" exitCode=0
Mar 12 17:07:57 crc kubenswrapper[5184]: I0312 17:07:57.195447 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dq7bv-config-cbnmd" event={"ID":"aef0c20a-357e-4926-bca6-4d104b8fc9c4","Type":"ContainerDied","Data":"834b98ec3a66b698c2809a195d4a511309154a427bf6f45a2a23c132afcd0771"}
Mar 12 17:07:57 crc kubenswrapper[5184]: I0312 17:07:57.195461 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dq7bv-config-cbnmd" event={"ID":"aef0c20a-357e-4926-bca6-4d104b8fc9c4","Type":"ContainerStarted","Data":"77ca1c7b16b57ef5518f6da045f6a05d6b3edd38f0d0317e79507a9496e6d3ad"}
Mar 12 17:07:57 crc kubenswrapper[5184]: I0312 17:07:57.207704 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-7fxbs" podStartSLOduration=2.648242148 podStartE2EDuration="14.207684647s" podCreationTimestamp="2026-03-12 17:07:43 +0000 UTC" firstStartedPulling="2026-03-12 17:07:44.460326951 +0000 UTC m=+1007.001638290" lastFinishedPulling="2026-03-12 17:07:56.01976945 +0000 UTC m=+1018.561080789" observedRunningTime="2026-03-12 17:07:57.205837008 +0000 UTC m=+1019.747148367" watchObservedRunningTime="2026-03-12 17:07:57.207684647 +0000 UTC m=+1019.748995986"
Mar 12 17:07:58 crc kubenswrapper[5184]: I0312 17:07:58.551566 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dq7bv-config-cbnmd"
Mar 12 17:07:58 crc kubenswrapper[5184]: I0312 17:07:58.634304 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/aef0c20a-357e-4926-bca6-4d104b8fc9c4-additional-scripts\") pod \"aef0c20a-357e-4926-bca6-4d104b8fc9c4\" (UID: \"aef0c20a-357e-4926-bca6-4d104b8fc9c4\") "
Mar 12 17:07:58 crc kubenswrapper[5184]: I0312 17:07:58.634392 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/aef0c20a-357e-4926-bca6-4d104b8fc9c4-var-run-ovn\") pod \"aef0c20a-357e-4926-bca6-4d104b8fc9c4\" (UID: \"aef0c20a-357e-4926-bca6-4d104b8fc9c4\") "
Mar 12 17:07:58 crc kubenswrapper[5184]: I0312 17:07:58.634489 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/aef0c20a-357e-4926-bca6-4d104b8fc9c4-var-run\") pod \"aef0c20a-357e-4926-bca6-4d104b8fc9c4\" (UID: \"aef0c20a-357e-4926-bca6-4d104b8fc9c4\") "
Mar 12 17:07:58 crc kubenswrapper[5184]: I0312 17:07:58.634548 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume
\"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/aef0c20a-357e-4926-bca6-4d104b8fc9c4-var-log-ovn\") pod \"aef0c20a-357e-4926-bca6-4d104b8fc9c4\" (UID: \"aef0c20a-357e-4926-bca6-4d104b8fc9c4\") "
Mar 12 17:07:58 crc kubenswrapper[5184]: I0312 17:07:58.634537 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aef0c20a-357e-4926-bca6-4d104b8fc9c4-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "aef0c20a-357e-4926-bca6-4d104b8fc9c4" (UID: "aef0c20a-357e-4926-bca6-4d104b8fc9c4"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 12 17:07:58 crc kubenswrapper[5184]: I0312 17:07:58.634617 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aef0c20a-357e-4926-bca6-4d104b8fc9c4-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "aef0c20a-357e-4926-bca6-4d104b8fc9c4" (UID: "aef0c20a-357e-4926-bca6-4d104b8fc9c4"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 12 17:07:58 crc kubenswrapper[5184]: I0312 17:07:58.634646 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aef0c20a-357e-4926-bca6-4d104b8fc9c4-var-run" (OuterVolumeSpecName: "var-run") pod "aef0c20a-357e-4926-bca6-4d104b8fc9c4" (UID: "aef0c20a-357e-4926-bca6-4d104b8fc9c4"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 12 17:07:58 crc kubenswrapper[5184]: I0312 17:07:58.634734 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aef0c20a-357e-4926-bca6-4d104b8fc9c4-scripts\") pod \"aef0c20a-357e-4926-bca6-4d104b8fc9c4\" (UID: \"aef0c20a-357e-4926-bca6-4d104b8fc9c4\") "
Mar 12 17:07:58 crc kubenswrapper[5184]: I0312 17:07:58.634776 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kz88k\" (UniqueName: \"kubernetes.io/projected/aef0c20a-357e-4926-bca6-4d104b8fc9c4-kube-api-access-kz88k\") pod \"aef0c20a-357e-4926-bca6-4d104b8fc9c4\" (UID: \"aef0c20a-357e-4926-bca6-4d104b8fc9c4\") "
Mar 12 17:07:58 crc kubenswrapper[5184]: I0312 17:07:58.634908 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aef0c20a-357e-4926-bca6-4d104b8fc9c4-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "aef0c20a-357e-4926-bca6-4d104b8fc9c4" (UID: "aef0c20a-357e-4926-bca6-4d104b8fc9c4"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 17:07:58 crc kubenswrapper[5184]: I0312 17:07:58.635655 5184 reconciler_common.go:299] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/aef0c20a-357e-4926-bca6-4d104b8fc9c4-var-log-ovn\") on node \"crc\" DevicePath \"\""
Mar 12 17:07:58 crc kubenswrapper[5184]: I0312 17:07:58.635689 5184 reconciler_common.go:299] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/aef0c20a-357e-4926-bca6-4d104b8fc9c4-additional-scripts\") on node \"crc\" DevicePath \"\""
Mar 12 17:07:58 crc kubenswrapper[5184]: I0312 17:07:58.635708 5184 reconciler_common.go:299] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/aef0c20a-357e-4926-bca6-4d104b8fc9c4-var-run-ovn\") on node \"crc\" DevicePath \"\""
Mar 12 17:07:58 crc kubenswrapper[5184]: I0312 17:07:58.635724 5184 reconciler_common.go:299] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/aef0c20a-357e-4926-bca6-4d104b8fc9c4-var-run\") on node \"crc\" DevicePath \"\""
Mar 12 17:07:58 crc kubenswrapper[5184]: I0312 17:07:58.635723 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aef0c20a-357e-4926-bca6-4d104b8fc9c4-scripts" (OuterVolumeSpecName: "scripts") pod "aef0c20a-357e-4926-bca6-4d104b8fc9c4" (UID: "aef0c20a-357e-4926-bca6-4d104b8fc9c4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 17:07:58 crc kubenswrapper[5184]: I0312 17:07:58.640929 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aef0c20a-357e-4926-bca6-4d104b8fc9c4-kube-api-access-kz88k" (OuterVolumeSpecName: "kube-api-access-kz88k") pod "aef0c20a-357e-4926-bca6-4d104b8fc9c4" (UID: "aef0c20a-357e-4926-bca6-4d104b8fc9c4"). InnerVolumeSpecName "kube-api-access-kz88k". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:07:58 crc kubenswrapper[5184]: I0312 17:07:58.739091 5184 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aef0c20a-357e-4926-bca6-4d104b8fc9c4-scripts\") on node \"crc\" DevicePath \"\""
Mar 12 17:07:58 crc kubenswrapper[5184]: I0312 17:07:58.739151 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kz88k\" (UniqueName: \"kubernetes.io/projected/aef0c20a-357e-4926-bca6-4d104b8fc9c4-kube-api-access-kz88k\") on node \"crc\" DevicePath \"\""
Mar 12 17:07:59 crc kubenswrapper[5184]: I0312 17:07:59.213609 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dq7bv-config-cbnmd"
Mar 12 17:07:59 crc kubenswrapper[5184]: I0312 17:07:59.213623 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dq7bv-config-cbnmd" event={"ID":"aef0c20a-357e-4926-bca6-4d104b8fc9c4","Type":"ContainerDied","Data":"77ca1c7b16b57ef5518f6da045f6a05d6b3edd38f0d0317e79507a9496e6d3ad"}
Mar 12 17:07:59 crc kubenswrapper[5184]: I0312 17:07:59.214195 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77ca1c7b16b57ef5518f6da045f6a05d6b3edd38f0d0317e79507a9496e6d3ad"
Mar 12 17:07:59 crc kubenswrapper[5184]: I0312 17:07:59.683955 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-dq7bv-config-cbnmd"]
Mar 12 17:07:59 crc kubenswrapper[5184]: I0312 17:07:59.692989 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-dq7bv-config-cbnmd"]
Mar 12 17:08:00 crc kubenswrapper[5184]: I0312 17:08:00.134068 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555588-xpxvn"]
Mar 12 17:08:00 crc kubenswrapper[5184]: I0312 17:08:00.135044 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container"
podUID="aef0c20a-357e-4926-bca6-4d104b8fc9c4" containerName="ovn-config" Mar 12 17:08:00 crc kubenswrapper[5184]: I0312 17:08:00.135063 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="aef0c20a-357e-4926-bca6-4d104b8fc9c4" containerName="ovn-config" Mar 12 17:08:00 crc kubenswrapper[5184]: I0312 17:08:00.135235 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="aef0c20a-357e-4926-bca6-4d104b8fc9c4" containerName="ovn-config" Mar 12 17:08:00 crc kubenswrapper[5184]: I0312 17:08:00.277180 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 12 17:08:00 crc kubenswrapper[5184]: I0312 17:08:00.277345 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555588-xpxvn"] Mar 12 17:08:00 crc kubenswrapper[5184]: I0312 17:08:00.277539 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555588-xpxvn" Mar 12 17:08:00 crc kubenswrapper[5184]: I0312 17:08:00.278539 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:08:00 crc kubenswrapper[5184]: I0312 17:08:00.287062 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-f4gpz\"" Mar 12 17:08:00 crc kubenswrapper[5184]: I0312 17:08:00.289578 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\"" Mar 12 17:08:00 crc kubenswrapper[5184]: I0312 17:08:00.289824 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\"" Mar 12 17:08:00 crc kubenswrapper[5184]: I0312 17:08:00.371225 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxfpm\" (UniqueName: 
\"kubernetes.io/projected/552cda96-d016-4ff4-9bc2-9cf835b31dfe-kube-api-access-wxfpm\") pod \"auto-csr-approver-29555588-xpxvn\" (UID: \"552cda96-d016-4ff4-9bc2-9cf835b31dfe\") " pod="openshift-infra/auto-csr-approver-29555588-xpxvn" Mar 12 17:08:00 crc kubenswrapper[5184]: I0312 17:08:00.421580 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aef0c20a-357e-4926-bca6-4d104b8fc9c4" path="/var/lib/kubelet/pods/aef0c20a-357e-4926-bca6-4d104b8fc9c4/volumes" Mar 12 17:08:00 crc kubenswrapper[5184]: I0312 17:08:00.477465 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wxfpm\" (UniqueName: \"kubernetes.io/projected/552cda96-d016-4ff4-9bc2-9cf835b31dfe-kube-api-access-wxfpm\") pod \"auto-csr-approver-29555588-xpxvn\" (UID: \"552cda96-d016-4ff4-9bc2-9cf835b31dfe\") " pod="openshift-infra/auto-csr-approver-29555588-xpxvn" Mar 12 17:08:00 crc kubenswrapper[5184]: I0312 17:08:00.524301 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxfpm\" (UniqueName: \"kubernetes.io/projected/552cda96-d016-4ff4-9bc2-9cf835b31dfe-kube-api-access-wxfpm\") pod \"auto-csr-approver-29555588-xpxvn\" (UID: \"552cda96-d016-4ff4-9bc2-9cf835b31dfe\") " pod="openshift-infra/auto-csr-approver-29555588-xpxvn" Mar 12 17:08:00 crc kubenswrapper[5184]: I0312 17:08:00.614442 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555588-xpxvn" Mar 12 17:08:00 crc kubenswrapper[5184]: I0312 17:08:00.714025 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-xfj2g"] Mar 12 17:08:01 crc kubenswrapper[5184]: W0312 17:08:01.228491 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod552cda96_d016_4ff4_9bc2_9cf835b31dfe.slice/crio-3b366d631d4b86c8ceb0b2f9dc043169ac86cb1a1d06f0868f12ad4155f22365 WatchSource:0}: Error finding container 3b366d631d4b86c8ceb0b2f9dc043169ac86cb1a1d06f0868f12ad4155f22365: Status 404 returned error can't find the container with id 3b366d631d4b86c8ceb0b2f9dc043169ac86cb1a1d06f0868f12ad4155f22365 Mar 12 17:08:01 crc kubenswrapper[5184]: I0312 17:08:01.376739 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-xfj2g"] Mar 12 17:08:01 crc kubenswrapper[5184]: I0312 17:08:01.376777 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-7g4dn"] Mar 12 17:08:01 crc kubenswrapper[5184]: I0312 17:08:01.376944 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-xfj2g" Mar 12 17:08:01 crc kubenswrapper[5184]: I0312 17:08:01.407843 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xgr6\" (UniqueName: \"kubernetes.io/projected/86dddeb4-4bdf-4457-ae72-4e42fe713b7d-kube-api-access-5xgr6\") pod \"cinder-db-create-xfj2g\" (UID: \"86dddeb4-4bdf-4457-ae72-4e42fe713b7d\") " pod="openstack/cinder-db-create-xfj2g" Mar 12 17:08:01 crc kubenswrapper[5184]: I0312 17:08:01.407902 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86dddeb4-4bdf-4457-ae72-4e42fe713b7d-operator-scripts\") pod \"cinder-db-create-xfj2g\" (UID: \"86dddeb4-4bdf-4457-ae72-4e42fe713b7d\") " pod="openstack/cinder-db-create-xfj2g" Mar 12 17:08:01 crc kubenswrapper[5184]: I0312 17:08:01.509557 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5xgr6\" (UniqueName: \"kubernetes.io/projected/86dddeb4-4bdf-4457-ae72-4e42fe713b7d-kube-api-access-5xgr6\") pod \"cinder-db-create-xfj2g\" (UID: \"86dddeb4-4bdf-4457-ae72-4e42fe713b7d\") " pod="openstack/cinder-db-create-xfj2g" Mar 12 17:08:01 crc kubenswrapper[5184]: I0312 17:08:01.509721 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86dddeb4-4bdf-4457-ae72-4e42fe713b7d-operator-scripts\") pod \"cinder-db-create-xfj2g\" (UID: \"86dddeb4-4bdf-4457-ae72-4e42fe713b7d\") " pod="openstack/cinder-db-create-xfj2g" Mar 12 17:08:01 crc kubenswrapper[5184]: I0312 17:08:01.511098 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86dddeb4-4bdf-4457-ae72-4e42fe713b7d-operator-scripts\") pod \"cinder-db-create-xfj2g\" (UID: \"86dddeb4-4bdf-4457-ae72-4e42fe713b7d\") " 
pod="openstack/cinder-db-create-xfj2g" Mar 12 17:08:01 crc kubenswrapper[5184]: I0312 17:08:01.533458 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xgr6\" (UniqueName: \"kubernetes.io/projected/86dddeb4-4bdf-4457-ae72-4e42fe713b7d-kube-api-access-5xgr6\") pod \"cinder-db-create-xfj2g\" (UID: \"86dddeb4-4bdf-4457-ae72-4e42fe713b7d\") " pod="openstack/cinder-db-create-xfj2g" Mar 12 17:08:01 crc kubenswrapper[5184]: I0312 17:08:01.705465 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-xfj2g" Mar 12 17:08:01 crc kubenswrapper[5184]: I0312 17:08:01.971308 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/cinder-66a8-account-create-update-8jj7v"] Mar 12 17:08:01 crc kubenswrapper[5184]: I0312 17:08:01.971528 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-7g4dn" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.018750 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fdadd77-2670-41df-b20b-57b771031dde-operator-scripts\") pod \"barbican-db-create-7g4dn\" (UID: \"4fdadd77-2670-41df-b20b-57b771031dde\") " pod="openstack/barbican-db-create-7g4dn" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.018852 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgmps\" (UniqueName: \"kubernetes.io/projected/4fdadd77-2670-41df-b20b-57b771031dde-kube-api-access-mgmps\") pod \"barbican-db-create-7g4dn\" (UID: \"4fdadd77-2670-41df-b20b-57b771031dde\") " pod="openstack/barbican-db-create-7g4dn" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.119927 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/4fdadd77-2670-41df-b20b-57b771031dde-operator-scripts\") pod \"barbican-db-create-7g4dn\" (UID: \"4fdadd77-2670-41df-b20b-57b771031dde\") " pod="openstack/barbican-db-create-7g4dn" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.120286 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mgmps\" (UniqueName: \"kubernetes.io/projected/4fdadd77-2670-41df-b20b-57b771031dde-kube-api-access-mgmps\") pod \"barbican-db-create-7g4dn\" (UID: \"4fdadd77-2670-41df-b20b-57b771031dde\") " pod="openstack/barbican-db-create-7g4dn" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.120813 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fdadd77-2670-41df-b20b-57b771031dde-operator-scripts\") pod \"barbican-db-create-7g4dn\" (UID: \"4fdadd77-2670-41df-b20b-57b771031dde\") " pod="openstack/barbican-db-create-7g4dn" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.139579 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgmps\" (UniqueName: \"kubernetes.io/projected/4fdadd77-2670-41df-b20b-57b771031dde-kube-api-access-mgmps\") pod \"barbican-db-create-7g4dn\" (UID: \"4fdadd77-2670-41df-b20b-57b771031dde\") " pod="openstack/barbican-db-create-7g4dn" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.155513 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-dq7bv" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.155549 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-7g4dn"] Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.155592 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-66a8-account-create-update-8jj7v"] Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.155612 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-66a8-account-create-update-8jj7v" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.155717 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-2mwtn"] Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.159379 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cinder-db-secret\"" Mar 12 17:08:02 crc kubenswrapper[5184]: W0312 17:08:02.173572 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86dddeb4_4bdf_4457_ae72_4e42fe713b7d.slice/crio-8d0c26a86e91cfe10044d71946dd44a8ad12817f811c9b04a7bdce09ca688396 WatchSource:0}: Error finding container 8d0c26a86e91cfe10044d71946dd44a8ad12817f811c9b04a7bdce09ca688396: Status 404 returned error can't find the container with id 8d0c26a86e91cfe10044d71946dd44a8ad12817f811c9b04a7bdce09ca688396 Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.221576 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcpcv\" (UniqueName: \"kubernetes.io/projected/69a595e1-ca3b-4932-b9b6-c1c0a237a783-kube-api-access-hcpcv\") pod \"cinder-66a8-account-create-update-8jj7v\" (UID: \"69a595e1-ca3b-4932-b9b6-c1c0a237a783\") " pod="openstack/cinder-66a8-account-create-update-8jj7v" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.221920 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69a595e1-ca3b-4932-b9b6-c1c0a237a783-operator-scripts\") pod \"cinder-66a8-account-create-update-8jj7v\" (UID: \"69a595e1-ca3b-4932-b9b6-c1c0a237a783\") " pod="openstack/cinder-66a8-account-create-update-8jj7v" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.298708 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-7g4dn" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.323893 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hcpcv\" (UniqueName: \"kubernetes.io/projected/69a595e1-ca3b-4932-b9b6-c1c0a237a783-kube-api-access-hcpcv\") pod \"cinder-66a8-account-create-update-8jj7v\" (UID: \"69a595e1-ca3b-4932-b9b6-c1c0a237a783\") " pod="openstack/cinder-66a8-account-create-update-8jj7v" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.324075 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69a595e1-ca3b-4932-b9b6-c1c0a237a783-operator-scripts\") pod \"cinder-66a8-account-create-update-8jj7v\" (UID: \"69a595e1-ca3b-4932-b9b6-c1c0a237a783\") " pod="openstack/cinder-66a8-account-create-update-8jj7v" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.324978 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69a595e1-ca3b-4932-b9b6-c1c0a237a783-operator-scripts\") pod \"cinder-66a8-account-create-update-8jj7v\" (UID: \"69a595e1-ca3b-4932-b9b6-c1c0a237a783\") " pod="openstack/cinder-66a8-account-create-update-8jj7v" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.337826 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-2mwtn"] Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.337867 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/neutron-bb83-account-create-update-szjsd"] Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.338454 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-2mwtn" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.345527 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcpcv\" (UniqueName: \"kubernetes.io/projected/69a595e1-ca3b-4932-b9b6-c1c0a237a783-kube-api-access-hcpcv\") pod \"cinder-66a8-account-create-update-8jj7v\" (UID: \"69a595e1-ca3b-4932-b9b6-c1c0a237a783\") " pod="openstack/cinder-66a8-account-create-update-8jj7v" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.425354 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21e50fc1-0b61-44f4-922a-acb08efb0796-operator-scripts\") pod \"neutron-db-create-2mwtn\" (UID: \"21e50fc1-0b61-44f4-922a-acb08efb0796\") " pod="openstack/neutron-db-create-2mwtn" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.425689 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2qrq\" (UniqueName: \"kubernetes.io/projected/21e50fc1-0b61-44f4-922a-acb08efb0796-kube-api-access-t2qrq\") pod \"neutron-db-create-2mwtn\" (UID: \"21e50fc1-0b61-44f4-922a-acb08efb0796\") " pod="openstack/neutron-db-create-2mwtn" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.506698 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-bb83-account-create-update-szjsd" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.511123 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"neutron-db-secret\"" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.528706 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21e50fc1-0b61-44f4-922a-acb08efb0796-operator-scripts\") pod \"neutron-db-create-2mwtn\" (UID: \"21e50fc1-0b61-44f4-922a-acb08efb0796\") " pod="openstack/neutron-db-create-2mwtn" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.528751 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t2qrq\" (UniqueName: \"kubernetes.io/projected/21e50fc1-0b61-44f4-922a-acb08efb0796-kube-api-access-t2qrq\") pod \"neutron-db-create-2mwtn\" (UID: \"21e50fc1-0b61-44f4-922a-acb08efb0796\") " pod="openstack/neutron-db-create-2mwtn" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.529473 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21e50fc1-0b61-44f4-922a-acb08efb0796-operator-scripts\") pod \"neutron-db-create-2mwtn\" (UID: \"21e50fc1-0b61-44f4-922a-acb08efb0796\") " pod="openstack/neutron-db-create-2mwtn" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.531994 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-66a8-account-create-update-8jj7v" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.557472 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555588-xpxvn" event={"ID":"552cda96-d016-4ff4-9bc2-9cf835b31dfe","Type":"ContainerStarted","Data":"3b366d631d4b86c8ceb0b2f9dc043169ac86cb1a1d06f0868f12ad4155f22365"} Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.557522 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xfj2g" event={"ID":"86dddeb4-4bdf-4457-ae72-4e42fe713b7d","Type":"ContainerStarted","Data":"8d0c26a86e91cfe10044d71946dd44a8ad12817f811c9b04a7bdce09ca688396"} Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.557537 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-bb83-account-create-update-szjsd"] Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.557553 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/barbican-571d-account-create-update-n2749"] Mar 12 17:08:02 crc kubenswrapper[5184]: W0312 17:08:02.577651 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fdadd77_2670_41df_b20b_57b771031dde.slice/crio-7f242abe434a2f1578b3839f754019f2b192e619545b9c2a6965bc033e3d6102 WatchSource:0}: Error finding container 7f242abe434a2f1578b3839f754019f2b192e619545b9c2a6965bc033e3d6102: Status 404 returned error can't find the container with id 7f242abe434a2f1578b3839f754019f2b192e619545b9c2a6965bc033e3d6102 Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.577699 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2qrq\" (UniqueName: \"kubernetes.io/projected/21e50fc1-0b61-44f4-922a-acb08efb0796-kube-api-access-t2qrq\") pod \"neutron-db-create-2mwtn\" (UID: \"21e50fc1-0b61-44f4-922a-acb08efb0796\") " pod="openstack/neutron-db-create-2mwtn" Mar 12 
17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.630842 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2aa5400c-5bc3-4cd5-849d-87105da3827b-operator-scripts\") pod \"neutron-bb83-account-create-update-szjsd\" (UID: \"2aa5400c-5bc3-4cd5-849d-87105da3827b\") " pod="openstack/neutron-bb83-account-create-update-szjsd" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.631167 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2npr\" (UniqueName: \"kubernetes.io/projected/2aa5400c-5bc3-4cd5-849d-87105da3827b-kube-api-access-c2npr\") pod \"neutron-bb83-account-create-update-szjsd\" (UID: \"2aa5400c-5bc3-4cd5-849d-87105da3827b\") " pod="openstack/neutron-bb83-account-create-update-szjsd" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.637707 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-571d-account-create-update-n2749"] Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.637743 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-4nkwm"] Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.637827 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-571d-account-create-update-n2749" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.639889 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"barbican-db-secret\"" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.646246 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-4nkwm"] Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.646280 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555588-xpxvn"] Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.646297 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-xfj2g"] Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.646319 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-7g4dn"] Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.646429 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-4nkwm" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.648028 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone-scripts\"" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.648199 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone\"" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.649151 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone-keystone-dockercfg-4s8pv\"" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.649589 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone-config-data\"" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.660230 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-2mwtn" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.732994 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxbhr\" (UniqueName: \"kubernetes.io/projected/5ec5e94e-bd18-444a-9340-de9b41934458-kube-api-access-wxbhr\") pod \"barbican-571d-account-create-update-n2749\" (UID: \"5ec5e94e-bd18-444a-9340-de9b41934458\") " pod="openstack/barbican-571d-account-create-update-n2749" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.733048 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f64b215c-973c-4761-9b13-0510387973ee-combined-ca-bundle\") pod \"keystone-db-sync-4nkwm\" (UID: \"f64b215c-973c-4761-9b13-0510387973ee\") " pod="openstack/keystone-db-sync-4nkwm" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.733088 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2aa5400c-5bc3-4cd5-849d-87105da3827b-operator-scripts\") pod \"neutron-bb83-account-create-update-szjsd\" (UID: \"2aa5400c-5bc3-4cd5-849d-87105da3827b\") " pod="openstack/neutron-bb83-account-create-update-szjsd" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.733256 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f64b215c-973c-4761-9b13-0510387973ee-config-data\") pod \"keystone-db-sync-4nkwm\" (UID: \"f64b215c-973c-4761-9b13-0510387973ee\") " pod="openstack/keystone-db-sync-4nkwm" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.733348 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ec5e94e-bd18-444a-9340-de9b41934458-operator-scripts\") 
pod \"barbican-571d-account-create-update-n2749\" (UID: \"5ec5e94e-bd18-444a-9340-de9b41934458\") " pod="openstack/barbican-571d-account-create-update-n2749" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.733619 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c2npr\" (UniqueName: \"kubernetes.io/projected/2aa5400c-5bc3-4cd5-849d-87105da3827b-kube-api-access-c2npr\") pod \"neutron-bb83-account-create-update-szjsd\" (UID: \"2aa5400c-5bc3-4cd5-849d-87105da3827b\") " pod="openstack/neutron-bb83-account-create-update-szjsd" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.733662 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6nll\" (UniqueName: \"kubernetes.io/projected/f64b215c-973c-4761-9b13-0510387973ee-kube-api-access-h6nll\") pod \"keystone-db-sync-4nkwm\" (UID: \"f64b215c-973c-4761-9b13-0510387973ee\") " pod="openstack/keystone-db-sync-4nkwm" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.733847 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2aa5400c-5bc3-4cd5-849d-87105da3827b-operator-scripts\") pod \"neutron-bb83-account-create-update-szjsd\" (UID: \"2aa5400c-5bc3-4cd5-849d-87105da3827b\") " pod="openstack/neutron-bb83-account-create-update-szjsd" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.752734 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2npr\" (UniqueName: \"kubernetes.io/projected/2aa5400c-5bc3-4cd5-849d-87105da3827b-kube-api-access-c2npr\") pod \"neutron-bb83-account-create-update-szjsd\" (UID: \"2aa5400c-5bc3-4cd5-849d-87105da3827b\") " pod="openstack/neutron-bb83-account-create-update-szjsd" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.828642 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-bb83-account-create-update-szjsd" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.835405 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wxbhr\" (UniqueName: \"kubernetes.io/projected/5ec5e94e-bd18-444a-9340-de9b41934458-kube-api-access-wxbhr\") pod \"barbican-571d-account-create-update-n2749\" (UID: \"5ec5e94e-bd18-444a-9340-de9b41934458\") " pod="openstack/barbican-571d-account-create-update-n2749" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.835450 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f64b215c-973c-4761-9b13-0510387973ee-combined-ca-bundle\") pod \"keystone-db-sync-4nkwm\" (UID: \"f64b215c-973c-4761-9b13-0510387973ee\") " pod="openstack/keystone-db-sync-4nkwm" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.835498 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f64b215c-973c-4761-9b13-0510387973ee-config-data\") pod \"keystone-db-sync-4nkwm\" (UID: \"f64b215c-973c-4761-9b13-0510387973ee\") " pod="openstack/keystone-db-sync-4nkwm" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.835524 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ec5e94e-bd18-444a-9340-de9b41934458-operator-scripts\") pod \"barbican-571d-account-create-update-n2749\" (UID: \"5ec5e94e-bd18-444a-9340-de9b41934458\") " pod="openstack/barbican-571d-account-create-update-n2749" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.835596 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h6nll\" (UniqueName: \"kubernetes.io/projected/f64b215c-973c-4761-9b13-0510387973ee-kube-api-access-h6nll\") pod \"keystone-db-sync-4nkwm\" (UID: 
\"f64b215c-973c-4761-9b13-0510387973ee\") " pod="openstack/keystone-db-sync-4nkwm" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.836615 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ec5e94e-bd18-444a-9340-de9b41934458-operator-scripts\") pod \"barbican-571d-account-create-update-n2749\" (UID: \"5ec5e94e-bd18-444a-9340-de9b41934458\") " pod="openstack/barbican-571d-account-create-update-n2749" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.840098 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f64b215c-973c-4761-9b13-0510387973ee-combined-ca-bundle\") pod \"keystone-db-sync-4nkwm\" (UID: \"f64b215c-973c-4761-9b13-0510387973ee\") " pod="openstack/keystone-db-sync-4nkwm" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.841605 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f64b215c-973c-4761-9b13-0510387973ee-config-data\") pod \"keystone-db-sync-4nkwm\" (UID: \"f64b215c-973c-4761-9b13-0510387973ee\") " pod="openstack/keystone-db-sync-4nkwm" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.853038 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6nll\" (UniqueName: \"kubernetes.io/projected/f64b215c-973c-4761-9b13-0510387973ee-kube-api-access-h6nll\") pod \"keystone-db-sync-4nkwm\" (UID: \"f64b215c-973c-4761-9b13-0510387973ee\") " pod="openstack/keystone-db-sync-4nkwm" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.860464 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxbhr\" (UniqueName: \"kubernetes.io/projected/5ec5e94e-bd18-444a-9340-de9b41934458-kube-api-access-wxbhr\") pod \"barbican-571d-account-create-update-n2749\" (UID: \"5ec5e94e-bd18-444a-9340-de9b41934458\") " 
pod="openstack/barbican-571d-account-create-update-n2749" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.969385 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-571d-account-create-update-n2749" Mar 12 17:08:02 crc kubenswrapper[5184]: I0312 17:08:02.980103 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-4nkwm" Mar 12 17:08:03 crc kubenswrapper[5184]: I0312 17:08:03.011914 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-66a8-account-create-update-8jj7v"] Mar 12 17:08:03 crc kubenswrapper[5184]: W0312 17:08:03.039747 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69a595e1_ca3b_4932_b9b6_c1c0a237a783.slice/crio-baf98ec641574c9743bb5f5ff44b0ecae5733f4ed275de8906450cb23d1ff785 WatchSource:0}: Error finding container baf98ec641574c9743bb5f5ff44b0ecae5733f4ed275de8906450cb23d1ff785: Status 404 returned error can't find the container with id baf98ec641574c9743bb5f5ff44b0ecae5733f4ed275de8906450cb23d1ff785 Mar 12 17:08:03 crc kubenswrapper[5184]: I0312 17:08:03.178905 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-bb83-account-create-update-szjsd"] Mar 12 17:08:03 crc kubenswrapper[5184]: I0312 17:08:03.257280 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bb83-account-create-update-szjsd" event={"ID":"2aa5400c-5bc3-4cd5-849d-87105da3827b","Type":"ContainerStarted","Data":"a513cd278f8723af64709861658b71949ab7c91af4a92c7825fc9a7a51084cf4"} Mar 12 17:08:03 crc kubenswrapper[5184]: I0312 17:08:03.258713 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-66a8-account-create-update-8jj7v" event={"ID":"69a595e1-ca3b-4932-b9b6-c1c0a237a783","Type":"ContainerStarted","Data":"baf98ec641574c9743bb5f5ff44b0ecae5733f4ed275de8906450cb23d1ff785"} Mar 12 17:08:03 crc 
kubenswrapper[5184]: I0312 17:08:03.267472 5184 generic.go:358] "Generic (PLEG): container finished" podID="4fdadd77-2670-41df-b20b-57b771031dde" containerID="190b64d8e31efaaf582c749433b417fc51e6109d2f8b14bb5db31ef71632d0c1" exitCode=0 Mar 12 17:08:03 crc kubenswrapper[5184]: I0312 17:08:03.267879 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-7g4dn" event={"ID":"4fdadd77-2670-41df-b20b-57b771031dde","Type":"ContainerDied","Data":"190b64d8e31efaaf582c749433b417fc51e6109d2f8b14bb5db31ef71632d0c1"} Mar 12 17:08:03 crc kubenswrapper[5184]: I0312 17:08:03.267904 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-7g4dn" event={"ID":"4fdadd77-2670-41df-b20b-57b771031dde","Type":"ContainerStarted","Data":"7f242abe434a2f1578b3839f754019f2b192e619545b9c2a6965bc033e3d6102"} Mar 12 17:08:03 crc kubenswrapper[5184]: I0312 17:08:03.269910 5184 generic.go:358] "Generic (PLEG): container finished" podID="86dddeb4-4bdf-4457-ae72-4e42fe713b7d" containerID="7365c08678ba86031cf99befe675647320ef5f47affc3d7942e925548c0e2bfd" exitCode=0 Mar 12 17:08:03 crc kubenswrapper[5184]: I0312 17:08:03.270159 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xfj2g" event={"ID":"86dddeb4-4bdf-4457-ae72-4e42fe713b7d","Type":"ContainerDied","Data":"7365c08678ba86031cf99befe675647320ef5f47affc3d7942e925548c0e2bfd"} Mar 12 17:08:03 crc kubenswrapper[5184]: I0312 17:08:03.292240 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-4nkwm"] Mar 12 17:08:03 crc kubenswrapper[5184]: I0312 17:08:03.304745 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-2mwtn"] Mar 12 17:08:03 crc kubenswrapper[5184]: W0312 17:08:03.305454 5184 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf64b215c_973c_4761_9b13_0510387973ee.slice/crio-671a99e947a75dfbe8efd29b0501052a9fbdf3e010c2c303e3350031553c6bb8 WatchSource:0}: Error finding container 671a99e947a75dfbe8efd29b0501052a9fbdf3e010c2c303e3350031553c6bb8: Status 404 returned error can't find the container with id 671a99e947a75dfbe8efd29b0501052a9fbdf3e010c2c303e3350031553c6bb8 Mar 12 17:08:03 crc kubenswrapper[5184]: I0312 17:08:03.307341 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2ffae81b-589d-4502-a0a6-777b8d6f98b1","Type":"ContainerStarted","Data":"15ed489194e97fafe2efa0faddb782c3af3dcca1e3884a2e77a8811bebc276ee"} Mar 12 17:08:03 crc kubenswrapper[5184]: W0312 17:08:03.315243 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21e50fc1_0b61_44f4_922a_acb08efb0796.slice/crio-cf52f21c4b22e458968d7e931183e227e45f31555f137076c59943d002712bad WatchSource:0}: Error finding container cf52f21c4b22e458968d7e931183e227e45f31555f137076c59943d002712bad: Status 404 returned error can't find the container with id cf52f21c4b22e458968d7e931183e227e45f31555f137076c59943d002712bad Mar 12 17:08:03 crc kubenswrapper[5184]: I0312 17:08:03.547192 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-571d-account-create-update-n2749"] Mar 12 17:08:04 crc kubenswrapper[5184]: I0312 17:08:04.324139 5184 generic.go:358] "Generic (PLEG): container finished" podID="2aa5400c-5bc3-4cd5-849d-87105da3827b" containerID="2435994606af1d1b9f68d10111012a0a05fd320a41cde861c73a1f0787efa241" exitCode=0 Mar 12 17:08:04 crc kubenswrapper[5184]: I0312 17:08:04.324253 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bb83-account-create-update-szjsd" 
event={"ID":"2aa5400c-5bc3-4cd5-849d-87105da3827b","Type":"ContainerDied","Data":"2435994606af1d1b9f68d10111012a0a05fd320a41cde861c73a1f0787efa241"} Mar 12 17:08:04 crc kubenswrapper[5184]: I0312 17:08:04.329347 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555588-xpxvn" event={"ID":"552cda96-d016-4ff4-9bc2-9cf835b31dfe","Type":"ContainerStarted","Data":"91ec094eded41029257ca7ab2c02e818b22d55adde67347c58645cfc3ac9ff5e"} Mar 12 17:08:04 crc kubenswrapper[5184]: I0312 17:08:04.332520 5184 generic.go:358] "Generic (PLEG): container finished" podID="69a595e1-ca3b-4932-b9b6-c1c0a237a783" containerID="43a91d1c28e956ebf8ca465cfcb74f879b9307b5b49d2038b37c88be1b1f3488" exitCode=0 Mar 12 17:08:04 crc kubenswrapper[5184]: I0312 17:08:04.332641 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-66a8-account-create-update-8jj7v" event={"ID":"69a595e1-ca3b-4932-b9b6-c1c0a237a783","Type":"ContainerDied","Data":"43a91d1c28e956ebf8ca465cfcb74f879b9307b5b49d2038b37c88be1b1f3488"} Mar 12 17:08:04 crc kubenswrapper[5184]: I0312 17:08:04.334494 5184 generic.go:358] "Generic (PLEG): container finished" podID="5ec5e94e-bd18-444a-9340-de9b41934458" containerID="85425b87026fec016d23203e231b782bbf5d697c2deed2097fea43fa3778e354" exitCode=0 Mar 12 17:08:04 crc kubenswrapper[5184]: I0312 17:08:04.334683 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-571d-account-create-update-n2749" event={"ID":"5ec5e94e-bd18-444a-9340-de9b41934458","Type":"ContainerDied","Data":"85425b87026fec016d23203e231b782bbf5d697c2deed2097fea43fa3778e354"} Mar 12 17:08:04 crc kubenswrapper[5184]: I0312 17:08:04.334709 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-571d-account-create-update-n2749" event={"ID":"5ec5e94e-bd18-444a-9340-de9b41934458","Type":"ContainerStarted","Data":"fd06576138149bac693e67da2d0472c354641c145fd26c93cf26f764643cfd44"} Mar 12 17:08:04 crc 
kubenswrapper[5184]: I0312 17:08:04.361348 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2ffae81b-589d-4502-a0a6-777b8d6f98b1","Type":"ContainerStarted","Data":"78b8dbca53b18aa4e13553553aaa188cce7712aa70606783132c313f7cb64b5b"} Mar 12 17:08:04 crc kubenswrapper[5184]: I0312 17:08:04.361404 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2ffae81b-589d-4502-a0a6-777b8d6f98b1","Type":"ContainerStarted","Data":"6c33bd9140e6b138ccc9c16f4d526b80fb01f0a6d01c800d1be8805078dfc47f"} Mar 12 17:08:04 crc kubenswrapper[5184]: I0312 17:08:04.361415 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2ffae81b-589d-4502-a0a6-777b8d6f98b1","Type":"ContainerStarted","Data":"0792bf198b6530c0f96a7e2e9c2d55be8003aed50668243ee7f2d58fb108056b"} Mar 12 17:08:04 crc kubenswrapper[5184]: I0312 17:08:04.361424 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2ffae81b-589d-4502-a0a6-777b8d6f98b1","Type":"ContainerStarted","Data":"d1c54e15a863e0555b1f8caf73f82bb0cf8f5c7ad8af7d159e49e197569a9faf"} Mar 12 17:08:04 crc kubenswrapper[5184]: I0312 17:08:04.361432 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2ffae81b-589d-4502-a0a6-777b8d6f98b1","Type":"ContainerStarted","Data":"c8ea37f068d46a77ad18e93515745e4d70e33c56e43b55ee116b56ddc26ba4ab"} Mar 12 17:08:04 crc kubenswrapper[5184]: I0312 17:08:04.363237 5184 generic.go:358] "Generic (PLEG): container finished" podID="21e50fc1-0b61-44f4-922a-acb08efb0796" containerID="a014aedd868115a04cf195d20b020078b429ad18e7003faa31fa4edafed7d146" exitCode=0 Mar 12 17:08:04 crc kubenswrapper[5184]: I0312 17:08:04.363306 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2mwtn" 
event={"ID":"21e50fc1-0b61-44f4-922a-acb08efb0796","Type":"ContainerDied","Data":"a014aedd868115a04cf195d20b020078b429ad18e7003faa31fa4edafed7d146"} Mar 12 17:08:04 crc kubenswrapper[5184]: I0312 17:08:04.363326 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2mwtn" event={"ID":"21e50fc1-0b61-44f4-922a-acb08efb0796","Type":"ContainerStarted","Data":"cf52f21c4b22e458968d7e931183e227e45f31555f137076c59943d002712bad"} Mar 12 17:08:04 crc kubenswrapper[5184]: I0312 17:08:04.366385 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555588-xpxvn" podStartSLOduration=2.164591015 podStartE2EDuration="4.366363968s" podCreationTimestamp="2026-03-12 17:08:00 +0000 UTC" firstStartedPulling="2026-03-12 17:08:01.230192699 +0000 UTC m=+1023.771504038" lastFinishedPulling="2026-03-12 17:08:03.431965652 +0000 UTC m=+1025.973276991" observedRunningTime="2026-03-12 17:08:04.365687016 +0000 UTC m=+1026.906998355" watchObservedRunningTime="2026-03-12 17:08:04.366363968 +0000 UTC m=+1026.907675307" Mar 12 17:08:04 crc kubenswrapper[5184]: I0312 17:08:04.370197 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4nkwm" event={"ID":"f64b215c-973c-4761-9b13-0510387973ee","Type":"ContainerStarted","Data":"671a99e947a75dfbe8efd29b0501052a9fbdf3e010c2c303e3350031553c6bb8"} Mar 12 17:08:04 crc kubenswrapper[5184]: I0312 17:08:04.837832 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-7g4dn" Mar 12 17:08:04 crc kubenswrapper[5184]: I0312 17:08:04.852367 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-xfj2g" Mar 12 17:08:05 crc kubenswrapper[5184]: I0312 17:08:05.001733 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xgr6\" (UniqueName: \"kubernetes.io/projected/86dddeb4-4bdf-4457-ae72-4e42fe713b7d-kube-api-access-5xgr6\") pod \"86dddeb4-4bdf-4457-ae72-4e42fe713b7d\" (UID: \"86dddeb4-4bdf-4457-ae72-4e42fe713b7d\") " Mar 12 17:08:05 crc kubenswrapper[5184]: I0312 17:08:05.001810 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86dddeb4-4bdf-4457-ae72-4e42fe713b7d-operator-scripts\") pod \"86dddeb4-4bdf-4457-ae72-4e42fe713b7d\" (UID: \"86dddeb4-4bdf-4457-ae72-4e42fe713b7d\") " Mar 12 17:08:05 crc kubenswrapper[5184]: I0312 17:08:05.001847 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgmps\" (UniqueName: \"kubernetes.io/projected/4fdadd77-2670-41df-b20b-57b771031dde-kube-api-access-mgmps\") pod \"4fdadd77-2670-41df-b20b-57b771031dde\" (UID: \"4fdadd77-2670-41df-b20b-57b771031dde\") " Mar 12 17:08:05 crc kubenswrapper[5184]: I0312 17:08:05.002049 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fdadd77-2670-41df-b20b-57b771031dde-operator-scripts\") pod \"4fdadd77-2670-41df-b20b-57b771031dde\" (UID: \"4fdadd77-2670-41df-b20b-57b771031dde\") " Mar 12 17:08:05 crc kubenswrapper[5184]: I0312 17:08:05.002226 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86dddeb4-4bdf-4457-ae72-4e42fe713b7d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "86dddeb4-4bdf-4457-ae72-4e42fe713b7d" (UID: "86dddeb4-4bdf-4457-ae72-4e42fe713b7d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:08:05 crc kubenswrapper[5184]: I0312 17:08:05.002686 5184 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86dddeb4-4bdf-4457-ae72-4e42fe713b7d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:05 crc kubenswrapper[5184]: I0312 17:08:05.003442 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fdadd77-2670-41df-b20b-57b771031dde-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4fdadd77-2670-41df-b20b-57b771031dde" (UID: "4fdadd77-2670-41df-b20b-57b771031dde"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:08:05 crc kubenswrapper[5184]: I0312 17:08:05.007469 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86dddeb4-4bdf-4457-ae72-4e42fe713b7d-kube-api-access-5xgr6" (OuterVolumeSpecName: "kube-api-access-5xgr6") pod "86dddeb4-4bdf-4457-ae72-4e42fe713b7d" (UID: "86dddeb4-4bdf-4457-ae72-4e42fe713b7d"). InnerVolumeSpecName "kube-api-access-5xgr6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:08:05 crc kubenswrapper[5184]: I0312 17:08:05.008809 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fdadd77-2670-41df-b20b-57b771031dde-kube-api-access-mgmps" (OuterVolumeSpecName: "kube-api-access-mgmps") pod "4fdadd77-2670-41df-b20b-57b771031dde" (UID: "4fdadd77-2670-41df-b20b-57b771031dde"). InnerVolumeSpecName "kube-api-access-mgmps". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:08:05 crc kubenswrapper[5184]: I0312 17:08:05.104054 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5xgr6\" (UniqueName: \"kubernetes.io/projected/86dddeb4-4bdf-4457-ae72-4e42fe713b7d-kube-api-access-5xgr6\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:05 crc kubenswrapper[5184]: I0312 17:08:05.104274 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mgmps\" (UniqueName: \"kubernetes.io/projected/4fdadd77-2670-41df-b20b-57b771031dde-kube-api-access-mgmps\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:05 crc kubenswrapper[5184]: I0312 17:08:05.104284 5184 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fdadd77-2670-41df-b20b-57b771031dde-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:05 crc kubenswrapper[5184]: I0312 17:08:05.404561 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"2ffae81b-589d-4502-a0a6-777b8d6f98b1","Type":"ContainerStarted","Data":"f28aed215711677387392f9b7e144e277a42128cef4f80c2232dc24835812189"} Mar 12 17:08:05 crc kubenswrapper[5184]: I0312 17:08:05.410474 5184 generic.go:358] "Generic (PLEG): container finished" podID="552cda96-d016-4ff4-9bc2-9cf835b31dfe" containerID="91ec094eded41029257ca7ab2c02e818b22d55adde67347c58645cfc3ac9ff5e" exitCode=0 Mar 12 17:08:05 crc kubenswrapper[5184]: I0312 17:08:05.410535 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555588-xpxvn" event={"ID":"552cda96-d016-4ff4-9bc2-9cf835b31dfe","Type":"ContainerDied","Data":"91ec094eded41029257ca7ab2c02e818b22d55adde67347c58645cfc3ac9ff5e"} Mar 12 17:08:05 crc kubenswrapper[5184]: I0312 17:08:05.414897 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-7g4dn" Mar 12 17:08:05 crc kubenswrapper[5184]: I0312 17:08:05.416572 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-7g4dn" event={"ID":"4fdadd77-2670-41df-b20b-57b771031dde","Type":"ContainerDied","Data":"7f242abe434a2f1578b3839f754019f2b192e619545b9c2a6965bc033e3d6102"} Mar 12 17:08:05 crc kubenswrapper[5184]: I0312 17:08:05.416607 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f242abe434a2f1578b3839f754019f2b192e619545b9c2a6965bc033e3d6102" Mar 12 17:08:05 crc kubenswrapper[5184]: I0312 17:08:05.418314 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xfj2g" event={"ID":"86dddeb4-4bdf-4457-ae72-4e42fe713b7d","Type":"ContainerDied","Data":"8d0c26a86e91cfe10044d71946dd44a8ad12817f811c9b04a7bdce09ca688396"} Mar 12 17:08:05 crc kubenswrapper[5184]: I0312 17:08:05.418414 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d0c26a86e91cfe10044d71946dd44a8ad12817f811c9b04a7bdce09ca688396" Mar 12 17:08:05 crc kubenswrapper[5184]: I0312 17:08:05.418816 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-xfj2g" Mar 12 17:08:05 crc kubenswrapper[5184]: I0312 17:08:05.451885 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=21.453344035 podStartE2EDuration="37.4518601s" podCreationTimestamp="2026-03-12 17:07:28 +0000 UTC" firstStartedPulling="2026-03-12 17:07:46.760949766 +0000 UTC m=+1009.302261095" lastFinishedPulling="2026-03-12 17:08:02.759465821 +0000 UTC m=+1025.300777160" observedRunningTime="2026-03-12 17:08:05.447247594 +0000 UTC m=+1027.988558973" watchObservedRunningTime="2026-03-12 17:08:05.4518601 +0000 UTC m=+1027.993171439" Mar 12 17:08:05 crc kubenswrapper[5184]: I0312 17:08:05.752955 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b4dccff87-s64b2"] Mar 12 17:08:05 crc kubenswrapper[5184]: I0312 17:08:05.753894 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86dddeb4-4bdf-4457-ae72-4e42fe713b7d" containerName="mariadb-database-create" Mar 12 17:08:05 crc kubenswrapper[5184]: I0312 17:08:05.753907 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="86dddeb4-4bdf-4457-ae72-4e42fe713b7d" containerName="mariadb-database-create" Mar 12 17:08:05 crc kubenswrapper[5184]: I0312 17:08:05.753928 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4fdadd77-2670-41df-b20b-57b771031dde" containerName="mariadb-database-create" Mar 12 17:08:05 crc kubenswrapper[5184]: I0312 17:08:05.753934 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fdadd77-2670-41df-b20b-57b771031dde" containerName="mariadb-database-create" Mar 12 17:08:05 crc kubenswrapper[5184]: I0312 17:08:05.754095 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="4fdadd77-2670-41df-b20b-57b771031dde" containerName="mariadb-database-create" Mar 12 17:08:05 crc kubenswrapper[5184]: I0312 17:08:05.754110 5184 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="86dddeb4-4bdf-4457-ae72-4e42fe713b7d" containerName="mariadb-database-create" Mar 12 17:08:05 crc kubenswrapper[5184]: I0312 17:08:05.760291 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b4dccff87-s64b2" Mar 12 17:08:05 crc kubenswrapper[5184]: I0312 17:08:05.763688 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"dns-swift-storage-0\"" Mar 12 17:08:05 crc kubenswrapper[5184]: I0312 17:08:05.766263 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b4dccff87-s64b2"] Mar 12 17:08:05 crc kubenswrapper[5184]: I0312 17:08:05.925700 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b11174d1-7d67-495b-b0fd-8c9be89452a8-ovsdbserver-sb\") pod \"dnsmasq-dns-7b4dccff87-s64b2\" (UID: \"b11174d1-7d67-495b-b0fd-8c9be89452a8\") " pod="openstack/dnsmasq-dns-7b4dccff87-s64b2" Mar 12 17:08:05 crc kubenswrapper[5184]: I0312 17:08:05.925985 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b11174d1-7d67-495b-b0fd-8c9be89452a8-ovsdbserver-nb\") pod \"dnsmasq-dns-7b4dccff87-s64b2\" (UID: \"b11174d1-7d67-495b-b0fd-8c9be89452a8\") " pod="openstack/dnsmasq-dns-7b4dccff87-s64b2" Mar 12 17:08:05 crc kubenswrapper[5184]: I0312 17:08:05.926044 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b11174d1-7d67-495b-b0fd-8c9be89452a8-dns-svc\") pod \"dnsmasq-dns-7b4dccff87-s64b2\" (UID: \"b11174d1-7d67-495b-b0fd-8c9be89452a8\") " pod="openstack/dnsmasq-dns-7b4dccff87-s64b2" Mar 12 17:08:05 crc kubenswrapper[5184]: I0312 17:08:05.926070 5184 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q29b4\" (UniqueName: \"kubernetes.io/projected/b11174d1-7d67-495b-b0fd-8c9be89452a8-kube-api-access-q29b4\") pod \"dnsmasq-dns-7b4dccff87-s64b2\" (UID: \"b11174d1-7d67-495b-b0fd-8c9be89452a8\") " pod="openstack/dnsmasq-dns-7b4dccff87-s64b2" Mar 12 17:08:05 crc kubenswrapper[5184]: I0312 17:08:05.926090 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b11174d1-7d67-495b-b0fd-8c9be89452a8-dns-swift-storage-0\") pod \"dnsmasq-dns-7b4dccff87-s64b2\" (UID: \"b11174d1-7d67-495b-b0fd-8c9be89452a8\") " pod="openstack/dnsmasq-dns-7b4dccff87-s64b2" Mar 12 17:08:05 crc kubenswrapper[5184]: I0312 17:08:05.926122 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b11174d1-7d67-495b-b0fd-8c9be89452a8-config\") pod \"dnsmasq-dns-7b4dccff87-s64b2\" (UID: \"b11174d1-7d67-495b-b0fd-8c9be89452a8\") " pod="openstack/dnsmasq-dns-7b4dccff87-s64b2" Mar 12 17:08:06 crc kubenswrapper[5184]: I0312 17:08:06.027670 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b11174d1-7d67-495b-b0fd-8c9be89452a8-ovsdbserver-sb\") pod \"dnsmasq-dns-7b4dccff87-s64b2\" (UID: \"b11174d1-7d67-495b-b0fd-8c9be89452a8\") " pod="openstack/dnsmasq-dns-7b4dccff87-s64b2" Mar 12 17:08:06 crc kubenswrapper[5184]: I0312 17:08:06.027733 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b11174d1-7d67-495b-b0fd-8c9be89452a8-ovsdbserver-nb\") pod \"dnsmasq-dns-7b4dccff87-s64b2\" (UID: \"b11174d1-7d67-495b-b0fd-8c9be89452a8\") " pod="openstack/dnsmasq-dns-7b4dccff87-s64b2" Mar 12 17:08:06 crc kubenswrapper[5184]: I0312 17:08:06.027814 5184 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b11174d1-7d67-495b-b0fd-8c9be89452a8-dns-svc\") pod \"dnsmasq-dns-7b4dccff87-s64b2\" (UID: \"b11174d1-7d67-495b-b0fd-8c9be89452a8\") " pod="openstack/dnsmasq-dns-7b4dccff87-s64b2" Mar 12 17:08:06 crc kubenswrapper[5184]: I0312 17:08:06.027848 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q29b4\" (UniqueName: \"kubernetes.io/projected/b11174d1-7d67-495b-b0fd-8c9be89452a8-kube-api-access-q29b4\") pod \"dnsmasq-dns-7b4dccff87-s64b2\" (UID: \"b11174d1-7d67-495b-b0fd-8c9be89452a8\") " pod="openstack/dnsmasq-dns-7b4dccff87-s64b2" Mar 12 17:08:06 crc kubenswrapper[5184]: I0312 17:08:06.027874 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b11174d1-7d67-495b-b0fd-8c9be89452a8-dns-swift-storage-0\") pod \"dnsmasq-dns-7b4dccff87-s64b2\" (UID: \"b11174d1-7d67-495b-b0fd-8c9be89452a8\") " pod="openstack/dnsmasq-dns-7b4dccff87-s64b2" Mar 12 17:08:06 crc kubenswrapper[5184]: I0312 17:08:06.027918 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b11174d1-7d67-495b-b0fd-8c9be89452a8-config\") pod \"dnsmasq-dns-7b4dccff87-s64b2\" (UID: \"b11174d1-7d67-495b-b0fd-8c9be89452a8\") " pod="openstack/dnsmasq-dns-7b4dccff87-s64b2" Mar 12 17:08:06 crc kubenswrapper[5184]: I0312 17:08:06.029039 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b11174d1-7d67-495b-b0fd-8c9be89452a8-config\") pod \"dnsmasq-dns-7b4dccff87-s64b2\" (UID: \"b11174d1-7d67-495b-b0fd-8c9be89452a8\") " pod="openstack/dnsmasq-dns-7b4dccff87-s64b2" Mar 12 17:08:06 crc kubenswrapper[5184]: I0312 17:08:06.029050 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b11174d1-7d67-495b-b0fd-8c9be89452a8-ovsdbserver-sb\") pod \"dnsmasq-dns-7b4dccff87-s64b2\" (UID: \"b11174d1-7d67-495b-b0fd-8c9be89452a8\") " pod="openstack/dnsmasq-dns-7b4dccff87-s64b2" Mar 12 17:08:06 crc kubenswrapper[5184]: I0312 17:08:06.029834 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b11174d1-7d67-495b-b0fd-8c9be89452a8-ovsdbserver-nb\") pod \"dnsmasq-dns-7b4dccff87-s64b2\" (UID: \"b11174d1-7d67-495b-b0fd-8c9be89452a8\") " pod="openstack/dnsmasq-dns-7b4dccff87-s64b2" Mar 12 17:08:06 crc kubenswrapper[5184]: I0312 17:08:06.030254 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b11174d1-7d67-495b-b0fd-8c9be89452a8-dns-swift-storage-0\") pod \"dnsmasq-dns-7b4dccff87-s64b2\" (UID: \"b11174d1-7d67-495b-b0fd-8c9be89452a8\") " pod="openstack/dnsmasq-dns-7b4dccff87-s64b2" Mar 12 17:08:06 crc kubenswrapper[5184]: I0312 17:08:06.031562 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b11174d1-7d67-495b-b0fd-8c9be89452a8-dns-svc\") pod \"dnsmasq-dns-7b4dccff87-s64b2\" (UID: \"b11174d1-7d67-495b-b0fd-8c9be89452a8\") " pod="openstack/dnsmasq-dns-7b4dccff87-s64b2" Mar 12 17:08:06 crc kubenswrapper[5184]: I0312 17:08:06.048605 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q29b4\" (UniqueName: \"kubernetes.io/projected/b11174d1-7d67-495b-b0fd-8c9be89452a8-kube-api-access-q29b4\") pod \"dnsmasq-dns-7b4dccff87-s64b2\" (UID: \"b11174d1-7d67-495b-b0fd-8c9be89452a8\") " pod="openstack/dnsmasq-dns-7b4dccff87-s64b2" Mar 12 17:08:06 crc kubenswrapper[5184]: I0312 17:08:06.086514 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b4dccff87-s64b2" Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.444206 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-bb83-account-create-update-szjsd" Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.453819 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bb83-account-create-update-szjsd" event={"ID":"2aa5400c-5bc3-4cd5-849d-87105da3827b","Type":"ContainerDied","Data":"a513cd278f8723af64709861658b71949ab7c91af4a92c7825fc9a7a51084cf4"} Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.454122 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a513cd278f8723af64709861658b71949ab7c91af4a92c7825fc9a7a51084cf4" Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.454091 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-bb83-account-create-update-szjsd" Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.456492 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555588-xpxvn" event={"ID":"552cda96-d016-4ff4-9bc2-9cf835b31dfe","Type":"ContainerDied","Data":"3b366d631d4b86c8ceb0b2f9dc043169ac86cb1a1d06f0868f12ad4155f22365"} Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.456534 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b366d631d4b86c8ceb0b2f9dc043169ac86cb1a1d06f0868f12ad4155f22365" Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.458862 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-66a8-account-create-update-8jj7v" event={"ID":"69a595e1-ca3b-4932-b9b6-c1c0a237a783","Type":"ContainerDied","Data":"baf98ec641574c9743bb5f5ff44b0ecae5733f4ed275de8906450cb23d1ff785"} Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.458897 5184 pod_container_deletor.go:80] "Container not found in 
pod's containers" containerID="baf98ec641574c9743bb5f5ff44b0ecae5733f4ed275de8906450cb23d1ff785" Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.460637 5184 generic.go:358] "Generic (PLEG): container finished" podID="2f4b74dc-78a2-4b8c-8b52-cb972e894961" containerID="a57617551bbfe3d89bcb379f1587ec00ad6d87f2b8ea43063d9858e615e2d4e8" exitCode=0 Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.460714 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-7fxbs" event={"ID":"2f4b74dc-78a2-4b8c-8b52-cb972e894961","Type":"ContainerDied","Data":"a57617551bbfe3d89bcb379f1587ec00ad6d87f2b8ea43063d9858e615e2d4e8"} Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.462910 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-571d-account-create-update-n2749" event={"ID":"5ec5e94e-bd18-444a-9340-de9b41934458","Type":"ContainerDied","Data":"fd06576138149bac693e67da2d0472c354641c145fd26c93cf26f764643cfd44"} Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.462947 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd06576138149bac693e67da2d0472c354641c145fd26c93cf26f764643cfd44" Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.464403 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2mwtn" event={"ID":"21e50fc1-0b61-44f4-922a-acb08efb0796","Type":"ContainerDied","Data":"cf52f21c4b22e458968d7e931183e227e45f31555f137076c59943d002712bad"} Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.464427 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf52f21c4b22e458968d7e931183e227e45f31555f137076c59943d002712bad" Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.480663 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-66a8-account-create-update-8jj7v"
Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.511835 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-571d-account-create-update-n2749"
Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.524173 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555588-xpxvn"
Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.535081 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2mwtn"
Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.569751 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcpcv\" (UniqueName: \"kubernetes.io/projected/69a595e1-ca3b-4932-b9b6-c1c0a237a783-kube-api-access-hcpcv\") pod \"69a595e1-ca3b-4932-b9b6-c1c0a237a783\" (UID: \"69a595e1-ca3b-4932-b9b6-c1c0a237a783\") "
Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.569853 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2aa5400c-5bc3-4cd5-849d-87105da3827b-operator-scripts\") pod \"2aa5400c-5bc3-4cd5-849d-87105da3827b\" (UID: \"2aa5400c-5bc3-4cd5-849d-87105da3827b\") "
Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.569901 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2npr\" (UniqueName: \"kubernetes.io/projected/2aa5400c-5bc3-4cd5-849d-87105da3827b-kube-api-access-c2npr\") pod \"2aa5400c-5bc3-4cd5-849d-87105da3827b\" (UID: \"2aa5400c-5bc3-4cd5-849d-87105da3827b\") "
Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.569918 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69a595e1-ca3b-4932-b9b6-c1c0a237a783-operator-scripts\") pod \"69a595e1-ca3b-4932-b9b6-c1c0a237a783\" (UID: \"69a595e1-ca3b-4932-b9b6-c1c0a237a783\") "
Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.571103 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2aa5400c-5bc3-4cd5-849d-87105da3827b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2aa5400c-5bc3-4cd5-849d-87105da3827b" (UID: "2aa5400c-5bc3-4cd5-849d-87105da3827b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.571958 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69a595e1-ca3b-4932-b9b6-c1c0a237a783-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "69a595e1-ca3b-4932-b9b6-c1c0a237a783" (UID: "69a595e1-ca3b-4932-b9b6-c1c0a237a783"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.576977 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aa5400c-5bc3-4cd5-849d-87105da3827b-kube-api-access-c2npr" (OuterVolumeSpecName: "kube-api-access-c2npr") pod "2aa5400c-5bc3-4cd5-849d-87105da3827b" (UID: "2aa5400c-5bc3-4cd5-849d-87105da3827b"). InnerVolumeSpecName "kube-api-access-c2npr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.583284 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69a595e1-ca3b-4932-b9b6-c1c0a237a783-kube-api-access-hcpcv" (OuterVolumeSpecName: "kube-api-access-hcpcv") pod "69a595e1-ca3b-4932-b9b6-c1c0a237a783" (UID: "69a595e1-ca3b-4932-b9b6-c1c0a237a783"). InnerVolumeSpecName "kube-api-access-hcpcv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.671822 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxbhr\" (UniqueName: \"kubernetes.io/projected/5ec5e94e-bd18-444a-9340-de9b41934458-kube-api-access-wxbhr\") pod \"5ec5e94e-bd18-444a-9340-de9b41934458\" (UID: \"5ec5e94e-bd18-444a-9340-de9b41934458\") "
Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.672189 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ec5e94e-bd18-444a-9340-de9b41934458-operator-scripts\") pod \"5ec5e94e-bd18-444a-9340-de9b41934458\" (UID: \"5ec5e94e-bd18-444a-9340-de9b41934458\") "
Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.672293 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21e50fc1-0b61-44f4-922a-acb08efb0796-operator-scripts\") pod \"21e50fc1-0b61-44f4-922a-acb08efb0796\" (UID: \"21e50fc1-0b61-44f4-922a-acb08efb0796\") "
Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.672425 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxfpm\" (UniqueName: \"kubernetes.io/projected/552cda96-d016-4ff4-9bc2-9cf835b31dfe-kube-api-access-wxfpm\") pod \"552cda96-d016-4ff4-9bc2-9cf835b31dfe\" (UID: \"552cda96-d016-4ff4-9bc2-9cf835b31dfe\") "
Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.672560 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2qrq\" (UniqueName: \"kubernetes.io/projected/21e50fc1-0b61-44f4-922a-acb08efb0796-kube-api-access-t2qrq\") pod \"21e50fc1-0b61-44f4-922a-acb08efb0796\" (UID: \"21e50fc1-0b61-44f4-922a-acb08efb0796\") "
Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.672941 5184 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2aa5400c-5bc3-4cd5-849d-87105da3827b-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.673027 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c2npr\" (UniqueName: \"kubernetes.io/projected/2aa5400c-5bc3-4cd5-849d-87105da3827b-kube-api-access-c2npr\") on node \"crc\" DevicePath \"\""
Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.673092 5184 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69a595e1-ca3b-4932-b9b6-c1c0a237a783-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.673149 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hcpcv\" (UniqueName: \"kubernetes.io/projected/69a595e1-ca3b-4932-b9b6-c1c0a237a783-kube-api-access-hcpcv\") on node \"crc\" DevicePath \"\""
Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.673902 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ec5e94e-bd18-444a-9340-de9b41934458-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5ec5e94e-bd18-444a-9340-de9b41934458" (UID: "5ec5e94e-bd18-444a-9340-de9b41934458"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.674098 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21e50fc1-0b61-44f4-922a-acb08efb0796-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "21e50fc1-0b61-44f4-922a-acb08efb0796" (UID: "21e50fc1-0b61-44f4-922a-acb08efb0796"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.676359 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21e50fc1-0b61-44f4-922a-acb08efb0796-kube-api-access-t2qrq" (OuterVolumeSpecName: "kube-api-access-t2qrq") pod "21e50fc1-0b61-44f4-922a-acb08efb0796" (UID: "21e50fc1-0b61-44f4-922a-acb08efb0796"). InnerVolumeSpecName "kube-api-access-t2qrq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.676845 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ec5e94e-bd18-444a-9340-de9b41934458-kube-api-access-wxbhr" (OuterVolumeSpecName: "kube-api-access-wxbhr") pod "5ec5e94e-bd18-444a-9340-de9b41934458" (UID: "5ec5e94e-bd18-444a-9340-de9b41934458"). InnerVolumeSpecName "kube-api-access-wxbhr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.677841 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/552cda96-d016-4ff4-9bc2-9cf835b31dfe-kube-api-access-wxfpm" (OuterVolumeSpecName: "kube-api-access-wxfpm") pod "552cda96-d016-4ff4-9bc2-9cf835b31dfe" (UID: "552cda96-d016-4ff4-9bc2-9cf835b31dfe"). InnerVolumeSpecName "kube-api-access-wxfpm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.699070 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b4dccff87-s64b2"]
Mar 12 17:08:08 crc kubenswrapper[5184]: W0312 17:08:08.699426 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb11174d1_7d67_495b_b0fd_8c9be89452a8.slice/crio-efc019269bf0c866ca400c8f1c777de1b2563517b03791f7b01d7731ea57671e WatchSource:0}: Error finding container efc019269bf0c866ca400c8f1c777de1b2563517b03791f7b01d7731ea57671e: Status 404 returned error can't find the container with id efc019269bf0c866ca400c8f1c777de1b2563517b03791f7b01d7731ea57671e
Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.774946 5184 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ec5e94e-bd18-444a-9340-de9b41934458-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.774986 5184 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21e50fc1-0b61-44f4-922a-acb08efb0796-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.774996 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wxfpm\" (UniqueName: \"kubernetes.io/projected/552cda96-d016-4ff4-9bc2-9cf835b31dfe-kube-api-access-wxfpm\") on node \"crc\" DevicePath \"\""
Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.775008 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t2qrq\" (UniqueName: \"kubernetes.io/projected/21e50fc1-0b61-44f4-922a-acb08efb0796-kube-api-access-t2qrq\") on node \"crc\" DevicePath \"\""
Mar 12 17:08:08 crc kubenswrapper[5184]: I0312 17:08:08.775016 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wxbhr\" (UniqueName: \"kubernetes.io/projected/5ec5e94e-bd18-444a-9340-de9b41934458-kube-api-access-wxbhr\") on node \"crc\" DevicePath \"\""
Mar 12 17:08:09 crc kubenswrapper[5184]: I0312 17:08:09.478131 5184 generic.go:358] "Generic (PLEG): container finished" podID="b11174d1-7d67-495b-b0fd-8c9be89452a8" containerID="1d8538ddfcc6712f362385a11849845cf0dd63ae7f5b2416198aa63ce5838104" exitCode=0
Mar 12 17:08:09 crc kubenswrapper[5184]: I0312 17:08:09.478752 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b4dccff87-s64b2" event={"ID":"b11174d1-7d67-495b-b0fd-8c9be89452a8","Type":"ContainerDied","Data":"1d8538ddfcc6712f362385a11849845cf0dd63ae7f5b2416198aa63ce5838104"}
Mar 12 17:08:09 crc kubenswrapper[5184]: I0312 17:08:09.478793 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b4dccff87-s64b2" event={"ID":"b11174d1-7d67-495b-b0fd-8c9be89452a8","Type":"ContainerStarted","Data":"efc019269bf0c866ca400c8f1c777de1b2563517b03791f7b01d7731ea57671e"}
Mar 12 17:08:09 crc kubenswrapper[5184]: I0312 17:08:09.484430 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4nkwm" event={"ID":"f64b215c-973c-4761-9b13-0510387973ee","Type":"ContainerStarted","Data":"7947b4579198bd3277c8220ef4c1fdd253647feb0f12ccb500e53f81aa05d8dc"}
Mar 12 17:08:09 crc kubenswrapper[5184]: I0312 17:08:09.484762 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-66a8-account-create-update-8jj7v"
Mar 12 17:08:09 crc kubenswrapper[5184]: I0312 17:08:09.485220 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-571d-account-create-update-n2749"
Mar 12 17:08:09 crc kubenswrapper[5184]: I0312 17:08:09.491992 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2mwtn"
Mar 12 17:08:09 crc kubenswrapper[5184]: I0312 17:08:09.492273 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555588-xpxvn"
Mar 12 17:08:09 crc kubenswrapper[5184]: I0312 17:08:09.555459 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-4nkwm" podStartSLOduration=3.556408931 podStartE2EDuration="8.555432283s" podCreationTimestamp="2026-03-12 17:08:01 +0000 UTC" firstStartedPulling="2026-03-12 17:08:03.310821319 +0000 UTC m=+1025.852132658" lastFinishedPulling="2026-03-12 17:08:08.309844631 +0000 UTC m=+1030.851156010" observedRunningTime="2026-03-12 17:08:09.539778191 +0000 UTC m=+1032.081089550" watchObservedRunningTime="2026-03-12 17:08:09.555432283 +0000 UTC m=+1032.096743632"
Mar 12 17:08:09 crc kubenswrapper[5184]: I0312 17:08:09.629669 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555582-vm6nb"]
Mar 12 17:08:09 crc kubenswrapper[5184]: I0312 17:08:09.635715 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555582-vm6nb"]
Mar 12 17:08:09 crc kubenswrapper[5184]: I0312 17:08:09.932140 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-7fxbs"
Mar 12 17:08:10 crc kubenswrapper[5184]: I0312 17:08:10.102953 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f4b74dc-78a2-4b8c-8b52-cb972e894961-config-data\") pod \"2f4b74dc-78a2-4b8c-8b52-cb972e894961\" (UID: \"2f4b74dc-78a2-4b8c-8b52-cb972e894961\") "
Mar 12 17:08:10 crc kubenswrapper[5184]: I0312 17:08:10.103060 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2f4b74dc-78a2-4b8c-8b52-cb972e894961-db-sync-config-data\") pod \"2f4b74dc-78a2-4b8c-8b52-cb972e894961\" (UID: \"2f4b74dc-78a2-4b8c-8b52-cb972e894961\") "
Mar 12 17:08:10 crc kubenswrapper[5184]: I0312 17:08:10.103106 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hz6ln\" (UniqueName: \"kubernetes.io/projected/2f4b74dc-78a2-4b8c-8b52-cb972e894961-kube-api-access-hz6ln\") pod \"2f4b74dc-78a2-4b8c-8b52-cb972e894961\" (UID: \"2f4b74dc-78a2-4b8c-8b52-cb972e894961\") "
Mar 12 17:08:10 crc kubenswrapper[5184]: I0312 17:08:10.103247 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f4b74dc-78a2-4b8c-8b52-cb972e894961-combined-ca-bundle\") pod \"2f4b74dc-78a2-4b8c-8b52-cb972e894961\" (UID: \"2f4b74dc-78a2-4b8c-8b52-cb972e894961\") "
Mar 12 17:08:10 crc kubenswrapper[5184]: I0312 17:08:10.108559 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f4b74dc-78a2-4b8c-8b52-cb972e894961-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2f4b74dc-78a2-4b8c-8b52-cb972e894961" (UID: "2f4b74dc-78a2-4b8c-8b52-cb972e894961"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 17:08:10 crc kubenswrapper[5184]: I0312 17:08:10.110167 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f4b74dc-78a2-4b8c-8b52-cb972e894961-kube-api-access-hz6ln" (OuterVolumeSpecName: "kube-api-access-hz6ln") pod "2f4b74dc-78a2-4b8c-8b52-cb972e894961" (UID: "2f4b74dc-78a2-4b8c-8b52-cb972e894961"). InnerVolumeSpecName "kube-api-access-hz6ln". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:08:10 crc kubenswrapper[5184]: I0312 17:08:10.149439 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f4b74dc-78a2-4b8c-8b52-cb972e894961-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f4b74dc-78a2-4b8c-8b52-cb972e894961" (UID: "2f4b74dc-78a2-4b8c-8b52-cb972e894961"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 17:08:10 crc kubenswrapper[5184]: I0312 17:08:10.177556 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f4b74dc-78a2-4b8c-8b52-cb972e894961-config-data" (OuterVolumeSpecName: "config-data") pod "2f4b74dc-78a2-4b8c-8b52-cb972e894961" (UID: "2f4b74dc-78a2-4b8c-8b52-cb972e894961"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 17:08:10 crc kubenswrapper[5184]: I0312 17:08:10.205448 5184 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f4b74dc-78a2-4b8c-8b52-cb972e894961-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 17:08:10 crc kubenswrapper[5184]: I0312 17:08:10.205484 5184 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f4b74dc-78a2-4b8c-8b52-cb972e894961-config-data\") on node \"crc\" DevicePath \"\""
Mar 12 17:08:10 crc kubenswrapper[5184]: I0312 17:08:10.205513 5184 reconciler_common.go:299] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2f4b74dc-78a2-4b8c-8b52-cb972e894961-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 12 17:08:10 crc kubenswrapper[5184]: I0312 17:08:10.205529 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hz6ln\" (UniqueName: \"kubernetes.io/projected/2f4b74dc-78a2-4b8c-8b52-cb972e894961-kube-api-access-hz6ln\") on node \"crc\" DevicePath \"\""
Mar 12 17:08:10 crc kubenswrapper[5184]: I0312 17:08:10.413228 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92eba052-9e58-4d2a-a414-529b609345da" path="/var/lib/kubelet/pods/92eba052-9e58-4d2a-a414-529b609345da/volumes"
Mar 12 17:08:10 crc kubenswrapper[5184]: I0312 17:08:10.502778 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-7fxbs"
Mar 12 17:08:10 crc kubenswrapper[5184]: I0312 17:08:10.502988 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-7fxbs" event={"ID":"2f4b74dc-78a2-4b8c-8b52-cb972e894961","Type":"ContainerDied","Data":"4514b87ab42560bca1cb8057ec6b8e6888c8365e7b8f1271c699a248bb073ae7"}
Mar 12 17:08:10 crc kubenswrapper[5184]: I0312 17:08:10.503054 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4514b87ab42560bca1cb8057ec6b8e6888c8365e7b8f1271c699a248bb073ae7"
Mar 12 17:08:10 crc kubenswrapper[5184]: I0312 17:08:10.515586 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b4dccff87-s64b2" event={"ID":"b11174d1-7d67-495b-b0fd-8c9be89452a8","Type":"ContainerStarted","Data":"f0a0baa2408a1fbac120d541b90397b1f27cddb86109e91cc19eb33eecc627be"}
Mar 12 17:08:10 crc kubenswrapper[5184]: I0312 17:08:10.515649 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/dnsmasq-dns-7b4dccff87-s64b2"
Mar 12 17:08:10 crc kubenswrapper[5184]: I0312 17:08:10.555929 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b4dccff87-s64b2" podStartSLOduration=5.55591085 podStartE2EDuration="5.55591085s" podCreationTimestamp="2026-03-12 17:08:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:08:10.548723303 +0000 UTC m=+1033.090034652" watchObservedRunningTime="2026-03-12 17:08:10.55591085 +0000 UTC m=+1033.097222199"
Mar 12 17:08:10 crc kubenswrapper[5184]: I0312 17:08:10.956779 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b4dccff87-s64b2"]
Mar 12 17:08:11 crc kubenswrapper[5184]: I0312 17:08:11.041766 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b9d876767-jcfwx"]
Mar 12 17:08:11 crc kubenswrapper[5184]: I0312 17:08:11.042715 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="21e50fc1-0b61-44f4-922a-acb08efb0796" containerName="mariadb-database-create"
Mar 12 17:08:11 crc kubenswrapper[5184]: I0312 17:08:11.042732 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="21e50fc1-0b61-44f4-922a-acb08efb0796" containerName="mariadb-database-create"
Mar 12 17:08:11 crc kubenswrapper[5184]: I0312 17:08:11.042744 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2aa5400c-5bc3-4cd5-849d-87105da3827b" containerName="mariadb-account-create-update"
Mar 12 17:08:11 crc kubenswrapper[5184]: I0312 17:08:11.042753 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aa5400c-5bc3-4cd5-849d-87105da3827b" containerName="mariadb-account-create-update"
Mar 12 17:08:11 crc kubenswrapper[5184]: I0312 17:08:11.042785 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2f4b74dc-78a2-4b8c-8b52-cb972e894961" containerName="glance-db-sync"
Mar 12 17:08:11 crc kubenswrapper[5184]: I0312 17:08:11.042791 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f4b74dc-78a2-4b8c-8b52-cb972e894961" containerName="glance-db-sync"
Mar 12 17:08:11 crc kubenswrapper[5184]: I0312 17:08:11.042805 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="69a595e1-ca3b-4932-b9b6-c1c0a237a783" containerName="mariadb-account-create-update"
Mar 12 17:08:11 crc kubenswrapper[5184]: I0312 17:08:11.042811 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="69a595e1-ca3b-4932-b9b6-c1c0a237a783" containerName="mariadb-account-create-update"
Mar 12 17:08:11 crc kubenswrapper[5184]: I0312 17:08:11.042822 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ec5e94e-bd18-444a-9340-de9b41934458" containerName="mariadb-account-create-update"
Mar 12 17:08:11 crc kubenswrapper[5184]: I0312 17:08:11.042827 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec5e94e-bd18-444a-9340-de9b41934458" containerName="mariadb-account-create-update"
Mar 12 17:08:11 crc kubenswrapper[5184]: I0312 17:08:11.042835 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="552cda96-d016-4ff4-9bc2-9cf835b31dfe" containerName="oc"
Mar 12 17:08:11 crc kubenswrapper[5184]: I0312 17:08:11.042842 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="552cda96-d016-4ff4-9bc2-9cf835b31dfe" containerName="oc"
Mar 12 17:08:11 crc kubenswrapper[5184]: I0312 17:08:11.043008 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="552cda96-d016-4ff4-9bc2-9cf835b31dfe" containerName="oc"
Mar 12 17:08:11 crc kubenswrapper[5184]: I0312 17:08:11.043025 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="21e50fc1-0b61-44f4-922a-acb08efb0796" containerName="mariadb-database-create"
Mar 12 17:08:11 crc kubenswrapper[5184]: I0312 17:08:11.043038 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="2f4b74dc-78a2-4b8c-8b52-cb972e894961" containerName="glance-db-sync"
Mar 12 17:08:11 crc kubenswrapper[5184]: I0312 17:08:11.043049 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="69a595e1-ca3b-4932-b9b6-c1c0a237a783" containerName="mariadb-account-create-update"
Mar 12 17:08:11 crc kubenswrapper[5184]: I0312 17:08:11.043059 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="5ec5e94e-bd18-444a-9340-de9b41934458" containerName="mariadb-account-create-update"
Mar 12 17:08:11 crc kubenswrapper[5184]: I0312 17:08:11.043076 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="2aa5400c-5bc3-4cd5-849d-87105da3827b" containerName="mariadb-account-create-update"
Mar 12 17:08:11 crc kubenswrapper[5184]: I0312 17:08:11.047025 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b9d876767-jcfwx"
Mar 12 17:08:11 crc kubenswrapper[5184]: I0312 17:08:11.077546 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b9d876767-jcfwx"]
Mar 12 17:08:11 crc kubenswrapper[5184]: I0312 17:08:11.225865 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/65a92d9a-7cc8-4e94-969e-7f31ed20c416-ovsdbserver-nb\") pod \"dnsmasq-dns-b9d876767-jcfwx\" (UID: \"65a92d9a-7cc8-4e94-969e-7f31ed20c416\") " pod="openstack/dnsmasq-dns-b9d876767-jcfwx"
Mar 12 17:08:11 crc kubenswrapper[5184]: I0312 17:08:11.226051 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/65a92d9a-7cc8-4e94-969e-7f31ed20c416-dns-swift-storage-0\") pod \"dnsmasq-dns-b9d876767-jcfwx\" (UID: \"65a92d9a-7cc8-4e94-969e-7f31ed20c416\") " pod="openstack/dnsmasq-dns-b9d876767-jcfwx"
Mar 12 17:08:11 crc kubenswrapper[5184]: I0312 17:08:11.226120 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65a92d9a-7cc8-4e94-969e-7f31ed20c416-config\") pod \"dnsmasq-dns-b9d876767-jcfwx\" (UID: \"65a92d9a-7cc8-4e94-969e-7f31ed20c416\") " pod="openstack/dnsmasq-dns-b9d876767-jcfwx"
Mar 12 17:08:11 crc kubenswrapper[5184]: I0312 17:08:11.226181 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/65a92d9a-7cc8-4e94-969e-7f31ed20c416-ovsdbserver-sb\") pod \"dnsmasq-dns-b9d876767-jcfwx\" (UID: \"65a92d9a-7cc8-4e94-969e-7f31ed20c416\") " pod="openstack/dnsmasq-dns-b9d876767-jcfwx"
Mar 12 17:08:11 crc kubenswrapper[5184]: I0312 17:08:11.226278 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/65a92d9a-7cc8-4e94-969e-7f31ed20c416-dns-svc\") pod \"dnsmasq-dns-b9d876767-jcfwx\" (UID: \"65a92d9a-7cc8-4e94-969e-7f31ed20c416\") " pod="openstack/dnsmasq-dns-b9d876767-jcfwx"
Mar 12 17:08:11 crc kubenswrapper[5184]: I0312 17:08:11.226334 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k6g4\" (UniqueName: \"kubernetes.io/projected/65a92d9a-7cc8-4e94-969e-7f31ed20c416-kube-api-access-9k6g4\") pod \"dnsmasq-dns-b9d876767-jcfwx\" (UID: \"65a92d9a-7cc8-4e94-969e-7f31ed20c416\") " pod="openstack/dnsmasq-dns-b9d876767-jcfwx"
Mar 12 17:08:11 crc kubenswrapper[5184]: I0312 17:08:11.328022 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/65a92d9a-7cc8-4e94-969e-7f31ed20c416-dns-svc\") pod \"dnsmasq-dns-b9d876767-jcfwx\" (UID: \"65a92d9a-7cc8-4e94-969e-7f31ed20c416\") " pod="openstack/dnsmasq-dns-b9d876767-jcfwx"
Mar 12 17:08:11 crc kubenswrapper[5184]: I0312 17:08:11.328095 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9k6g4\" (UniqueName: \"kubernetes.io/projected/65a92d9a-7cc8-4e94-969e-7f31ed20c416-kube-api-access-9k6g4\") pod \"dnsmasq-dns-b9d876767-jcfwx\" (UID: \"65a92d9a-7cc8-4e94-969e-7f31ed20c416\") " pod="openstack/dnsmasq-dns-b9d876767-jcfwx"
Mar 12 17:08:11 crc kubenswrapper[5184]: I0312 17:08:11.328141 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/65a92d9a-7cc8-4e94-969e-7f31ed20c416-ovsdbserver-nb\") pod \"dnsmasq-dns-b9d876767-jcfwx\" (UID: \"65a92d9a-7cc8-4e94-969e-7f31ed20c416\") " pod="openstack/dnsmasq-dns-b9d876767-jcfwx"
Mar 12 17:08:11 crc kubenswrapper[5184]: I0312 17:08:11.328219 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/65a92d9a-7cc8-4e94-969e-7f31ed20c416-dns-swift-storage-0\") pod \"dnsmasq-dns-b9d876767-jcfwx\" (UID: \"65a92d9a-7cc8-4e94-969e-7f31ed20c416\") " pod="openstack/dnsmasq-dns-b9d876767-jcfwx"
Mar 12 17:08:11 crc kubenswrapper[5184]: I0312 17:08:11.328244 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65a92d9a-7cc8-4e94-969e-7f31ed20c416-config\") pod \"dnsmasq-dns-b9d876767-jcfwx\" (UID: \"65a92d9a-7cc8-4e94-969e-7f31ed20c416\") " pod="openstack/dnsmasq-dns-b9d876767-jcfwx"
Mar 12 17:08:11 crc kubenswrapper[5184]: I0312 17:08:11.328272 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/65a92d9a-7cc8-4e94-969e-7f31ed20c416-ovsdbserver-sb\") pod \"dnsmasq-dns-b9d876767-jcfwx\" (UID: \"65a92d9a-7cc8-4e94-969e-7f31ed20c416\") " pod="openstack/dnsmasq-dns-b9d876767-jcfwx"
Mar 12 17:08:11 crc kubenswrapper[5184]: I0312 17:08:11.328841 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/65a92d9a-7cc8-4e94-969e-7f31ed20c416-dns-svc\") pod \"dnsmasq-dns-b9d876767-jcfwx\" (UID: \"65a92d9a-7cc8-4e94-969e-7f31ed20c416\") " pod="openstack/dnsmasq-dns-b9d876767-jcfwx"
Mar 12 17:08:11 crc kubenswrapper[5184]: I0312 17:08:11.329053 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/65a92d9a-7cc8-4e94-969e-7f31ed20c416-ovsdbserver-nb\") pod \"dnsmasq-dns-b9d876767-jcfwx\" (UID: \"65a92d9a-7cc8-4e94-969e-7f31ed20c416\") " pod="openstack/dnsmasq-dns-b9d876767-jcfwx"
Mar 12 17:08:11 crc kubenswrapper[5184]: I0312 17:08:11.329164 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65a92d9a-7cc8-4e94-969e-7f31ed20c416-config\") pod \"dnsmasq-dns-b9d876767-jcfwx\" (UID: \"65a92d9a-7cc8-4e94-969e-7f31ed20c416\") " pod="openstack/dnsmasq-dns-b9d876767-jcfwx"
Mar 12 17:08:11 crc kubenswrapper[5184]: I0312 17:08:11.329328 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/65a92d9a-7cc8-4e94-969e-7f31ed20c416-dns-swift-storage-0\") pod \"dnsmasq-dns-b9d876767-jcfwx\" (UID: \"65a92d9a-7cc8-4e94-969e-7f31ed20c416\") " pod="openstack/dnsmasq-dns-b9d876767-jcfwx"
Mar 12 17:08:11 crc kubenswrapper[5184]: I0312 17:08:11.329357 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/65a92d9a-7cc8-4e94-969e-7f31ed20c416-ovsdbserver-sb\") pod \"dnsmasq-dns-b9d876767-jcfwx\" (UID: \"65a92d9a-7cc8-4e94-969e-7f31ed20c416\") " pod="openstack/dnsmasq-dns-b9d876767-jcfwx"
Mar 12 17:08:11 crc kubenswrapper[5184]: I0312 17:08:11.353716 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k6g4\" (UniqueName: \"kubernetes.io/projected/65a92d9a-7cc8-4e94-969e-7f31ed20c416-kube-api-access-9k6g4\") pod \"dnsmasq-dns-b9d876767-jcfwx\" (UID: \"65a92d9a-7cc8-4e94-969e-7f31ed20c416\") " pod="openstack/dnsmasq-dns-b9d876767-jcfwx"
Mar 12 17:08:11 crc kubenswrapper[5184]: I0312 17:08:11.370257 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b9d876767-jcfwx"
Mar 12 17:08:11 crc kubenswrapper[5184]: I0312 17:08:11.536026 5184 generic.go:358] "Generic (PLEG): container finished" podID="f64b215c-973c-4761-9b13-0510387973ee" containerID="7947b4579198bd3277c8220ef4c1fdd253647feb0f12ccb500e53f81aa05d8dc" exitCode=0
Mar 12 17:08:11 crc kubenswrapper[5184]: I0312 17:08:11.536103 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4nkwm" event={"ID":"f64b215c-973c-4761-9b13-0510387973ee","Type":"ContainerDied","Data":"7947b4579198bd3277c8220ef4c1fdd253647feb0f12ccb500e53f81aa05d8dc"}
Mar 12 17:08:11 crc kubenswrapper[5184]: I0312 17:08:11.828593 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b9d876767-jcfwx"]
Mar 12 17:08:12 crc kubenswrapper[5184]: I0312 17:08:12.551470 5184 generic.go:358] "Generic (PLEG): container finished" podID="65a92d9a-7cc8-4e94-969e-7f31ed20c416" containerID="3c379b5504a28648ab22dff78c26bf4177e09147cb6851bc6281f4476261a710" exitCode=0
Mar 12 17:08:12 crc kubenswrapper[5184]: I0312 17:08:12.551524 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b9d876767-jcfwx" event={"ID":"65a92d9a-7cc8-4e94-969e-7f31ed20c416","Type":"ContainerDied","Data":"3c379b5504a28648ab22dff78c26bf4177e09147cb6851bc6281f4476261a710"}
Mar 12 17:08:12 crc kubenswrapper[5184]: I0312 17:08:12.551877 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b9d876767-jcfwx" event={"ID":"65a92d9a-7cc8-4e94-969e-7f31ed20c416","Type":"ContainerStarted","Data":"af6414343bfe44ffaf9d55d0b0dfc668dd44808c2b7cd30f22d810d4d7baa938"}
Mar 12 17:08:12 crc kubenswrapper[5184]: I0312 17:08:12.552211 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b4dccff87-s64b2" podUID="b11174d1-7d67-495b-b0fd-8c9be89452a8" containerName="dnsmasq-dns" containerID="cri-o://f0a0baa2408a1fbac120d541b90397b1f27cddb86109e91cc19eb33eecc627be" gracePeriod=10
Mar 12 17:08:12 crc kubenswrapper[5184]: I0312 17:08:12.918136 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-4nkwm"
Mar 12 17:08:12 crc kubenswrapper[5184]: I0312 17:08:12.963076 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f64b215c-973c-4761-9b13-0510387973ee-config-data\") pod \"f64b215c-973c-4761-9b13-0510387973ee\" (UID: \"f64b215c-973c-4761-9b13-0510387973ee\") "
Mar 12 17:08:12 crc kubenswrapper[5184]: I0312 17:08:12.963131 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f64b215c-973c-4761-9b13-0510387973ee-combined-ca-bundle\") pod \"f64b215c-973c-4761-9b13-0510387973ee\" (UID: \"f64b215c-973c-4761-9b13-0510387973ee\") "
Mar 12 17:08:12 crc kubenswrapper[5184]: I0312 17:08:12.963171 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6nll\" (UniqueName: \"kubernetes.io/projected/f64b215c-973c-4761-9b13-0510387973ee-kube-api-access-h6nll\") pod \"f64b215c-973c-4761-9b13-0510387973ee\" (UID: \"f64b215c-973c-4761-9b13-0510387973ee\") "
Mar 12 17:08:12 crc kubenswrapper[5184]: I0312 17:08:12.989820 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f64b215c-973c-4761-9b13-0510387973ee-kube-api-access-h6nll" (OuterVolumeSpecName: "kube-api-access-h6nll") pod "f64b215c-973c-4761-9b13-0510387973ee" (UID: "f64b215c-973c-4761-9b13-0510387973ee"). InnerVolumeSpecName "kube-api-access-h6nll". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.005548 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f64b215c-973c-4761-9b13-0510387973ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f64b215c-973c-4761-9b13-0510387973ee" (UID: "f64b215c-973c-4761-9b13-0510387973ee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.027021 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f64b215c-973c-4761-9b13-0510387973ee-config-data" (OuterVolumeSpecName: "config-data") pod "f64b215c-973c-4761-9b13-0510387973ee" (UID: "f64b215c-973c-4761-9b13-0510387973ee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.028765 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b4dccff87-s64b2"
Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.065145 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b11174d1-7d67-495b-b0fd-8c9be89452a8-config\") pod \"b11174d1-7d67-495b-b0fd-8c9be89452a8\" (UID: \"b11174d1-7d67-495b-b0fd-8c9be89452a8\") "
Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.065217 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b11174d1-7d67-495b-b0fd-8c9be89452a8-dns-swift-storage-0\") pod \"b11174d1-7d67-495b-b0fd-8c9be89452a8\" (UID: \"b11174d1-7d67-495b-b0fd-8c9be89452a8\") "
Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.065239 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b11174d1-7d67-495b-b0fd-8c9be89452a8-ovsdbserver-sb\") pod \"b11174d1-7d67-495b-b0fd-8c9be89452a8\" (UID: \"b11174d1-7d67-495b-b0fd-8c9be89452a8\") "
Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.065301 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b11174d1-7d67-495b-b0fd-8c9be89452a8-dns-svc\") pod \"b11174d1-7d67-495b-b0fd-8c9be89452a8\" (UID: \"b11174d1-7d67-495b-b0fd-8c9be89452a8\") "
Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.065350 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q29b4\" (UniqueName: \"kubernetes.io/projected/b11174d1-7d67-495b-b0fd-8c9be89452a8-kube-api-access-q29b4\") pod \"b11174d1-7d67-495b-b0fd-8c9be89452a8\" (UID: \"b11174d1-7d67-495b-b0fd-8c9be89452a8\") "
Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.065503 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\"
(UniqueName: \"kubernetes.io/configmap/b11174d1-7d67-495b-b0fd-8c9be89452a8-ovsdbserver-nb\") pod \"b11174d1-7d67-495b-b0fd-8c9be89452a8\" (UID: \"b11174d1-7d67-495b-b0fd-8c9be89452a8\") " Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.065812 5184 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f64b215c-973c-4761-9b13-0510387973ee-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.065833 5184 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f64b215c-973c-4761-9b13-0510387973ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.065845 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h6nll\" (UniqueName: \"kubernetes.io/projected/f64b215c-973c-4761-9b13-0510387973ee-kube-api-access-h6nll\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.068942 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11174d1-7d67-495b-b0fd-8c9be89452a8-kube-api-access-q29b4" (OuterVolumeSpecName: "kube-api-access-q29b4") pod "b11174d1-7d67-495b-b0fd-8c9be89452a8" (UID: "b11174d1-7d67-495b-b0fd-8c9be89452a8"). InnerVolumeSpecName "kube-api-access-q29b4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.112414 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b11174d1-7d67-495b-b0fd-8c9be89452a8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b11174d1-7d67-495b-b0fd-8c9be89452a8" (UID: "b11174d1-7d67-495b-b0fd-8c9be89452a8"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.119746 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b11174d1-7d67-495b-b0fd-8c9be89452a8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b11174d1-7d67-495b-b0fd-8c9be89452a8" (UID: "b11174d1-7d67-495b-b0fd-8c9be89452a8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.122612 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b11174d1-7d67-495b-b0fd-8c9be89452a8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b11174d1-7d67-495b-b0fd-8c9be89452a8" (UID: "b11174d1-7d67-495b-b0fd-8c9be89452a8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.127433 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b11174d1-7d67-495b-b0fd-8c9be89452a8-config" (OuterVolumeSpecName: "config") pod "b11174d1-7d67-495b-b0fd-8c9be89452a8" (UID: "b11174d1-7d67-495b-b0fd-8c9be89452a8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.127857 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b11174d1-7d67-495b-b0fd-8c9be89452a8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b11174d1-7d67-495b-b0fd-8c9be89452a8" (UID: "b11174d1-7d67-495b-b0fd-8c9be89452a8"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.166630 5184 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b11174d1-7d67-495b-b0fd-8c9be89452a8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.166660 5184 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b11174d1-7d67-495b-b0fd-8c9be89452a8-config\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.166672 5184 reconciler_common.go:299] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b11174d1-7d67-495b-b0fd-8c9be89452a8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.166680 5184 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b11174d1-7d67-495b-b0fd-8c9be89452a8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.166688 5184 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b11174d1-7d67-495b-b0fd-8c9be89452a8-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.166696 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q29b4\" (UniqueName: \"kubernetes.io/projected/b11174d1-7d67-495b-b0fd-8c9be89452a8-kube-api-access-q29b4\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.567566 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b9d876767-jcfwx" event={"ID":"65a92d9a-7cc8-4e94-969e-7f31ed20c416","Type":"ContainerStarted","Data":"23e87f1e1356b95e282fd08dd2ccc0d5c034e89393aa750a719ebd09d6f5bfbb"} Mar 12 17:08:13 crc 
kubenswrapper[5184]: I0312 17:08:13.568113 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/dnsmasq-dns-b9d876767-jcfwx" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.571445 5184 generic.go:358] "Generic (PLEG): container finished" podID="b11174d1-7d67-495b-b0fd-8c9be89452a8" containerID="f0a0baa2408a1fbac120d541b90397b1f27cddb86109e91cc19eb33eecc627be" exitCode=0 Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.571641 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b4dccff87-s64b2" event={"ID":"b11174d1-7d67-495b-b0fd-8c9be89452a8","Type":"ContainerDied","Data":"f0a0baa2408a1fbac120d541b90397b1f27cddb86109e91cc19eb33eecc627be"} Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.571712 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b4dccff87-s64b2" event={"ID":"b11174d1-7d67-495b-b0fd-8c9be89452a8","Type":"ContainerDied","Data":"efc019269bf0c866ca400c8f1c777de1b2563517b03791f7b01d7731ea57671e"} Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.571656 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b4dccff87-s64b2" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.571749 5184 scope.go:117] "RemoveContainer" containerID="f0a0baa2408a1fbac120d541b90397b1f27cddb86109e91cc19eb33eecc627be" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.578133 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4nkwm" event={"ID":"f64b215c-973c-4761-9b13-0510387973ee","Type":"ContainerDied","Data":"671a99e947a75dfbe8efd29b0501052a9fbdf3e010c2c303e3350031553c6bb8"} Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.578213 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="671a99e947a75dfbe8efd29b0501052a9fbdf3e010c2c303e3350031553c6bb8" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.578285 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-4nkwm" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.610142 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b9d876767-jcfwx" podStartSLOduration=3.610115998 podStartE2EDuration="3.610115998s" podCreationTimestamp="2026-03-12 17:08:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:08:13.602109606 +0000 UTC m=+1036.143420975" watchObservedRunningTime="2026-03-12 17:08:13.610115998 +0000 UTC m=+1036.151427337" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.620166 5184 scope.go:117] "RemoveContainer" containerID="1d8538ddfcc6712f362385a11849845cf0dd63ae7f5b2416198aa63ce5838104" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.645428 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b4dccff87-s64b2"] Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.655137 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-7b4dccff87-s64b2"] Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.667662 5184 scope.go:117] "RemoveContainer" containerID="f0a0baa2408a1fbac120d541b90397b1f27cddb86109e91cc19eb33eecc627be" Mar 12 17:08:13 crc kubenswrapper[5184]: E0312 17:08:13.669105 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0a0baa2408a1fbac120d541b90397b1f27cddb86109e91cc19eb33eecc627be\": container with ID starting with f0a0baa2408a1fbac120d541b90397b1f27cddb86109e91cc19eb33eecc627be not found: ID does not exist" containerID="f0a0baa2408a1fbac120d541b90397b1f27cddb86109e91cc19eb33eecc627be" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.669177 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0a0baa2408a1fbac120d541b90397b1f27cddb86109e91cc19eb33eecc627be"} err="failed to get container status \"f0a0baa2408a1fbac120d541b90397b1f27cddb86109e91cc19eb33eecc627be\": rpc error: code = NotFound desc = could not find container \"f0a0baa2408a1fbac120d541b90397b1f27cddb86109e91cc19eb33eecc627be\": container with ID starting with f0a0baa2408a1fbac120d541b90397b1f27cddb86109e91cc19eb33eecc627be not found: ID does not exist" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.669216 5184 scope.go:117] "RemoveContainer" containerID="1d8538ddfcc6712f362385a11849845cf0dd63ae7f5b2416198aa63ce5838104" Mar 12 17:08:13 crc kubenswrapper[5184]: E0312 17:08:13.669717 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d8538ddfcc6712f362385a11849845cf0dd63ae7f5b2416198aa63ce5838104\": container with ID starting with 1d8538ddfcc6712f362385a11849845cf0dd63ae7f5b2416198aa63ce5838104 not found: ID does not exist" containerID="1d8538ddfcc6712f362385a11849845cf0dd63ae7f5b2416198aa63ce5838104" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.669820 5184 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d8538ddfcc6712f362385a11849845cf0dd63ae7f5b2416198aa63ce5838104"} err="failed to get container status \"1d8538ddfcc6712f362385a11849845cf0dd63ae7f5b2416198aa63ce5838104\": rpc error: code = NotFound desc = could not find container \"1d8538ddfcc6712f362385a11849845cf0dd63ae7f5b2416198aa63ce5838104\": container with ID starting with 1d8538ddfcc6712f362385a11849845cf0dd63ae7f5b2416198aa63ce5838104 not found: ID does not exist" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.847747 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b9d876767-jcfwx"] Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.878103 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-5bzsl"] Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.879524 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b11174d1-7d67-495b-b0fd-8c9be89452a8" containerName="init" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.879689 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="b11174d1-7d67-495b-b0fd-8c9be89452a8" containerName="init" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.879810 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f64b215c-973c-4761-9b13-0510387973ee" containerName="keystone-db-sync" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.879887 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="f64b215c-973c-4761-9b13-0510387973ee" containerName="keystone-db-sync" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.879983 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b11174d1-7d67-495b-b0fd-8c9be89452a8" containerName="dnsmasq-dns" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.880061 5184 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b11174d1-7d67-495b-b0fd-8c9be89452a8" containerName="dnsmasq-dns" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.880336 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="f64b215c-973c-4761-9b13-0510387973ee" containerName="keystone-db-sync" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.880549 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="b11174d1-7d67-495b-b0fd-8c9be89452a8" containerName="dnsmasq-dns" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.892506 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5bzsl" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.896568 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-5bzsl"] Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.898858 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"osp-secret\"" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.899190 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone-config-data\"" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.898970 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone-scripts\"" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.899583 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone\"" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.899843 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone-keystone-dockercfg-4s8pv\"" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.907451 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-657854d787-pks8w"] Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.930028 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-657854d787-pks8w" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.957682 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-657854d787-pks8w"] Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.990049 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62f66fc0-329e-4a90-92d8-2af29474e66c-config\") pod \"dnsmasq-dns-657854d787-pks8w\" (UID: \"62f66fc0-329e-4a90-92d8-2af29474e66c\") " pod="openstack/dnsmasq-dns-657854d787-pks8w" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.990129 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a764aff-f5dc-49da-a606-84cbabca2db3-config-data\") pod \"keystone-bootstrap-5bzsl\" (UID: \"0a764aff-f5dc-49da-a606-84cbabca2db3\") " pod="openstack/keystone-bootstrap-5bzsl" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.990158 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62f66fc0-329e-4a90-92d8-2af29474e66c-ovsdbserver-sb\") pod \"dnsmasq-dns-657854d787-pks8w\" (UID: \"62f66fc0-329e-4a90-92d8-2af29474e66c\") " pod="openstack/dnsmasq-dns-657854d787-pks8w" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.990196 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t858d\" (UniqueName: \"kubernetes.io/projected/62f66fc0-329e-4a90-92d8-2af29474e66c-kube-api-access-t858d\") pod \"dnsmasq-dns-657854d787-pks8w\" (UID: \"62f66fc0-329e-4a90-92d8-2af29474e66c\") " pod="openstack/dnsmasq-dns-657854d787-pks8w" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.990264 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/62f66fc0-329e-4a90-92d8-2af29474e66c-dns-swift-storage-0\") pod \"dnsmasq-dns-657854d787-pks8w\" (UID: \"62f66fc0-329e-4a90-92d8-2af29474e66c\") " pod="openstack/dnsmasq-dns-657854d787-pks8w" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.990314 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0a764aff-f5dc-49da-a606-84cbabca2db3-credential-keys\") pod \"keystone-bootstrap-5bzsl\" (UID: \"0a764aff-f5dc-49da-a606-84cbabca2db3\") " pod="openstack/keystone-bootstrap-5bzsl" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.990338 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx775\" (UniqueName: \"kubernetes.io/projected/0a764aff-f5dc-49da-a606-84cbabca2db3-kube-api-access-bx775\") pod \"keystone-bootstrap-5bzsl\" (UID: \"0a764aff-f5dc-49da-a606-84cbabca2db3\") " pod="openstack/keystone-bootstrap-5bzsl" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.990414 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62f66fc0-329e-4a90-92d8-2af29474e66c-ovsdbserver-nb\") pod \"dnsmasq-dns-657854d787-pks8w\" (UID: \"62f66fc0-329e-4a90-92d8-2af29474e66c\") " pod="openstack/dnsmasq-dns-657854d787-pks8w" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.990436 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0a764aff-f5dc-49da-a606-84cbabca2db3-fernet-keys\") pod \"keystone-bootstrap-5bzsl\" (UID: \"0a764aff-f5dc-49da-a606-84cbabca2db3\") " pod="openstack/keystone-bootstrap-5bzsl" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.990463 5184 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a764aff-f5dc-49da-a606-84cbabca2db3-combined-ca-bundle\") pod \"keystone-bootstrap-5bzsl\" (UID: \"0a764aff-f5dc-49da-a606-84cbabca2db3\") " pod="openstack/keystone-bootstrap-5bzsl" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.990491 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a764aff-f5dc-49da-a606-84cbabca2db3-scripts\") pod \"keystone-bootstrap-5bzsl\" (UID: \"0a764aff-f5dc-49da-a606-84cbabca2db3\") " pod="openstack/keystone-bootstrap-5bzsl" Mar 12 17:08:13 crc kubenswrapper[5184]: I0312 17:08:13.990542 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62f66fc0-329e-4a90-92d8-2af29474e66c-dns-svc\") pod \"dnsmasq-dns-657854d787-pks8w\" (UID: \"62f66fc0-329e-4a90-92d8-2af29474e66c\") " pod="openstack/dnsmasq-dns-657854d787-pks8w" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.063145 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-gzjgn"] Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.072843 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-gzjgn" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.074901 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-gzjgn"] Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.075425 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cinder-cinder-dockercfg-h494p\"" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.087805 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cinder-config-data\"" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.089155 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cinder-scripts\"" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.091632 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7ed3fdf4-b869-4ef8-b746-afe1e52fe286-db-sync-config-data\") pod \"cinder-db-sync-gzjgn\" (UID: \"7ed3fdf4-b869-4ef8-b746-afe1e52fe286\") " pod="openstack/cinder-db-sync-gzjgn" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.091696 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0a764aff-f5dc-49da-a606-84cbabca2db3-credential-keys\") pod \"keystone-bootstrap-5bzsl\" (UID: \"0a764aff-f5dc-49da-a606-84cbabca2db3\") " pod="openstack/keystone-bootstrap-5bzsl" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.091723 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bx775\" (UniqueName: \"kubernetes.io/projected/0a764aff-f5dc-49da-a606-84cbabca2db3-kube-api-access-bx775\") pod \"keystone-bootstrap-5bzsl\" (UID: \"0a764aff-f5dc-49da-a606-84cbabca2db3\") " pod="openstack/keystone-bootstrap-5bzsl" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.091746 
5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ed3fdf4-b869-4ef8-b746-afe1e52fe286-config-data\") pod \"cinder-db-sync-gzjgn\" (UID: \"7ed3fdf4-b869-4ef8-b746-afe1e52fe286\") " pod="openstack/cinder-db-sync-gzjgn" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.091788 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ed3fdf4-b869-4ef8-b746-afe1e52fe286-etc-machine-id\") pod \"cinder-db-sync-gzjgn\" (UID: \"7ed3fdf4-b869-4ef8-b746-afe1e52fe286\") " pod="openstack/cinder-db-sync-gzjgn" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.091825 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ed3fdf4-b869-4ef8-b746-afe1e52fe286-scripts\") pod \"cinder-db-sync-gzjgn\" (UID: \"7ed3fdf4-b869-4ef8-b746-afe1e52fe286\") " pod="openstack/cinder-db-sync-gzjgn" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.091848 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62f66fc0-329e-4a90-92d8-2af29474e66c-ovsdbserver-nb\") pod \"dnsmasq-dns-657854d787-pks8w\" (UID: \"62f66fc0-329e-4a90-92d8-2af29474e66c\") " pod="openstack/dnsmasq-dns-657854d787-pks8w" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.091870 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0a764aff-f5dc-49da-a606-84cbabca2db3-fernet-keys\") pod \"keystone-bootstrap-5bzsl\" (UID: \"0a764aff-f5dc-49da-a606-84cbabca2db3\") " pod="openstack/keystone-bootstrap-5bzsl" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.091897 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a764aff-f5dc-49da-a606-84cbabca2db3-combined-ca-bundle\") pod \"keystone-bootstrap-5bzsl\" (UID: \"0a764aff-f5dc-49da-a606-84cbabca2db3\") " pod="openstack/keystone-bootstrap-5bzsl" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.091923 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8mz5\" (UniqueName: \"kubernetes.io/projected/7ed3fdf4-b869-4ef8-b746-afe1e52fe286-kube-api-access-s8mz5\") pod \"cinder-db-sync-gzjgn\" (UID: \"7ed3fdf4-b869-4ef8-b746-afe1e52fe286\") " pod="openstack/cinder-db-sync-gzjgn" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.091957 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a764aff-f5dc-49da-a606-84cbabca2db3-scripts\") pod \"keystone-bootstrap-5bzsl\" (UID: \"0a764aff-f5dc-49da-a606-84cbabca2db3\") " pod="openstack/keystone-bootstrap-5bzsl" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.092016 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62f66fc0-329e-4a90-92d8-2af29474e66c-dns-svc\") pod \"dnsmasq-dns-657854d787-pks8w\" (UID: \"62f66fc0-329e-4a90-92d8-2af29474e66c\") " pod="openstack/dnsmasq-dns-657854d787-pks8w" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.092070 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62f66fc0-329e-4a90-92d8-2af29474e66c-config\") pod \"dnsmasq-dns-657854d787-pks8w\" (UID: \"62f66fc0-329e-4a90-92d8-2af29474e66c\") " pod="openstack/dnsmasq-dns-657854d787-pks8w" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.092135 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0a764aff-f5dc-49da-a606-84cbabca2db3-config-data\") pod \"keystone-bootstrap-5bzsl\" (UID: \"0a764aff-f5dc-49da-a606-84cbabca2db3\") " pod="openstack/keystone-bootstrap-5bzsl" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.092164 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62f66fc0-329e-4a90-92d8-2af29474e66c-ovsdbserver-sb\") pod \"dnsmasq-dns-657854d787-pks8w\" (UID: \"62f66fc0-329e-4a90-92d8-2af29474e66c\") " pod="openstack/dnsmasq-dns-657854d787-pks8w" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.092201 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t858d\" (UniqueName: \"kubernetes.io/projected/62f66fc0-329e-4a90-92d8-2af29474e66c-kube-api-access-t858d\") pod \"dnsmasq-dns-657854d787-pks8w\" (UID: \"62f66fc0-329e-4a90-92d8-2af29474e66c\") " pod="openstack/dnsmasq-dns-657854d787-pks8w" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.092962 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62f66fc0-329e-4a90-92d8-2af29474e66c-dns-svc\") pod \"dnsmasq-dns-657854d787-pks8w\" (UID: \"62f66fc0-329e-4a90-92d8-2af29474e66c\") " pod="openstack/dnsmasq-dns-657854d787-pks8w" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.094083 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/62f66fc0-329e-4a90-92d8-2af29474e66c-dns-swift-storage-0\") pod \"dnsmasq-dns-657854d787-pks8w\" (UID: \"62f66fc0-329e-4a90-92d8-2af29474e66c\") " pod="openstack/dnsmasq-dns-657854d787-pks8w" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.094194 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7ed3fdf4-b869-4ef8-b746-afe1e52fe286-combined-ca-bundle\") pod \"cinder-db-sync-gzjgn\" (UID: \"7ed3fdf4-b869-4ef8-b746-afe1e52fe286\") " pod="openstack/cinder-db-sync-gzjgn" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.094988 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62f66fc0-329e-4a90-92d8-2af29474e66c-ovsdbserver-sb\") pod \"dnsmasq-dns-657854d787-pks8w\" (UID: \"62f66fc0-329e-4a90-92d8-2af29474e66c\") " pod="openstack/dnsmasq-dns-657854d787-pks8w" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.095513 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62f66fc0-329e-4a90-92d8-2af29474e66c-ovsdbserver-nb\") pod \"dnsmasq-dns-657854d787-pks8w\" (UID: \"62f66fc0-329e-4a90-92d8-2af29474e66c\") " pod="openstack/dnsmasq-dns-657854d787-pks8w" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.096223 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/62f66fc0-329e-4a90-92d8-2af29474e66c-dns-swift-storage-0\") pod \"dnsmasq-dns-657854d787-pks8w\" (UID: \"62f66fc0-329e-4a90-92d8-2af29474e66c\") " pod="openstack/dnsmasq-dns-657854d787-pks8w" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.098109 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62f66fc0-329e-4a90-92d8-2af29474e66c-config\") pod \"dnsmasq-dns-657854d787-pks8w\" (UID: \"62f66fc0-329e-4a90-92d8-2af29474e66c\") " pod="openstack/dnsmasq-dns-657854d787-pks8w" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.105561 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a764aff-f5dc-49da-a606-84cbabca2db3-config-data\") pod \"keystone-bootstrap-5bzsl\" (UID: 
\"0a764aff-f5dc-49da-a606-84cbabca2db3\") " pod="openstack/keystone-bootstrap-5bzsl" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.113213 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a764aff-f5dc-49da-a606-84cbabca2db3-scripts\") pod \"keystone-bootstrap-5bzsl\" (UID: \"0a764aff-f5dc-49da-a606-84cbabca2db3\") " pod="openstack/keystone-bootstrap-5bzsl" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.118372 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0a764aff-f5dc-49da-a606-84cbabca2db3-fernet-keys\") pod \"keystone-bootstrap-5bzsl\" (UID: \"0a764aff-f5dc-49da-a606-84cbabca2db3\") " pod="openstack/keystone-bootstrap-5bzsl" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.119720 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0a764aff-f5dc-49da-a606-84cbabca2db3-credential-keys\") pod \"keystone-bootstrap-5bzsl\" (UID: \"0a764aff-f5dc-49da-a606-84cbabca2db3\") " pod="openstack/keystone-bootstrap-5bzsl" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.120101 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a764aff-f5dc-49da-a606-84cbabca2db3-combined-ca-bundle\") pod \"keystone-bootstrap-5bzsl\" (UID: \"0a764aff-f5dc-49da-a606-84cbabca2db3\") " pod="openstack/keystone-bootstrap-5bzsl" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.144428 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/horizon-85fc5cfbb9-trxq5"] Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.150106 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx775\" (UniqueName: \"kubernetes.io/projected/0a764aff-f5dc-49da-a606-84cbabca2db3-kube-api-access-bx775\") pod 
\"keystone-bootstrap-5bzsl\" (UID: \"0a764aff-f5dc-49da-a606-84cbabca2db3\") " pod="openstack/keystone-bootstrap-5bzsl" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.163712 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-85fc5cfbb9-trxq5" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.177129 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"horizon\"" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.177423 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"horizon-scripts\"" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.178153 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"horizon-horizon-dockercfg-nk472\"" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.178353 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"horizon-config-data\"" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.180069 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t858d\" (UniqueName: \"kubernetes.io/projected/62f66fc0-329e-4a90-92d8-2af29474e66c-kube-api-access-t858d\") pod \"dnsmasq-dns-657854d787-pks8w\" (UID: \"62f66fc0-329e-4a90-92d8-2af29474e66c\") " pod="openstack/dnsmasq-dns-657854d787-pks8w" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.182450 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-85fc5cfbb9-trxq5"] Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.195665 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp67s\" (UniqueName: \"kubernetes.io/projected/646cdc1b-863a-4b58-8869-fcbc386a96e2-kube-api-access-cp67s\") pod \"horizon-85fc5cfbb9-trxq5\" (UID: \"646cdc1b-863a-4b58-8869-fcbc386a96e2\") " pod="openstack/horizon-85fc5cfbb9-trxq5" Mar 12 
17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.195742 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed3fdf4-b869-4ef8-b746-afe1e52fe286-combined-ca-bundle\") pod \"cinder-db-sync-gzjgn\" (UID: \"7ed3fdf4-b869-4ef8-b746-afe1e52fe286\") " pod="openstack/cinder-db-sync-gzjgn" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.195766 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7ed3fdf4-b869-4ef8-b746-afe1e52fe286-db-sync-config-data\") pod \"cinder-db-sync-gzjgn\" (UID: \"7ed3fdf4-b869-4ef8-b746-afe1e52fe286\") " pod="openstack/cinder-db-sync-gzjgn" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.195797 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ed3fdf4-b869-4ef8-b746-afe1e52fe286-config-data\") pod \"cinder-db-sync-gzjgn\" (UID: \"7ed3fdf4-b869-4ef8-b746-afe1e52fe286\") " pod="openstack/cinder-db-sync-gzjgn" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.195824 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/646cdc1b-863a-4b58-8869-fcbc386a96e2-logs\") pod \"horizon-85fc5cfbb9-trxq5\" (UID: \"646cdc1b-863a-4b58-8869-fcbc386a96e2\") " pod="openstack/horizon-85fc5cfbb9-trxq5" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.195844 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ed3fdf4-b869-4ef8-b746-afe1e52fe286-etc-machine-id\") pod \"cinder-db-sync-gzjgn\" (UID: \"7ed3fdf4-b869-4ef8-b746-afe1e52fe286\") " pod="openstack/cinder-db-sync-gzjgn" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.195871 5184 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ed3fdf4-b869-4ef8-b746-afe1e52fe286-scripts\") pod \"cinder-db-sync-gzjgn\" (UID: \"7ed3fdf4-b869-4ef8-b746-afe1e52fe286\") " pod="openstack/cinder-db-sync-gzjgn" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.195897 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s8mz5\" (UniqueName: \"kubernetes.io/projected/7ed3fdf4-b869-4ef8-b746-afe1e52fe286-kube-api-access-s8mz5\") pod \"cinder-db-sync-gzjgn\" (UID: \"7ed3fdf4-b869-4ef8-b746-afe1e52fe286\") " pod="openstack/cinder-db-sync-gzjgn" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.195911 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/646cdc1b-863a-4b58-8869-fcbc386a96e2-horizon-secret-key\") pod \"horizon-85fc5cfbb9-trxq5\" (UID: \"646cdc1b-863a-4b58-8869-fcbc386a96e2\") " pod="openstack/horizon-85fc5cfbb9-trxq5" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.195950 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/646cdc1b-863a-4b58-8869-fcbc386a96e2-scripts\") pod \"horizon-85fc5cfbb9-trxq5\" (UID: \"646cdc1b-863a-4b58-8869-fcbc386a96e2\") " pod="openstack/horizon-85fc5cfbb9-trxq5" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.195973 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/646cdc1b-863a-4b58-8869-fcbc386a96e2-config-data\") pod \"horizon-85fc5cfbb9-trxq5\" (UID: \"646cdc1b-863a-4b58-8869-fcbc386a96e2\") " pod="openstack/horizon-85fc5cfbb9-trxq5" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.197479 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/7ed3fdf4-b869-4ef8-b746-afe1e52fe286-etc-machine-id\") pod \"cinder-db-sync-gzjgn\" (UID: \"7ed3fdf4-b869-4ef8-b746-afe1e52fe286\") " pod="openstack/cinder-db-sync-gzjgn" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.212406 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ed3fdf4-b869-4ef8-b746-afe1e52fe286-config-data\") pod \"cinder-db-sync-gzjgn\" (UID: \"7ed3fdf4-b869-4ef8-b746-afe1e52fe286\") " pod="openstack/cinder-db-sync-gzjgn" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.215129 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5bzsl" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.217359 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed3fdf4-b869-4ef8-b746-afe1e52fe286-combined-ca-bundle\") pod \"cinder-db-sync-gzjgn\" (UID: \"7ed3fdf4-b869-4ef8-b746-afe1e52fe286\") " pod="openstack/cinder-db-sync-gzjgn" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.223139 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7ed3fdf4-b869-4ef8-b746-afe1e52fe286-db-sync-config-data\") pod \"cinder-db-sync-gzjgn\" (UID: \"7ed3fdf4-b869-4ef8-b746-afe1e52fe286\") " pod="openstack/cinder-db-sync-gzjgn" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.225042 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ed3fdf4-b869-4ef8-b746-afe1e52fe286-scripts\") pod \"cinder-db-sync-gzjgn\" (UID: \"7ed3fdf4-b869-4ef8-b746-afe1e52fe286\") " pod="openstack/cinder-db-sync-gzjgn" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.238165 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8mz5\" (UniqueName: 
\"kubernetes.io/projected/7ed3fdf4-b869-4ef8-b746-afe1e52fe286-kube-api-access-s8mz5\") pod \"cinder-db-sync-gzjgn\" (UID: \"7ed3fdf4-b869-4ef8-b746-afe1e52fe286\") " pod="openstack/cinder-db-sync-gzjgn" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.255932 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-jb66b"] Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.262213 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-657854d787-pks8w" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.272427 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-jb66b" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.282603 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"barbican-config-data\"" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.282817 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"barbican-barbican-dockercfg-gsb8b\"" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.292549 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-vtzf9"] Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.298947 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/646cdc1b-863a-4b58-8869-fcbc386a96e2-logs\") pod \"horizon-85fc5cfbb9-trxq5\" (UID: \"646cdc1b-863a-4b58-8869-fcbc386a96e2\") " pod="openstack/horizon-85fc5cfbb9-trxq5" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.299261 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wntmd\" (UniqueName: \"kubernetes.io/projected/d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c-kube-api-access-wntmd\") pod \"barbican-db-sync-jb66b\" (UID: \"d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c\") " 
pod="openstack/barbican-db-sync-jb66b" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.299413 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c-db-sync-config-data\") pod \"barbican-db-sync-jb66b\" (UID: \"d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c\") " pod="openstack/barbican-db-sync-jb66b" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.299502 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c-combined-ca-bundle\") pod \"barbican-db-sync-jb66b\" (UID: \"d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c\") " pod="openstack/barbican-db-sync-jb66b" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.299635 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/646cdc1b-863a-4b58-8869-fcbc386a96e2-horizon-secret-key\") pod \"horizon-85fc5cfbb9-trxq5\" (UID: \"646cdc1b-863a-4b58-8869-fcbc386a96e2\") " pod="openstack/horizon-85fc5cfbb9-trxq5" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.299784 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/646cdc1b-863a-4b58-8869-fcbc386a96e2-scripts\") pod \"horizon-85fc5cfbb9-trxq5\" (UID: \"646cdc1b-863a-4b58-8869-fcbc386a96e2\") " pod="openstack/horizon-85fc5cfbb9-trxq5" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.299908 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/646cdc1b-863a-4b58-8869-fcbc386a96e2-config-data\") pod \"horizon-85fc5cfbb9-trxq5\" (UID: \"646cdc1b-863a-4b58-8869-fcbc386a96e2\") " pod="openstack/horizon-85fc5cfbb9-trxq5" Mar 12 17:08:14 crc 
kubenswrapper[5184]: I0312 17:08:14.300064 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cp67s\" (UniqueName: \"kubernetes.io/projected/646cdc1b-863a-4b58-8869-fcbc386a96e2-kube-api-access-cp67s\") pod \"horizon-85fc5cfbb9-trxq5\" (UID: \"646cdc1b-863a-4b58-8869-fcbc386a96e2\") " pod="openstack/horizon-85fc5cfbb9-trxq5" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.301126 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/646cdc1b-863a-4b58-8869-fcbc386a96e2-logs\") pod \"horizon-85fc5cfbb9-trxq5\" (UID: \"646cdc1b-863a-4b58-8869-fcbc386a96e2\") " pod="openstack/horizon-85fc5cfbb9-trxq5" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.301800 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/646cdc1b-863a-4b58-8869-fcbc386a96e2-scripts\") pod \"horizon-85fc5cfbb9-trxq5\" (UID: \"646cdc1b-863a-4b58-8869-fcbc386a96e2\") " pod="openstack/horizon-85fc5cfbb9-trxq5" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.302160 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/646cdc1b-863a-4b58-8869-fcbc386a96e2-config-data\") pod \"horizon-85fc5cfbb9-trxq5\" (UID: \"646cdc1b-863a-4b58-8869-fcbc386a96e2\") " pod="openstack/horizon-85fc5cfbb9-trxq5" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.309333 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/646cdc1b-863a-4b58-8869-fcbc386a96e2-horizon-secret-key\") pod \"horizon-85fc5cfbb9-trxq5\" (UID: \"646cdc1b-863a-4b58-8869-fcbc386a96e2\") " pod="openstack/horizon-85fc5cfbb9-trxq5" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.318113 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-vtzf9" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.323804 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"neutron-neutron-dockercfg-pd56m\"" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.324069 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"neutron-httpd-config\"" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.324276 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"neutron-config\"" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.342644 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-jb66b"] Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.344134 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp67s\" (UniqueName: \"kubernetes.io/projected/646cdc1b-863a-4b58-8869-fcbc386a96e2-kube-api-access-cp67s\") pod \"horizon-85fc5cfbb9-trxq5\" (UID: \"646cdc1b-863a-4b58-8869-fcbc386a96e2\") " pod="openstack/horizon-85fc5cfbb9-trxq5" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.361527 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-vtzf9"] Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.371610 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-657854d787-pks8w"] Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.394210 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-mvj44"] Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.420599 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/59bc23c2-fe37-43e7-a1a9-2830892902bf-config\") pod \"neutron-db-sync-vtzf9\" (UID: \"59bc23c2-fe37-43e7-a1a9-2830892902bf\") " 
pod="openstack/neutron-db-sync-vtzf9" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.420871 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5qtj\" (UniqueName: \"kubernetes.io/projected/59bc23c2-fe37-43e7-a1a9-2830892902bf-kube-api-access-t5qtj\") pod \"neutron-db-sync-vtzf9\" (UID: \"59bc23c2-fe37-43e7-a1a9-2830892902bf\") " pod="openstack/neutron-db-sync-vtzf9" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.420960 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59bc23c2-fe37-43e7-a1a9-2830892902bf-combined-ca-bundle\") pod \"neutron-db-sync-vtzf9\" (UID: \"59bc23c2-fe37-43e7-a1a9-2830892902bf\") " pod="openstack/neutron-db-sync-vtzf9" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.421081 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wntmd\" (UniqueName: \"kubernetes.io/projected/d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c-kube-api-access-wntmd\") pod \"barbican-db-sync-jb66b\" (UID: \"d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c\") " pod="openstack/barbican-db-sync-jb66b" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.421137 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c-db-sync-config-data\") pod \"barbican-db-sync-jb66b\" (UID: \"d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c\") " pod="openstack/barbican-db-sync-jb66b" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.421165 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c-combined-ca-bundle\") pod \"barbican-db-sync-jb66b\" (UID: \"d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c\") " 
pod="openstack/barbican-db-sync-jb66b" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.464156 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wntmd\" (UniqueName: \"kubernetes.io/projected/d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c-kube-api-access-wntmd\") pod \"barbican-db-sync-jb66b\" (UID: \"d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c\") " pod="openstack/barbican-db-sync-jb66b" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.479243 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-mvj44" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.488821 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c-combined-ca-bundle\") pod \"barbican-db-sync-jb66b\" (UID: \"d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c\") " pod="openstack/barbican-db-sync-jb66b" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.495860 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c-db-sync-config-data\") pod \"barbican-db-sync-jb66b\" (UID: \"d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c\") " pod="openstack/barbican-db-sync-jb66b" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.496102 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"placement-placement-dockercfg-rp5f5\"" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.496176 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"placement-scripts\"" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.502862 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"placement-config-data\"" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.522446 5184 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/59bc23c2-fe37-43e7-a1a9-2830892902bf-config\") pod \"neutron-db-sync-vtzf9\" (UID: \"59bc23c2-fe37-43e7-a1a9-2830892902bf\") " pod="openstack/neutron-db-sync-vtzf9" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.522721 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dgzq\" (UniqueName: \"kubernetes.io/projected/a9bd8488-49bb-48df-8f41-f415f71a2834-kube-api-access-4dgzq\") pod \"placement-db-sync-mvj44\" (UID: \"a9bd8488-49bb-48df-8f41-f415f71a2834\") " pod="openstack/placement-db-sync-mvj44" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.522756 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9bd8488-49bb-48df-8f41-f415f71a2834-scripts\") pod \"placement-db-sync-mvj44\" (UID: \"a9bd8488-49bb-48df-8f41-f415f71a2834\") " pod="openstack/placement-db-sync-mvj44" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.522796 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9bd8488-49bb-48df-8f41-f415f71a2834-combined-ca-bundle\") pod \"placement-db-sync-mvj44\" (UID: \"a9bd8488-49bb-48df-8f41-f415f71a2834\") " pod="openstack/placement-db-sync-mvj44" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.522814 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5qtj\" (UniqueName: \"kubernetes.io/projected/59bc23c2-fe37-43e7-a1a9-2830892902bf-kube-api-access-t5qtj\") pod \"neutron-db-sync-vtzf9\" (UID: \"59bc23c2-fe37-43e7-a1a9-2830892902bf\") " pod="openstack/neutron-db-sync-vtzf9" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.522844 5184 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9bd8488-49bb-48df-8f41-f415f71a2834-logs\") pod \"placement-db-sync-mvj44\" (UID: \"a9bd8488-49bb-48df-8f41-f415f71a2834\") " pod="openstack/placement-db-sync-mvj44" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.522863 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59bc23c2-fe37-43e7-a1a9-2830892902bf-combined-ca-bundle\") pod \"neutron-db-sync-vtzf9\" (UID: \"59bc23c2-fe37-43e7-a1a9-2830892902bf\") " pod="openstack/neutron-db-sync-vtzf9" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.522897 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9bd8488-49bb-48df-8f41-f415f71a2834-config-data\") pod \"placement-db-sync-mvj44\" (UID: \"a9bd8488-49bb-48df-8f41-f415f71a2834\") " pod="openstack/placement-db-sync-mvj44" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.525307 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11174d1-7d67-495b-b0fd-8c9be89452a8" path="/var/lib/kubelet/pods/b11174d1-7d67-495b-b0fd-8c9be89452a8/volumes" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.526136 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/horizon-844b468785-jt9ph"] Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.528033 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/59bc23c2-fe37-43e7-a1a9-2830892902bf-config\") pod \"neutron-db-sync-vtzf9\" (UID: \"59bc23c2-fe37-43e7-a1a9-2830892902bf\") " pod="openstack/neutron-db-sync-vtzf9" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.532042 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/59bc23c2-fe37-43e7-a1a9-2830892902bf-combined-ca-bundle\") pod \"neutron-db-sync-vtzf9\" (UID: \"59bc23c2-fe37-43e7-a1a9-2830892902bf\") " pod="openstack/neutron-db-sync-vtzf9" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.537254 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-gzjgn" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.539650 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-mvj44"] Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.539684 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-844b468785-jt9ph"] Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.539698 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69f4d98b5f-9pzj8"] Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.543514 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-844b468785-jt9ph" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.563578 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-85fc5cfbb9-trxq5" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.565356 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69f4d98b5f-9pzj8" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.576426 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5qtj\" (UniqueName: \"kubernetes.io/projected/59bc23c2-fe37-43e7-a1a9-2830892902bf-kube-api-access-t5qtj\") pod \"neutron-db-sync-vtzf9\" (UID: \"59bc23c2-fe37-43e7-a1a9-2830892902bf\") " pod="openstack/neutron-db-sync-vtzf9" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.625014 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ad645834-0761-42a0-8bf0-dd763b829aac-horizon-secret-key\") pod \"horizon-844b468785-jt9ph\" (UID: \"ad645834-0761-42a0-8bf0-dd763b829aac\") " pod="openstack/horizon-844b468785-jt9ph" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.625077 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2729f29f-9520-421a-bb45-917ca4cef6fc-ovsdbserver-nb\") pod \"dnsmasq-dns-69f4d98b5f-9pzj8\" (UID: \"2729f29f-9520-421a-bb45-917ca4cef6fc\") " pod="openstack/dnsmasq-dns-69f4d98b5f-9pzj8" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.625183 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2729f29f-9520-421a-bb45-917ca4cef6fc-dns-swift-storage-0\") pod \"dnsmasq-dns-69f4d98b5f-9pzj8\" (UID: \"2729f29f-9520-421a-bb45-917ca4cef6fc\") " pod="openstack/dnsmasq-dns-69f4d98b5f-9pzj8" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.626259 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4vlh\" (UniqueName: \"kubernetes.io/projected/2729f29f-9520-421a-bb45-917ca4cef6fc-kube-api-access-g4vlh\") pod 
\"dnsmasq-dns-69f4d98b5f-9pzj8\" (UID: \"2729f29f-9520-421a-bb45-917ca4cef6fc\") " pod="openstack/dnsmasq-dns-69f4d98b5f-9pzj8" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.626309 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4dgzq\" (UniqueName: \"kubernetes.io/projected/a9bd8488-49bb-48df-8f41-f415f71a2834-kube-api-access-4dgzq\") pod \"placement-db-sync-mvj44\" (UID: \"a9bd8488-49bb-48df-8f41-f415f71a2834\") " pod="openstack/placement-db-sync-mvj44" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.626352 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2729f29f-9520-421a-bb45-917ca4cef6fc-ovsdbserver-sb\") pod \"dnsmasq-dns-69f4d98b5f-9pzj8\" (UID: \"2729f29f-9520-421a-bb45-917ca4cef6fc\") " pod="openstack/dnsmasq-dns-69f4d98b5f-9pzj8" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.626428 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ad645834-0761-42a0-8bf0-dd763b829aac-scripts\") pod \"horizon-844b468785-jt9ph\" (UID: \"ad645834-0761-42a0-8bf0-dd763b829aac\") " pod="openstack/horizon-844b468785-jt9ph" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.626457 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9bd8488-49bb-48df-8f41-f415f71a2834-scripts\") pod \"placement-db-sync-mvj44\" (UID: \"a9bd8488-49bb-48df-8f41-f415f71a2834\") " pod="openstack/placement-db-sync-mvj44" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.626494 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2729f29f-9520-421a-bb45-917ca4cef6fc-config\") pod \"dnsmasq-dns-69f4d98b5f-9pzj8\" (UID: 
\"2729f29f-9520-421a-bb45-917ca4cef6fc\") " pod="openstack/dnsmasq-dns-69f4d98b5f-9pzj8" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.626516 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad645834-0761-42a0-8bf0-dd763b829aac-logs\") pod \"horizon-844b468785-jt9ph\" (UID: \"ad645834-0761-42a0-8bf0-dd763b829aac\") " pod="openstack/horizon-844b468785-jt9ph" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.626595 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9bd8488-49bb-48df-8f41-f415f71a2834-combined-ca-bundle\") pod \"placement-db-sync-mvj44\" (UID: \"a9bd8488-49bb-48df-8f41-f415f71a2834\") " pod="openstack/placement-db-sync-mvj44" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.626657 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8phq\" (UniqueName: \"kubernetes.io/projected/ad645834-0761-42a0-8bf0-dd763b829aac-kube-api-access-v8phq\") pod \"horizon-844b468785-jt9ph\" (UID: \"ad645834-0761-42a0-8bf0-dd763b829aac\") " pod="openstack/horizon-844b468785-jt9ph" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.626694 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad645834-0761-42a0-8bf0-dd763b829aac-config-data\") pod \"horizon-844b468785-jt9ph\" (UID: \"ad645834-0761-42a0-8bf0-dd763b829aac\") " pod="openstack/horizon-844b468785-jt9ph" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.626716 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9bd8488-49bb-48df-8f41-f415f71a2834-logs\") pod \"placement-db-sync-mvj44\" (UID: \"a9bd8488-49bb-48df-8f41-f415f71a2834\") " 
pod="openstack/placement-db-sync-mvj44" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.628136 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2729f29f-9520-421a-bb45-917ca4cef6fc-dns-svc\") pod \"dnsmasq-dns-69f4d98b5f-9pzj8\" (UID: \"2729f29f-9520-421a-bb45-917ca4cef6fc\") " pod="openstack/dnsmasq-dns-69f4d98b5f-9pzj8" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.628257 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9bd8488-49bb-48df-8f41-f415f71a2834-config-data\") pod \"placement-db-sync-mvj44\" (UID: \"a9bd8488-49bb-48df-8f41-f415f71a2834\") " pod="openstack/placement-db-sync-mvj44" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.628264 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9bd8488-49bb-48df-8f41-f415f71a2834-logs\") pod \"placement-db-sync-mvj44\" (UID: \"a9bd8488-49bb-48df-8f41-f415f71a2834\") " pod="openstack/placement-db-sync-mvj44" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.628874 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69f4d98b5f-9pzj8"] Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.629668 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-jb66b" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.633339 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9bd8488-49bb-48df-8f41-f415f71a2834-combined-ca-bundle\") pod \"placement-db-sync-mvj44\" (UID: \"a9bd8488-49bb-48df-8f41-f415f71a2834\") " pod="openstack/placement-db-sync-mvj44" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.635806 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9bd8488-49bb-48df-8f41-f415f71a2834-scripts\") pod \"placement-db-sync-mvj44\" (UID: \"a9bd8488-49bb-48df-8f41-f415f71a2834\") " pod="openstack/placement-db-sync-mvj44" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.643661 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.644421 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9bd8488-49bb-48df-8f41-f415f71a2834-config-data\") pod \"placement-db-sync-mvj44\" (UID: \"a9bd8488-49bb-48df-8f41-f415f71a2834\") " pod="openstack/placement-db-sync-mvj44" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.657703 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-vtzf9" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.657745 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dgzq\" (UniqueName: \"kubernetes.io/projected/a9bd8488-49bb-48df-8f41-f415f71a2834-kube-api-access-4dgzq\") pod \"placement-db-sync-mvj44\" (UID: \"a9bd8488-49bb-48df-8f41-f415f71a2834\") " pod="openstack/placement-db-sync-mvj44" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.672570 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.673585 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.677562 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-glance-default-public-svc\"" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.678906 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"glance-scripts\"" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.686583 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"glance-glance-dockercfg-kvq4j\"" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.697688 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.697860 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.698438 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"glance-default-external-config-data\"" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.702769 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ceilometer-scripts\"" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.702981 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ceilometer-config-data\"" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.729683 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2729f29f-9520-421a-bb45-917ca4cef6fc-dns-svc\") pod \"dnsmasq-dns-69f4d98b5f-9pzj8\" (UID: \"2729f29f-9520-421a-bb45-917ca4cef6fc\") " pod="openstack/dnsmasq-dns-69f4d98b5f-9pzj8" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.730166 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1\") " pod="openstack/ceilometer-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.730199 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqtqq\" (UniqueName: \"kubernetes.io/projected/e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1-kube-api-access-pqtqq\") pod \"ceilometer-0\" (UID: \"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1\") " pod="openstack/ceilometer-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.730312 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1-log-httpd\") pod \"ceilometer-0\" (UID: \"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1\") " pod="openstack/ceilometer-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.730347 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9356417-fcb6-4b97-9f16-db63e667d6e8-logs\") pod \"glance-default-external-api-0\" (UID: \"c9356417-fcb6-4b97-9f16-db63e667d6e8\") " pod="openstack/glance-default-external-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.730427 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ad645834-0761-42a0-8bf0-dd763b829aac-horizon-secret-key\") pod \"horizon-844b468785-jt9ph\" (UID: \"ad645834-0761-42a0-8bf0-dd763b829aac\") " pod="openstack/horizon-844b468785-jt9ph" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.730457 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2729f29f-9520-421a-bb45-917ca4cef6fc-ovsdbserver-nb\") pod \"dnsmasq-dns-69f4d98b5f-9pzj8\" (UID: \"2729f29f-9520-421a-bb45-917ca4cef6fc\") " pod="openstack/dnsmasq-dns-69f4d98b5f-9pzj8" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.730483 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9356417-fcb6-4b97-9f16-db63e667d6e8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c9356417-fcb6-4b97-9f16-db63e667d6e8\") " pod="openstack/glance-default-external-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.730560 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"c9356417-fcb6-4b97-9f16-db63e667d6e8\") " pod="openstack/glance-default-external-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.730604 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1\") " pod="openstack/ceilometer-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.730691 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2729f29f-9520-421a-bb45-917ca4cef6fc-dns-swift-storage-0\") pod \"dnsmasq-dns-69f4d98b5f-9pzj8\" (UID: \"2729f29f-9520-421a-bb45-917ca4cef6fc\") " pod="openstack/dnsmasq-dns-69f4d98b5f-9pzj8" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.730719 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c9356417-fcb6-4b97-9f16-db63e667d6e8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c9356417-fcb6-4b97-9f16-db63e667d6e8\") " pod="openstack/glance-default-external-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.730783 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g4vlh\" (UniqueName: \"kubernetes.io/projected/2729f29f-9520-421a-bb45-917ca4cef6fc-kube-api-access-g4vlh\") pod \"dnsmasq-dns-69f4d98b5f-9pzj8\" (UID: \"2729f29f-9520-421a-bb45-917ca4cef6fc\") " pod="openstack/dnsmasq-dns-69f4d98b5f-9pzj8" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.730823 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c9356417-fcb6-4b97-9f16-db63e667d6e8-config-data\") pod \"glance-default-external-api-0\" (UID: \"c9356417-fcb6-4b97-9f16-db63e667d6e8\") " pod="openstack/glance-default-external-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.730852 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1-config-data\") pod \"ceilometer-0\" (UID: \"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1\") " pod="openstack/ceilometer-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.730875 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9356417-fcb6-4b97-9f16-db63e667d6e8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c9356417-fcb6-4b97-9f16-db63e667d6e8\") " pod="openstack/glance-default-external-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.730898 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2729f29f-9520-421a-bb45-917ca4cef6fc-ovsdbserver-sb\") pod \"dnsmasq-dns-69f4d98b5f-9pzj8\" (UID: \"2729f29f-9520-421a-bb45-917ca4cef6fc\") " pod="openstack/dnsmasq-dns-69f4d98b5f-9pzj8" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.730916 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2729f29f-9520-421a-bb45-917ca4cef6fc-dns-svc\") pod \"dnsmasq-dns-69f4d98b5f-9pzj8\" (UID: \"2729f29f-9520-421a-bb45-917ca4cef6fc\") " pod="openstack/dnsmasq-dns-69f4d98b5f-9pzj8" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.730973 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq98p\" (UniqueName: 
\"kubernetes.io/projected/c9356417-fcb6-4b97-9f16-db63e667d6e8-kube-api-access-pq98p\") pod \"glance-default-external-api-0\" (UID: \"c9356417-fcb6-4b97-9f16-db63e667d6e8\") " pod="openstack/glance-default-external-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.731029 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ad645834-0761-42a0-8bf0-dd763b829aac-scripts\") pod \"horizon-844b468785-jt9ph\" (UID: \"ad645834-0761-42a0-8bf0-dd763b829aac\") " pod="openstack/horizon-844b468785-jt9ph" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.731061 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1-scripts\") pod \"ceilometer-0\" (UID: \"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1\") " pod="openstack/ceilometer-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.731095 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2729f29f-9520-421a-bb45-917ca4cef6fc-config\") pod \"dnsmasq-dns-69f4d98b5f-9pzj8\" (UID: \"2729f29f-9520-421a-bb45-917ca4cef6fc\") " pod="openstack/dnsmasq-dns-69f4d98b5f-9pzj8" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.731112 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad645834-0761-42a0-8bf0-dd763b829aac-logs\") pod \"horizon-844b468785-jt9ph\" (UID: \"ad645834-0761-42a0-8bf0-dd763b829aac\") " pod="openstack/horizon-844b468785-jt9ph" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.731350 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9356417-fcb6-4b97-9f16-db63e667d6e8-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"c9356417-fcb6-4b97-9f16-db63e667d6e8\") " pod="openstack/glance-default-external-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.731426 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1-run-httpd\") pod \"ceilometer-0\" (UID: \"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1\") " pod="openstack/ceilometer-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.731545 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v8phq\" (UniqueName: \"kubernetes.io/projected/ad645834-0761-42a0-8bf0-dd763b829aac-kube-api-access-v8phq\") pod \"horizon-844b468785-jt9ph\" (UID: \"ad645834-0761-42a0-8bf0-dd763b829aac\") " pod="openstack/horizon-844b468785-jt9ph" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.731584 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad645834-0761-42a0-8bf0-dd763b829aac-config-data\") pod \"horizon-844b468785-jt9ph\" (UID: \"ad645834-0761-42a0-8bf0-dd763b829aac\") " pod="openstack/horizon-844b468785-jt9ph" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.732184 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2729f29f-9520-421a-bb45-917ca4cef6fc-ovsdbserver-nb\") pod \"dnsmasq-dns-69f4d98b5f-9pzj8\" (UID: \"2729f29f-9520-421a-bb45-917ca4cef6fc\") " pod="openstack/dnsmasq-dns-69f4d98b5f-9pzj8" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.732677 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ad645834-0761-42a0-8bf0-dd763b829aac-scripts\") pod \"horizon-844b468785-jt9ph\" (UID: \"ad645834-0761-42a0-8bf0-dd763b829aac\") " pod="openstack/horizon-844b468785-jt9ph" Mar 12 17:08:14 crc 
kubenswrapper[5184]: I0312 17:08:14.732885 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2729f29f-9520-421a-bb45-917ca4cef6fc-config\") pod \"dnsmasq-dns-69f4d98b5f-9pzj8\" (UID: \"2729f29f-9520-421a-bb45-917ca4cef6fc\") " pod="openstack/dnsmasq-dns-69f4d98b5f-9pzj8" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.733038 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad645834-0761-42a0-8bf0-dd763b829aac-config-data\") pod \"horizon-844b468785-jt9ph\" (UID: \"ad645834-0761-42a0-8bf0-dd763b829aac\") " pod="openstack/horizon-844b468785-jt9ph" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.734845 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2729f29f-9520-421a-bb45-917ca4cef6fc-dns-swift-storage-0\") pod \"dnsmasq-dns-69f4d98b5f-9pzj8\" (UID: \"2729f29f-9520-421a-bb45-917ca4cef6fc\") " pod="openstack/dnsmasq-dns-69f4d98b5f-9pzj8" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.735977 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2729f29f-9520-421a-bb45-917ca4cef6fc-ovsdbserver-sb\") pod \"dnsmasq-dns-69f4d98b5f-9pzj8\" (UID: \"2729f29f-9520-421a-bb45-917ca4cef6fc\") " pod="openstack/dnsmasq-dns-69f4d98b5f-9pzj8" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.744681 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ad645834-0761-42a0-8bf0-dd763b829aac-horizon-secret-key\") pod \"horizon-844b468785-jt9ph\" (UID: \"ad645834-0761-42a0-8bf0-dd763b829aac\") " pod="openstack/horizon-844b468785-jt9ph" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.745770 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openstack/ceilometer-0"] Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.747648 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad645834-0761-42a0-8bf0-dd763b829aac-logs\") pod \"horizon-844b468785-jt9ph\" (UID: \"ad645834-0761-42a0-8bf0-dd763b829aac\") " pod="openstack/horizon-844b468785-jt9ph" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.755671 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4vlh\" (UniqueName: \"kubernetes.io/projected/2729f29f-9520-421a-bb45-917ca4cef6fc-kube-api-access-g4vlh\") pod \"dnsmasq-dns-69f4d98b5f-9pzj8\" (UID: \"2729f29f-9520-421a-bb45-917ca4cef6fc\") " pod="openstack/dnsmasq-dns-69f4d98b5f-9pzj8" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.759279 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8phq\" (UniqueName: \"kubernetes.io/projected/ad645834-0761-42a0-8bf0-dd763b829aac-kube-api-access-v8phq\") pod \"horizon-844b468785-jt9ph\" (UID: \"ad645834-0761-42a0-8bf0-dd763b829aac\") " pod="openstack/horizon-844b468785-jt9ph" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.803312 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.815047 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.821631 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-glance-default-internal-svc\"" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.822637 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"glance-default-internal-config-data\"" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.834198 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88e09674-3ed0-4d73-bf8a-18fb1990c892-config-data\") pod \"glance-default-internal-api-0\" (UID: \"88e09674-3ed0-4d73-bf8a-18fb1990c892\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.834276 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9356417-fcb6-4b97-9f16-db63e667d6e8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c9356417-fcb6-4b97-9f16-db63e667d6e8\") " pod="openstack/glance-default-external-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.834327 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"c9356417-fcb6-4b97-9f16-db63e667d6e8\") " pod="openstack/glance-default-external-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.834349 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1\") " pod="openstack/ceilometer-0" Mar 12 17:08:14 
crc kubenswrapper[5184]: I0312 17:08:14.834425 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c9356417-fcb6-4b97-9f16-db63e667d6e8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c9356417-fcb6-4b97-9f16-db63e667d6e8\") " pod="openstack/glance-default-external-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.834452 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"88e09674-3ed0-4d73-bf8a-18fb1990c892\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.834476 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/88e09674-3ed0-4d73-bf8a-18fb1990c892-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"88e09674-3ed0-4d73-bf8a-18fb1990c892\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.834528 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88e09674-3ed0-4d73-bf8a-18fb1990c892-scripts\") pod \"glance-default-internal-api-0\" (UID: \"88e09674-3ed0-4d73-bf8a-18fb1990c892\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.834598 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9356417-fcb6-4b97-9f16-db63e667d6e8-config-data\") pod \"glance-default-external-api-0\" (UID: \"c9356417-fcb6-4b97-9f16-db63e667d6e8\") " pod="openstack/glance-default-external-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.834625 5184 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1-config-data\") pod \"ceilometer-0\" (UID: \"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1\") " pod="openstack/ceilometer-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.834648 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9356417-fcb6-4b97-9f16-db63e667d6e8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c9356417-fcb6-4b97-9f16-db63e667d6e8\") " pod="openstack/glance-default-external-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.834683 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pq98p\" (UniqueName: \"kubernetes.io/projected/c9356417-fcb6-4b97-9f16-db63e667d6e8-kube-api-access-pq98p\") pod \"glance-default-external-api-0\" (UID: \"c9356417-fcb6-4b97-9f16-db63e667d6e8\") " pod="openstack/glance-default-external-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.834720 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1-scripts\") pod \"ceilometer-0\" (UID: \"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1\") " pod="openstack/ceilometer-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.834764 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpqmh\" (UniqueName: \"kubernetes.io/projected/88e09674-3ed0-4d73-bf8a-18fb1990c892-kube-api-access-wpqmh\") pod \"glance-default-internal-api-0\" (UID: \"88e09674-3ed0-4d73-bf8a-18fb1990c892\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.834794 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88e09674-3ed0-4d73-bf8a-18fb1990c892-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"88e09674-3ed0-4d73-bf8a-18fb1990c892\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.834821 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9356417-fcb6-4b97-9f16-db63e667d6e8-scripts\") pod \"glance-default-external-api-0\" (UID: \"c9356417-fcb6-4b97-9f16-db63e667d6e8\") " pod="openstack/glance-default-external-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.834857 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88e09674-3ed0-4d73-bf8a-18fb1990c892-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"88e09674-3ed0-4d73-bf8a-18fb1990c892\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.834884 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1-run-httpd\") pod \"ceilometer-0\" (UID: \"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1\") " pod="openstack/ceilometer-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.835487 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1-run-httpd\") pod \"ceilometer-0\" (UID: \"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1\") " pod="openstack/ceilometer-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.836308 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.839666 5184 operation_generator.go:557] 
"MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"c9356417-fcb6-4b97-9f16-db63e667d6e8\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.840868 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c9356417-fcb6-4b97-9f16-db63e667d6e8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c9356417-fcb6-4b97-9f16-db63e667d6e8\") " pod="openstack/glance-default-external-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.841215 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88e09674-3ed0-4d73-bf8a-18fb1990c892-logs\") pod \"glance-default-internal-api-0\" (UID: \"88e09674-3ed0-4d73-bf8a-18fb1990c892\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.841293 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1\") " pod="openstack/ceilometer-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.841313 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pqtqq\" (UniqueName: \"kubernetes.io/projected/e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1-kube-api-access-pqtqq\") pod \"ceilometer-0\" (UID: \"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1\") " pod="openstack/ceilometer-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.841428 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1-log-httpd\") pod \"ceilometer-0\" (UID: \"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1\") " pod="openstack/ceilometer-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.843068 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9356417-fcb6-4b97-9f16-db63e667d6e8-logs\") pod \"glance-default-external-api-0\" (UID: \"c9356417-fcb6-4b97-9f16-db63e667d6e8\") " pod="openstack/glance-default-external-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.843762 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9356417-fcb6-4b97-9f16-db63e667d6e8-logs\") pod \"glance-default-external-api-0\" (UID: \"c9356417-fcb6-4b97-9f16-db63e667d6e8\") " pod="openstack/glance-default-external-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.846787 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1-log-httpd\") pod \"ceilometer-0\" (UID: \"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1\") " pod="openstack/ceilometer-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.848000 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1\") " pod="openstack/ceilometer-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.848991 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9356417-fcb6-4b97-9f16-db63e667d6e8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c9356417-fcb6-4b97-9f16-db63e667d6e8\") " pod="openstack/glance-default-external-api-0" Mar 12 
17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.851811 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9356417-fcb6-4b97-9f16-db63e667d6e8-config-data\") pod \"glance-default-external-api-0\" (UID: \"c9356417-fcb6-4b97-9f16-db63e667d6e8\") " pod="openstack/glance-default-external-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.855271 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9356417-fcb6-4b97-9f16-db63e667d6e8-scripts\") pod \"glance-default-external-api-0\" (UID: \"c9356417-fcb6-4b97-9f16-db63e667d6e8\") " pod="openstack/glance-default-external-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.861294 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1-scripts\") pod \"ceilometer-0\" (UID: \"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1\") " pod="openstack/ceilometer-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.862747 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1\") " pod="openstack/ceilometer-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.866022 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9356417-fcb6-4b97-9f16-db63e667d6e8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c9356417-fcb6-4b97-9f16-db63e667d6e8\") " pod="openstack/glance-default-external-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.874858 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq98p\" (UniqueName: 
\"kubernetes.io/projected/c9356417-fcb6-4b97-9f16-db63e667d6e8-kube-api-access-pq98p\") pod \"glance-default-external-api-0\" (UID: \"c9356417-fcb6-4b97-9f16-db63e667d6e8\") " pod="openstack/glance-default-external-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.879059 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqtqq\" (UniqueName: \"kubernetes.io/projected/e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1-kube-api-access-pqtqq\") pod \"ceilometer-0\" (UID: \"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1\") " pod="openstack/ceilometer-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.879114 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1-config-data\") pod \"ceilometer-0\" (UID: \"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1\") " pod="openstack/ceilometer-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.902543 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-mvj44" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.912093 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-844b468785-jt9ph" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.926796 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69f4d98b5f-9pzj8" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.950215 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88e09674-3ed0-4d73-bf8a-18fb1990c892-config-data\") pod \"glance-default-internal-api-0\" (UID: \"88e09674-3ed0-4d73-bf8a-18fb1990c892\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.950313 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"88e09674-3ed0-4d73-bf8a-18fb1990c892\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.950336 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/88e09674-3ed0-4d73-bf8a-18fb1990c892-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"88e09674-3ed0-4d73-bf8a-18fb1990c892\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.950404 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88e09674-3ed0-4d73-bf8a-18fb1990c892-scripts\") pod \"glance-default-internal-api-0\" (UID: \"88e09674-3ed0-4d73-bf8a-18fb1990c892\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.950516 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wpqmh\" (UniqueName: \"kubernetes.io/projected/88e09674-3ed0-4d73-bf8a-18fb1990c892-kube-api-access-wpqmh\") pod \"glance-default-internal-api-0\" (UID: \"88e09674-3ed0-4d73-bf8a-18fb1990c892\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:08:14 crc 
kubenswrapper[5184]: I0312 17:08:14.950543 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88e09674-3ed0-4d73-bf8a-18fb1990c892-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"88e09674-3ed0-4d73-bf8a-18fb1990c892\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.950581 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88e09674-3ed0-4d73-bf8a-18fb1990c892-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"88e09674-3ed0-4d73-bf8a-18fb1990c892\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.950652 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88e09674-3ed0-4d73-bf8a-18fb1990c892-logs\") pod \"glance-default-internal-api-0\" (UID: \"88e09674-3ed0-4d73-bf8a-18fb1990c892\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.951233 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88e09674-3ed0-4d73-bf8a-18fb1990c892-logs\") pod \"glance-default-internal-api-0\" (UID: \"88e09674-3ed0-4d73-bf8a-18fb1990c892\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.952403 5184 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"88e09674-3ed0-4d73-bf8a-18fb1990c892\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.953540 5184 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/88e09674-3ed0-4d73-bf8a-18fb1990c892-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"88e09674-3ed0-4d73-bf8a-18fb1990c892\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.961147 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88e09674-3ed0-4d73-bf8a-18fb1990c892-scripts\") pod \"glance-default-internal-api-0\" (UID: \"88e09674-3ed0-4d73-bf8a-18fb1990c892\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.961336 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88e09674-3ed0-4d73-bf8a-18fb1990c892-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"88e09674-3ed0-4d73-bf8a-18fb1990c892\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.964448 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88e09674-3ed0-4d73-bf8a-18fb1990c892-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"88e09674-3ed0-4d73-bf8a-18fb1990c892\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.964771 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"c9356417-fcb6-4b97-9f16-db63e667d6e8\") " pod="openstack/glance-default-external-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.967028 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/88e09674-3ed0-4d73-bf8a-18fb1990c892-config-data\") pod \"glance-default-internal-api-0\" (UID: \"88e09674-3ed0-4d73-bf8a-18fb1990c892\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:08:14 crc kubenswrapper[5184]: I0312 17:08:14.982028 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpqmh\" (UniqueName: \"kubernetes.io/projected/88e09674-3ed0-4d73-bf8a-18fb1990c892-kube-api-access-wpqmh\") pod \"glance-default-internal-api-0\" (UID: \"88e09674-3ed0-4d73-bf8a-18fb1990c892\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:08:15 crc kubenswrapper[5184]: I0312 17:08:15.019207 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dplgw"] Mar 12 17:08:15 crc kubenswrapper[5184]: I0312 17:08:15.019792 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"88e09674-3ed0-4d73-bf8a-18fb1990c892\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:08:15 crc kubenswrapper[5184]: I0312 17:08:15.037742 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 12 17:08:15 crc kubenswrapper[5184]: I0312 17:08:15.045292 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dplgw" Mar 12 17:08:15 crc kubenswrapper[5184]: I0312 17:08:15.080968 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dplgw"] Mar 12 17:08:15 crc kubenswrapper[5184]: I0312 17:08:15.096740 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 12 17:08:15 crc kubenswrapper[5184]: I0312 17:08:15.153546 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3f7d154-f90e-4731-bc04-00b13b3fbfd8-utilities\") pod \"community-operators-dplgw\" (UID: \"d3f7d154-f90e-4731-bc04-00b13b3fbfd8\") " pod="openshift-marketplace/community-operators-dplgw" Mar 12 17:08:15 crc kubenswrapper[5184]: I0312 17:08:15.153783 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3f7d154-f90e-4731-bc04-00b13b3fbfd8-catalog-content\") pod \"community-operators-dplgw\" (UID: \"d3f7d154-f90e-4731-bc04-00b13b3fbfd8\") " pod="openshift-marketplace/community-operators-dplgw" Mar 12 17:08:15 crc kubenswrapper[5184]: I0312 17:08:15.153886 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz64q\" (UniqueName: \"kubernetes.io/projected/d3f7d154-f90e-4731-bc04-00b13b3fbfd8-kube-api-access-rz64q\") pod \"community-operators-dplgw\" (UID: \"d3f7d154-f90e-4731-bc04-00b13b3fbfd8\") " pod="openshift-marketplace/community-operators-dplgw" Mar 12 17:08:15 crc kubenswrapper[5184]: I0312 17:08:15.160231 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-5bzsl"] Mar 12 17:08:15 crc kubenswrapper[5184]: I0312 17:08:15.184923 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 12 17:08:15 crc kubenswrapper[5184]: I0312 17:08:15.234767 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-657854d787-pks8w"] Mar 12 17:08:15 crc kubenswrapper[5184]: I0312 17:08:15.258477 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3f7d154-f90e-4731-bc04-00b13b3fbfd8-catalog-content\") pod \"community-operators-dplgw\" (UID: \"d3f7d154-f90e-4731-bc04-00b13b3fbfd8\") " pod="openshift-marketplace/community-operators-dplgw" Mar 12 17:08:15 crc kubenswrapper[5184]: I0312 17:08:15.258618 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rz64q\" (UniqueName: \"kubernetes.io/projected/d3f7d154-f90e-4731-bc04-00b13b3fbfd8-kube-api-access-rz64q\") pod \"community-operators-dplgw\" (UID: \"d3f7d154-f90e-4731-bc04-00b13b3fbfd8\") " pod="openshift-marketplace/community-operators-dplgw" Mar 12 17:08:15 crc kubenswrapper[5184]: I0312 17:08:15.259220 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3f7d154-f90e-4731-bc04-00b13b3fbfd8-utilities\") pod \"community-operators-dplgw\" (UID: \"d3f7d154-f90e-4731-bc04-00b13b3fbfd8\") " pod="openshift-marketplace/community-operators-dplgw" Mar 12 17:08:15 crc kubenswrapper[5184]: I0312 17:08:15.260911 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3f7d154-f90e-4731-bc04-00b13b3fbfd8-utilities\") pod \"community-operators-dplgw\" (UID: \"d3f7d154-f90e-4731-bc04-00b13b3fbfd8\") " pod="openshift-marketplace/community-operators-dplgw" Mar 12 17:08:15 crc kubenswrapper[5184]: I0312 17:08:15.261258 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d3f7d154-f90e-4731-bc04-00b13b3fbfd8-catalog-content\") pod \"community-operators-dplgw\" (UID: \"d3f7d154-f90e-4731-bc04-00b13b3fbfd8\") " pod="openshift-marketplace/community-operators-dplgw" Mar 12 17:08:15 crc kubenswrapper[5184]: I0312 17:08:15.310197 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz64q\" (UniqueName: \"kubernetes.io/projected/d3f7d154-f90e-4731-bc04-00b13b3fbfd8-kube-api-access-rz64q\") pod \"community-operators-dplgw\" (UID: \"d3f7d154-f90e-4731-bc04-00b13b3fbfd8\") " pod="openshift-marketplace/community-operators-dplgw" Mar 12 17:08:15 crc kubenswrapper[5184]: I0312 17:08:15.342930 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-85fc5cfbb9-trxq5"] Mar 12 17:08:15 crc kubenswrapper[5184]: I0312 17:08:15.370503 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dplgw" Mar 12 17:08:15 crc kubenswrapper[5184]: I0312 17:08:15.707024 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5bzsl" event={"ID":"0a764aff-f5dc-49da-a606-84cbabca2db3","Type":"ContainerStarted","Data":"815ab43b87cabe34faa30eeb7ed5b4ef343c11259f5ba37b82958a2a7682297b"} Mar 12 17:08:15 crc kubenswrapper[5184]: I0312 17:08:15.729248 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85fc5cfbb9-trxq5" event={"ID":"646cdc1b-863a-4b58-8869-fcbc386a96e2","Type":"ContainerStarted","Data":"7d45979f4b81e06da62a9752ff1c6a0e7a4c65e5c6ff371738893ceafa67b541"} Mar 12 17:08:15 crc kubenswrapper[5184]: I0312 17:08:15.745346 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b9d876767-jcfwx" podUID="65a92d9a-7cc8-4e94-969e-7f31ed20c416" containerName="dnsmasq-dns" containerID="cri-o://23e87f1e1356b95e282fd08dd2ccc0d5c034e89393aa750a719ebd09d6f5bfbb" gracePeriod=10 Mar 12 17:08:15 crc kubenswrapper[5184]: 
I0312 17:08:15.745779 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-657854d787-pks8w" event={"ID":"62f66fc0-329e-4a90-92d8-2af29474e66c","Type":"ContainerStarted","Data":"29ae178d7d56e872a27200c3086d267d4512e521e698636633ce8a8a497fbe91"} Mar 12 17:08:15 crc kubenswrapper[5184]: I0312 17:08:15.863775 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-gzjgn"] Mar 12 17:08:15 crc kubenswrapper[5184]: W0312 17:08:15.884949 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ed3fdf4_b869_4ef8_b746_afe1e52fe286.slice/crio-9e1f2f35c99cea5d95fefacc3828695859439226470ffb365ced413e6f507687 WatchSource:0}: Error finding container 9e1f2f35c99cea5d95fefacc3828695859439226470ffb365ced413e6f507687: Status 404 returned error can't find the container with id 9e1f2f35c99cea5d95fefacc3828695859439226470ffb365ced413e6f507687 Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.110851 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-z62x6"] Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.130624 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z62x6" Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.143325 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z62x6"] Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.286016 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3521e399-e317-459a-badc-0b4695197ac0-catalog-content\") pod \"certified-operators-z62x6\" (UID: \"3521e399-e317-459a-badc-0b4695197ac0\") " pod="openshift-marketplace/certified-operators-z62x6" Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.286607 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9xjv\" (UniqueName: \"kubernetes.io/projected/3521e399-e317-459a-badc-0b4695197ac0-kube-api-access-n9xjv\") pod \"certified-operators-z62x6\" (UID: \"3521e399-e317-459a-badc-0b4695197ac0\") " pod="openshift-marketplace/certified-operators-z62x6" Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.286678 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3521e399-e317-459a-badc-0b4695197ac0-utilities\") pod \"certified-operators-z62x6\" (UID: \"3521e399-e317-459a-badc-0b4695197ac0\") " pod="openshift-marketplace/certified-operators-z62x6" Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.388662 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3521e399-e317-459a-badc-0b4695197ac0-utilities\") pod \"certified-operators-z62x6\" (UID: \"3521e399-e317-459a-badc-0b4695197ac0\") " pod="openshift-marketplace/certified-operators-z62x6" Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.388803 5184 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3521e399-e317-459a-badc-0b4695197ac0-catalog-content\") pod \"certified-operators-z62x6\" (UID: \"3521e399-e317-459a-badc-0b4695197ac0\") " pod="openshift-marketplace/certified-operators-z62x6" Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.388878 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n9xjv\" (UniqueName: \"kubernetes.io/projected/3521e399-e317-459a-badc-0b4695197ac0-kube-api-access-n9xjv\") pod \"certified-operators-z62x6\" (UID: \"3521e399-e317-459a-badc-0b4695197ac0\") " pod="openshift-marketplace/certified-operators-z62x6" Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.390094 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3521e399-e317-459a-badc-0b4695197ac0-utilities\") pod \"certified-operators-z62x6\" (UID: \"3521e399-e317-459a-badc-0b4695197ac0\") " pod="openshift-marketplace/certified-operators-z62x6" Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.396763 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-vtzf9"] Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.390368 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3521e399-e317-459a-badc-0b4695197ac0-catalog-content\") pod \"certified-operators-z62x6\" (UID: \"3521e399-e317-459a-badc-0b4695197ac0\") " pod="openshift-marketplace/certified-operators-z62x6" Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.429362 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9xjv\" (UniqueName: \"kubernetes.io/projected/3521e399-e317-459a-badc-0b4695197ac0-kube-api-access-n9xjv\") pod \"certified-operators-z62x6\" (UID: \"3521e399-e317-459a-badc-0b4695197ac0\") " 
pod="openshift-marketplace/certified-operators-z62x6" Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.492601 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z62x6" Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.554487 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-jb66b"] Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.581038 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-mvj44"] Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.604920 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69f4d98b5f-9pzj8"] Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.619890 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-844b468785-jt9ph"] Mar 12 17:08:16 crc kubenswrapper[5184]: W0312 17:08:16.669328 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2729f29f_9520_421a_bb45_917ca4cef6fc.slice/crio-df0de2ae8b3120425039869498bfc49ae780249e053bedffacd1f85bbe62c106 WatchSource:0}: Error finding container df0de2ae8b3120425039869498bfc49ae780249e053bedffacd1f85bbe62c106: Status 404 returned error can't find the container with id df0de2ae8b3120425039869498bfc49ae780249e053bedffacd1f85bbe62c106 Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.670111 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b9d876767-jcfwx" Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.762725 5184 generic.go:358] "Generic (PLEG): container finished" podID="65a92d9a-7cc8-4e94-969e-7f31ed20c416" containerID="23e87f1e1356b95e282fd08dd2ccc0d5c034e89393aa750a719ebd09d6f5bfbb" exitCode=0 Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.763288 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b9d876767-jcfwx" event={"ID":"65a92d9a-7cc8-4e94-969e-7f31ed20c416","Type":"ContainerDied","Data":"23e87f1e1356b95e282fd08dd2ccc0d5c034e89393aa750a719ebd09d6f5bfbb"} Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.763316 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b9d876767-jcfwx" event={"ID":"65a92d9a-7cc8-4e94-969e-7f31ed20c416","Type":"ContainerDied","Data":"af6414343bfe44ffaf9d55d0b0dfc668dd44808c2b7cd30f22d810d4d7baa938"} Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.763333 5184 scope.go:117] "RemoveContainer" containerID="23e87f1e1356b95e282fd08dd2ccc0d5c034e89393aa750a719ebd09d6f5bfbb" Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.763549 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b9d876767-jcfwx" Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.770754 5184 generic.go:358] "Generic (PLEG): container finished" podID="62f66fc0-329e-4a90-92d8-2af29474e66c" containerID="46133b6ac7d76f48a753facdcf31a865d6acf3f1e3c4a6cdcf1b5951539c6354" exitCode=0 Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.770907 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-657854d787-pks8w" event={"ID":"62f66fc0-329e-4a90-92d8-2af29474e66c","Type":"ContainerDied","Data":"46133b6ac7d76f48a753facdcf31a865d6acf3f1e3c4a6cdcf1b5951539c6354"} Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.782840 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-gzjgn" event={"ID":"7ed3fdf4-b869-4ef8-b746-afe1e52fe286","Type":"ContainerStarted","Data":"9e1f2f35c99cea5d95fefacc3828695859439226470ffb365ced413e6f507687"} Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.807317 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65a92d9a-7cc8-4e94-969e-7f31ed20c416-config\") pod \"65a92d9a-7cc8-4e94-969e-7f31ed20c416\" (UID: \"65a92d9a-7cc8-4e94-969e-7f31ed20c416\") " Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.807399 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/65a92d9a-7cc8-4e94-969e-7f31ed20c416-dns-swift-storage-0\") pod \"65a92d9a-7cc8-4e94-969e-7f31ed20c416\" (UID: \"65a92d9a-7cc8-4e94-969e-7f31ed20c416\") " Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.807426 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/65a92d9a-7cc8-4e94-969e-7f31ed20c416-dns-svc\") pod \"65a92d9a-7cc8-4e94-969e-7f31ed20c416\" (UID: \"65a92d9a-7cc8-4e94-969e-7f31ed20c416\") " Mar 12 
17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.807471 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/65a92d9a-7cc8-4e94-969e-7f31ed20c416-ovsdbserver-nb\") pod \"65a92d9a-7cc8-4e94-969e-7f31ed20c416\" (UID: \"65a92d9a-7cc8-4e94-969e-7f31ed20c416\") " Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.807495 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9k6g4\" (UniqueName: \"kubernetes.io/projected/65a92d9a-7cc8-4e94-969e-7f31ed20c416-kube-api-access-9k6g4\") pod \"65a92d9a-7cc8-4e94-969e-7f31ed20c416\" (UID: \"65a92d9a-7cc8-4e94-969e-7f31ed20c416\") " Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.807717 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/65a92d9a-7cc8-4e94-969e-7f31ed20c416-ovsdbserver-sb\") pod \"65a92d9a-7cc8-4e94-969e-7f31ed20c416\" (UID: \"65a92d9a-7cc8-4e94-969e-7f31ed20c416\") " Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.818095 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69f4d98b5f-9pzj8" event={"ID":"2729f29f-9520-421a-bb45-917ca4cef6fc","Type":"ContainerStarted","Data":"df0de2ae8b3120425039869498bfc49ae780249e053bedffacd1f85bbe62c106"} Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.830099 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vtzf9" event={"ID":"59bc23c2-fe37-43e7-a1a9-2830892902bf","Type":"ContainerStarted","Data":"fe6b77bb72787b75021ac12606ee3a1cb7be757bda61a018f84879b452739ff1"} Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.834750 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65a92d9a-7cc8-4e94-969e-7f31ed20c416-kube-api-access-9k6g4" (OuterVolumeSpecName: "kube-api-access-9k6g4") pod 
"65a92d9a-7cc8-4e94-969e-7f31ed20c416" (UID: "65a92d9a-7cc8-4e94-969e-7f31ed20c416"). InnerVolumeSpecName "kube-api-access-9k6g4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.839410 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5bzsl" event={"ID":"0a764aff-f5dc-49da-a606-84cbabca2db3","Type":"ContainerStarted","Data":"2b059a0618396fd4c44a921ac9e8a2ef55e86f0811d130bfc354b1031772e615"} Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.844278 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mvj44" event={"ID":"a9bd8488-49bb-48df-8f41-f415f71a2834","Type":"ContainerStarted","Data":"a9a8b329023b33530391e4abfb62ee455c571e9095db2e15d8ba23f3dcd5ebca"} Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.851478 5184 scope.go:117] "RemoveContainer" containerID="3c379b5504a28648ab22dff78c26bf4177e09147cb6851bc6281f4476261a710" Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.856157 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-844b468785-jt9ph" event={"ID":"ad645834-0761-42a0-8bf0-dd763b829aac","Type":"ContainerStarted","Data":"2d4f8d974716a680f2dd7afdc80e320c44b685e551dd94303c2bf236e0a9d37e"} Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.882771 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dplgw"] Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.885986 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-jb66b" event={"ID":"d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c","Type":"ContainerStarted","Data":"392ea1029e9ff9d30e6ffb02723494350983ddbdde6b1005a78691790fee798e"} Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.891136 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.899791 5184 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-5bzsl" podStartSLOduration=3.899769409 podStartE2EDuration="3.899769409s" podCreationTimestamp="2026-03-12 17:08:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:08:16.885857521 +0000 UTC m=+1039.427168860" watchObservedRunningTime="2026-03-12 17:08:16.899769409 +0000 UTC m=+1039.441080748"
Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.918133 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9k6g4\" (UniqueName: \"kubernetes.io/projected/65a92d9a-7cc8-4e94-969e-7f31ed20c416-kube-api-access-9k6g4\") on node \"crc\" DevicePath \"\""
Mar 12 17:08:16 crc kubenswrapper[5184]: I0312 17:08:16.968027 5184 scope.go:117] "RemoveContainer" containerID="23e87f1e1356b95e282fd08dd2ccc0d5c034e89393aa750a719ebd09d6f5bfbb"
Mar 12 17:08:17 crc kubenswrapper[5184]: E0312 17:08:17.077350 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23e87f1e1356b95e282fd08dd2ccc0d5c034e89393aa750a719ebd09d6f5bfbb\": container with ID starting with 23e87f1e1356b95e282fd08dd2ccc0d5c034e89393aa750a719ebd09d6f5bfbb not found: ID does not exist" containerID="23e87f1e1356b95e282fd08dd2ccc0d5c034e89393aa750a719ebd09d6f5bfbb"
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.077434 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23e87f1e1356b95e282fd08dd2ccc0d5c034e89393aa750a719ebd09d6f5bfbb"} err="failed to get container status \"23e87f1e1356b95e282fd08dd2ccc0d5c034e89393aa750a719ebd09d6f5bfbb\": rpc error: code = NotFound desc = could not find container \"23e87f1e1356b95e282fd08dd2ccc0d5c034e89393aa750a719ebd09d6f5bfbb\": container with ID starting with 23e87f1e1356b95e282fd08dd2ccc0d5c034e89393aa750a719ebd09d6f5bfbb not found: ID does not exist"
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.077483 5184 scope.go:117] "RemoveContainer" containerID="3c379b5504a28648ab22dff78c26bf4177e09147cb6851bc6281f4476261a710"
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.085593 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65a92d9a-7cc8-4e94-969e-7f31ed20c416-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "65a92d9a-7cc8-4e94-969e-7f31ed20c416" (UID: "65a92d9a-7cc8-4e94-969e-7f31ed20c416"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.100164 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.110832 5184 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/65a92d9a-7cc8-4e94-969e-7f31ed20c416-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 12 17:08:17 crc kubenswrapper[5184]: E0312 17:08:17.148761 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c379b5504a28648ab22dff78c26bf4177e09147cb6851bc6281f4476261a710\": container with ID starting with 3c379b5504a28648ab22dff78c26bf4177e09147cb6851bc6281f4476261a710 not found: ID does not exist" containerID="3c379b5504a28648ab22dff78c26bf4177e09147cb6851bc6281f4476261a710"
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.148853 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c379b5504a28648ab22dff78c26bf4177e09147cb6851bc6281f4476261a710"} err="failed to get container status \"3c379b5504a28648ab22dff78c26bf4177e09147cb6851bc6281f4476261a710\": rpc error: code = NotFound desc = could not find container \"3c379b5504a28648ab22dff78c26bf4177e09147cb6851bc6281f4476261a710\": container with ID starting with 3c379b5504a28648ab22dff78c26bf4177e09147cb6851bc6281f4476261a710 not found: ID does not exist"
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.208410 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.217995 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/horizon-844b468785-jt9ph"]
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.231763 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.244175 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/horizon-66446fcd8f-lmflm"]
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.245222 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="65a92d9a-7cc8-4e94-969e-7f31ed20c416" containerName="init"
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.245239 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a92d9a-7cc8-4e94-969e-7f31ed20c416" containerName="init"
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.245272 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="65a92d9a-7cc8-4e94-969e-7f31ed20c416" containerName="dnsmasq-dns"
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.245277 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a92d9a-7cc8-4e94-969e-7f31ed20c416" containerName="dnsmasq-dns"
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.245454 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="65a92d9a-7cc8-4e94-969e-7f31ed20c416" containerName="dnsmasq-dns"
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.270250 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-66446fcd8f-lmflm"]
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.270360 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-66446fcd8f-lmflm"
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.315471 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.430623 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af22a969-32c4-4628-8667-be2162f7d92d-logs\") pod \"horizon-66446fcd8f-lmflm\" (UID: \"af22a969-32c4-4628-8667-be2162f7d92d\") " pod="openstack/horizon-66446fcd8f-lmflm"
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.430688 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/af22a969-32c4-4628-8667-be2162f7d92d-horizon-secret-key\") pod \"horizon-66446fcd8f-lmflm\" (UID: \"af22a969-32c4-4628-8667-be2162f7d92d\") " pod="openstack/horizon-66446fcd8f-lmflm"
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.430743 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/af22a969-32c4-4628-8667-be2162f7d92d-config-data\") pod \"horizon-66446fcd8f-lmflm\" (UID: \"af22a969-32c4-4628-8667-be2162f7d92d\") " pod="openstack/horizon-66446fcd8f-lmflm"
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.430799 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lz69\" (UniqueName: \"kubernetes.io/projected/af22a969-32c4-4628-8667-be2162f7d92d-kube-api-access-6lz69\") pod \"horizon-66446fcd8f-lmflm\" (UID: \"af22a969-32c4-4628-8667-be2162f7d92d\") " pod="openstack/horizon-66446fcd8f-lmflm"
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.430829 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af22a969-32c4-4628-8667-be2162f7d92d-scripts\") pod \"horizon-66446fcd8f-lmflm\" (UID: \"af22a969-32c4-4628-8667-be2162f7d92d\") " pod="openstack/horizon-66446fcd8f-lmflm"
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.534108 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af22a969-32c4-4628-8667-be2162f7d92d-logs\") pod \"horizon-66446fcd8f-lmflm\" (UID: \"af22a969-32c4-4628-8667-be2162f7d92d\") " pod="openstack/horizon-66446fcd8f-lmflm"
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.534603 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/af22a969-32c4-4628-8667-be2162f7d92d-horizon-secret-key\") pod \"horizon-66446fcd8f-lmflm\" (UID: \"af22a969-32c4-4628-8667-be2162f7d92d\") " pod="openstack/horizon-66446fcd8f-lmflm"
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.534710 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/af22a969-32c4-4628-8667-be2162f7d92d-config-data\") pod \"horizon-66446fcd8f-lmflm\" (UID: \"af22a969-32c4-4628-8667-be2162f7d92d\") " pod="openstack/horizon-66446fcd8f-lmflm"
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.534809 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6lz69\" (UniqueName: \"kubernetes.io/projected/af22a969-32c4-4628-8667-be2162f7d92d-kube-api-access-6lz69\") pod \"horizon-66446fcd8f-lmflm\" (UID: \"af22a969-32c4-4628-8667-be2162f7d92d\") " pod="openstack/horizon-66446fcd8f-lmflm"
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.534847 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af22a969-32c4-4628-8667-be2162f7d92d-scripts\") pod \"horizon-66446fcd8f-lmflm\" (UID: \"af22a969-32c4-4628-8667-be2162f7d92d\") " pod="openstack/horizon-66446fcd8f-lmflm"
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.536346 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af22a969-32c4-4628-8667-be2162f7d92d-scripts\") pod \"horizon-66446fcd8f-lmflm\" (UID: \"af22a969-32c4-4628-8667-be2162f7d92d\") " pod="openstack/horizon-66446fcd8f-lmflm"
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.538437 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/af22a969-32c4-4628-8667-be2162f7d92d-config-data\") pod \"horizon-66446fcd8f-lmflm\" (UID: \"af22a969-32c4-4628-8667-be2162f7d92d\") " pod="openstack/horizon-66446fcd8f-lmflm"
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.539292 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af22a969-32c4-4628-8667-be2162f7d92d-logs\") pod \"horizon-66446fcd8f-lmflm\" (UID: \"af22a969-32c4-4628-8667-be2162f7d92d\") " pod="openstack/horizon-66446fcd8f-lmflm"
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.567801 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/af22a969-32c4-4628-8667-be2162f7d92d-horizon-secret-key\") pod \"horizon-66446fcd8f-lmflm\" (UID: \"af22a969-32c4-4628-8667-be2162f7d92d\") " pod="openstack/horizon-66446fcd8f-lmflm"
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.570173 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lz69\" (UniqueName: \"kubernetes.io/projected/af22a969-32c4-4628-8667-be2162f7d92d-kube-api-access-6lz69\") pod \"horizon-66446fcd8f-lmflm\" (UID: \"af22a969-32c4-4628-8667-be2162f7d92d\") " pod="openstack/horizon-66446fcd8f-lmflm"
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.633205 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65a92d9a-7cc8-4e94-969e-7f31ed20c416-config" (OuterVolumeSpecName: "config") pod "65a92d9a-7cc8-4e94-969e-7f31ed20c416" (UID: "65a92d9a-7cc8-4e94-969e-7f31ed20c416"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.641249 5184 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65a92d9a-7cc8-4e94-969e-7f31ed20c416-config\") on node \"crc\" DevicePath \"\""
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.654720 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-66446fcd8f-lmflm"
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.664115 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65a92d9a-7cc8-4e94-969e-7f31ed20c416-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "65a92d9a-7cc8-4e94-969e-7f31ed20c416" (UID: "65a92d9a-7cc8-4e94-969e-7f31ed20c416"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.694034 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65a92d9a-7cc8-4e94-969e-7f31ed20c416-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "65a92d9a-7cc8-4e94-969e-7f31ed20c416" (UID: "65a92d9a-7cc8-4e94-969e-7f31ed20c416"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.715272 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65a92d9a-7cc8-4e94-969e-7f31ed20c416-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "65a92d9a-7cc8-4e94-969e-7f31ed20c416" (UID: "65a92d9a-7cc8-4e94-969e-7f31ed20c416"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.749638 5184 reconciler_common.go:299] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/65a92d9a-7cc8-4e94-969e-7f31ed20c416-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.749687 5184 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/65a92d9a-7cc8-4e94-969e-7f31ed20c416-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.749699 5184 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/65a92d9a-7cc8-4e94-969e-7f31ed20c416-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.790126 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.821081 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z62x6"]
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.902978 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"88e09674-3ed0-4d73-bf8a-18fb1990c892","Type":"ContainerStarted","Data":"8955c3057fe003ce0bb49873dec406ab7ae450a620f92d72ce46efed7c2dcc7d"}
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.924826 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dplgw" event={"ID":"d3f7d154-f90e-4731-bc04-00b13b3fbfd8","Type":"ContainerStarted","Data":"226f977bc3ccfc2db5f1e65ba18c28cd7276dacba624a7cc20ef0329a4d8f6dc"}
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.968314 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-657854d787-pks8w" event={"ID":"62f66fc0-329e-4a90-92d8-2af29474e66c","Type":"ContainerDied","Data":"29ae178d7d56e872a27200c3086d267d4512e521e698636633ce8a8a497fbe91"}
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.968652 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29ae178d7d56e872a27200c3086d267d4512e521e698636633ce8a8a497fbe91"
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.977540 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1","Type":"ContainerStarted","Data":"feb72b544bc8f663ca8265f1fa1967854078871065d0ee851bc7a90413f9e18a"}
Mar 12 17:08:17 crc kubenswrapper[5184]: I0312 17:08:17.989503 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vtzf9" event={"ID":"59bc23c2-fe37-43e7-a1a9-2830892902bf","Type":"ContainerStarted","Data":"27fa142bf68698e87c9b60da45f68b7ca5810536c39991c80fd7de3a9b2f6aab"}
Mar 12 17:08:18 crc kubenswrapper[5184]: I0312 17:08:18.022097 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-vtzf9" podStartSLOduration=4.022076013 podStartE2EDuration="4.022076013s" podCreationTimestamp="2026-03-12 17:08:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:08:18.014587657 +0000 UTC m=+1040.555899086" watchObservedRunningTime="2026-03-12 17:08:18.022076013 +0000 UTC m=+1040.563387352"
Mar 12 17:08:18 crc kubenswrapper[5184]: I0312 17:08:18.089722 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b9d876767-jcfwx"]
Mar 12 17:08:18 crc kubenswrapper[5184]: I0312 17:08:18.107107 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b9d876767-jcfwx"]
Mar 12 17:08:18 crc kubenswrapper[5184]: I0312 17:08:18.108456 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-657854d787-pks8w"
Mar 12 17:08:18 crc kubenswrapper[5184]: I0312 17:08:18.264254 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t858d\" (UniqueName: \"kubernetes.io/projected/62f66fc0-329e-4a90-92d8-2af29474e66c-kube-api-access-t858d\") pod \"62f66fc0-329e-4a90-92d8-2af29474e66c\" (UID: \"62f66fc0-329e-4a90-92d8-2af29474e66c\") "
Mar 12 17:08:18 crc kubenswrapper[5184]: I0312 17:08:18.264405 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62f66fc0-329e-4a90-92d8-2af29474e66c-ovsdbserver-sb\") pod \"62f66fc0-329e-4a90-92d8-2af29474e66c\" (UID: \"62f66fc0-329e-4a90-92d8-2af29474e66c\") "
Mar 12 17:08:18 crc kubenswrapper[5184]: I0312 17:08:18.264444 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62f66fc0-329e-4a90-92d8-2af29474e66c-ovsdbserver-nb\") pod \"62f66fc0-329e-4a90-92d8-2af29474e66c\" (UID: \"62f66fc0-329e-4a90-92d8-2af29474e66c\") "
Mar 12 17:08:18 crc kubenswrapper[5184]: I0312 17:08:18.264474 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/62f66fc0-329e-4a90-92d8-2af29474e66c-dns-swift-storage-0\") pod \"62f66fc0-329e-4a90-92d8-2af29474e66c\" (UID: \"62f66fc0-329e-4a90-92d8-2af29474e66c\") "
Mar 12 17:08:18 crc kubenswrapper[5184]: I0312 17:08:18.264594 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62f66fc0-329e-4a90-92d8-2af29474e66c-config\") pod \"62f66fc0-329e-4a90-92d8-2af29474e66c\" (UID: \"62f66fc0-329e-4a90-92d8-2af29474e66c\") "
Mar 12 17:08:18 crc kubenswrapper[5184]: I0312 17:08:18.264749 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62f66fc0-329e-4a90-92d8-2af29474e66c-dns-svc\") pod \"62f66fc0-329e-4a90-92d8-2af29474e66c\" (UID: \"62f66fc0-329e-4a90-92d8-2af29474e66c\") "
Mar 12 17:08:18 crc kubenswrapper[5184]: I0312 17:08:18.275225 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62f66fc0-329e-4a90-92d8-2af29474e66c-kube-api-access-t858d" (OuterVolumeSpecName: "kube-api-access-t858d") pod "62f66fc0-329e-4a90-92d8-2af29474e66c" (UID: "62f66fc0-329e-4a90-92d8-2af29474e66c"). InnerVolumeSpecName "kube-api-access-t858d". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:08:18 crc kubenswrapper[5184]: I0312 17:08:18.323998 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62f66fc0-329e-4a90-92d8-2af29474e66c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "62f66fc0-329e-4a90-92d8-2af29474e66c" (UID: "62f66fc0-329e-4a90-92d8-2af29474e66c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 17:08:18 crc kubenswrapper[5184]: I0312 17:08:18.328553 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62f66fc0-329e-4a90-92d8-2af29474e66c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "62f66fc0-329e-4a90-92d8-2af29474e66c" (UID: "62f66fc0-329e-4a90-92d8-2af29474e66c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 17:08:18 crc kubenswrapper[5184]: I0312 17:08:18.334027 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62f66fc0-329e-4a90-92d8-2af29474e66c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "62f66fc0-329e-4a90-92d8-2af29474e66c" (UID: "62f66fc0-329e-4a90-92d8-2af29474e66c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 17:08:18 crc kubenswrapper[5184]: I0312 17:08:18.334887 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62f66fc0-329e-4a90-92d8-2af29474e66c-config" (OuterVolumeSpecName: "config") pod "62f66fc0-329e-4a90-92d8-2af29474e66c" (UID: "62f66fc0-329e-4a90-92d8-2af29474e66c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 17:08:18 crc kubenswrapper[5184]: I0312 17:08:18.342213 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62f66fc0-329e-4a90-92d8-2af29474e66c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "62f66fc0-329e-4a90-92d8-2af29474e66c" (UID: "62f66fc0-329e-4a90-92d8-2af29474e66c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 17:08:18 crc kubenswrapper[5184]: I0312 17:08:18.346759 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-66446fcd8f-lmflm"]
Mar 12 17:08:18 crc kubenswrapper[5184]: I0312 17:08:18.366914 5184 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62f66fc0-329e-4a90-92d8-2af29474e66c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 12 17:08:18 crc kubenswrapper[5184]: I0312 17:08:18.366964 5184 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62f66fc0-329e-4a90-92d8-2af29474e66c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 12 17:08:18 crc kubenswrapper[5184]: I0312 17:08:18.366975 5184 reconciler_common.go:299] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/62f66fc0-329e-4a90-92d8-2af29474e66c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 12 17:08:18 crc kubenswrapper[5184]: I0312 17:08:18.366986 5184 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62f66fc0-329e-4a90-92d8-2af29474e66c-config\") on node \"crc\" DevicePath \"\""
Mar 12 17:08:18 crc kubenswrapper[5184]: I0312 17:08:18.366998 5184 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62f66fc0-329e-4a90-92d8-2af29474e66c-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 12 17:08:18 crc kubenswrapper[5184]: I0312 17:08:18.367007 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t858d\" (UniqueName: \"kubernetes.io/projected/62f66fc0-329e-4a90-92d8-2af29474e66c-kube-api-access-t858d\") on node \"crc\" DevicePath \"\""
Mar 12 17:08:18 crc kubenswrapper[5184]: I0312 17:08:18.485748 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65a92d9a-7cc8-4e94-969e-7f31ed20c416" path="/var/lib/kubelet/pods/65a92d9a-7cc8-4e94-969e-7f31ed20c416/volumes"
Mar 12 17:08:19 crc kubenswrapper[5184]: I0312 17:08:19.031312 5184 generic.go:358] "Generic (PLEG): container finished" podID="3521e399-e317-459a-badc-0b4695197ac0" containerID="0038de48bfb7823a498c503e67d491b8330d5ebe76c73c5b0a09920b91693006" exitCode=0
Mar 12 17:08:19 crc kubenswrapper[5184]: I0312 17:08:19.031637 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z62x6" event={"ID":"3521e399-e317-459a-badc-0b4695197ac0","Type":"ContainerDied","Data":"0038de48bfb7823a498c503e67d491b8330d5ebe76c73c5b0a09920b91693006"}
Mar 12 17:08:19 crc kubenswrapper[5184]: I0312 17:08:19.031669 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z62x6" event={"ID":"3521e399-e317-459a-badc-0b4695197ac0","Type":"ContainerStarted","Data":"60354f4c6d032518cecb293d5051b3f5a0ef260c6cb21b2b8d5cd293cc48cd76"}
Mar 12 17:08:19 crc kubenswrapper[5184]: I0312 17:08:19.038630 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c9356417-fcb6-4b97-9f16-db63e667d6e8","Type":"ContainerStarted","Data":"388bd2c7c9f9ca60bdec7bd29174a48d468c5b7ebc7672fbfab2464949c0c3ba"}
Mar 12 17:08:19 crc kubenswrapper[5184]: I0312 17:08:19.047465 5184 generic.go:358] "Generic (PLEG): container finished" podID="d3f7d154-f90e-4731-bc04-00b13b3fbfd8" containerID="a6e0291d5c44eb2683daec10e819257ab5a5da20b54a7ad474c2eabd2ca7c154" exitCode=0
Mar 12 17:08:19 crc kubenswrapper[5184]: I0312 17:08:19.047815 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dplgw" event={"ID":"d3f7d154-f90e-4731-bc04-00b13b3fbfd8","Type":"ContainerDied","Data":"a6e0291d5c44eb2683daec10e819257ab5a5da20b54a7ad474c2eabd2ca7c154"}
Mar 12 17:08:19 crc kubenswrapper[5184]: I0312 17:08:19.056950 5184 generic.go:358] "Generic (PLEG): container finished" podID="2729f29f-9520-421a-bb45-917ca4cef6fc" containerID="97ae6c7c22c9a5a5c095a703d03f1be1a16a02d9a3e6765624efce7a8a873494" exitCode=0
Mar 12 17:08:19 crc kubenswrapper[5184]: I0312 17:08:19.057518 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69f4d98b5f-9pzj8" event={"ID":"2729f29f-9520-421a-bb45-917ca4cef6fc","Type":"ContainerDied","Data":"97ae6c7c22c9a5a5c095a703d03f1be1a16a02d9a3e6765624efce7a8a873494"}
Mar 12 17:08:19 crc kubenswrapper[5184]: I0312 17:08:19.061549 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66446fcd8f-lmflm" event={"ID":"af22a969-32c4-4628-8667-be2162f7d92d","Type":"ContainerStarted","Data":"90fcc5f8ddc8521d0271d96fe5e07fead5353acba9275fd409c0d2bc4829103e"}
Mar 12 17:08:19 crc kubenswrapper[5184]: I0312 17:08:19.061762 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-657854d787-pks8w"
Mar 12 17:08:19 crc kubenswrapper[5184]: I0312 17:08:19.139726 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-657854d787-pks8w"]
Mar 12 17:08:19 crc kubenswrapper[5184]: I0312 17:08:19.148426 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-657854d787-pks8w"]
Mar 12 17:08:20 crc kubenswrapper[5184]: I0312 17:08:20.077412 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c9356417-fcb6-4b97-9f16-db63e667d6e8","Type":"ContainerStarted","Data":"ffe630a7a8a312c739476d9d0a0f4304e1ccee911f46d1c7407ac8521a80f736"}
Mar 12 17:08:20 crc kubenswrapper[5184]: I0312 17:08:20.082631 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69f4d98b5f-9pzj8" event={"ID":"2729f29f-9520-421a-bb45-917ca4cef6fc","Type":"ContainerStarted","Data":"ebc1e03ddf55b0d2a0149b1d2ba38453c0e971640228be24b3909b7513f29acb"}
Mar 12 17:08:20 crc kubenswrapper[5184]: I0312 17:08:20.082779 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/dnsmasq-dns-69f4d98b5f-9pzj8"
Mar 12 17:08:20 crc kubenswrapper[5184]: I0312 17:08:20.086857 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"88e09674-3ed0-4d73-bf8a-18fb1990c892","Type":"ContainerStarted","Data":"402c80b7ce4447bba77c99bd9396e8d000b4925ab7347e51387d1b1717e3ede8"}
Mar 12 17:08:20 crc kubenswrapper[5184]: I0312 17:08:20.104570 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-69f4d98b5f-9pzj8" podStartSLOduration=6.104555396 podStartE2EDuration="6.104555396s" podCreationTimestamp="2026-03-12 17:08:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:08:20.099528629 +0000 UTC m=+1042.640839958" watchObservedRunningTime="2026-03-12 17:08:20.104555396 +0000 UTC m=+1042.645866735"
Mar 12 17:08:20 crc kubenswrapper[5184]: I0312 17:08:20.414662 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62f66fc0-329e-4a90-92d8-2af29474e66c" path="/var/lib/kubelet/pods/62f66fc0-329e-4a90-92d8-2af29474e66c/volumes"
Mar 12 17:08:21 crc kubenswrapper[5184]: I0312 17:08:21.121720 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"88e09674-3ed0-4d73-bf8a-18fb1990c892","Type":"ContainerStarted","Data":"d6058a07576db84f11725468049f9596bd9d8583715c0ed9167d4d2834d6e46e"}
Mar 12 17:08:21 crc kubenswrapper[5184]: I0312 17:08:21.122353 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="88e09674-3ed0-4d73-bf8a-18fb1990c892" containerName="glance-log" containerID="cri-o://402c80b7ce4447bba77c99bd9396e8d000b4925ab7347e51387d1b1717e3ede8" gracePeriod=30
Mar 12 17:08:21 crc kubenswrapper[5184]: I0312 17:08:21.122393 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="88e09674-3ed0-4d73-bf8a-18fb1990c892" containerName="glance-httpd" containerID="cri-o://d6058a07576db84f11725468049f9596bd9d8583715c0ed9167d4d2834d6e46e" gracePeriod=30
Mar 12 17:08:21 crc kubenswrapper[5184]: I0312 17:08:21.127367 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z62x6" event={"ID":"3521e399-e317-459a-badc-0b4695197ac0","Type":"ContainerStarted","Data":"450b23bd19acd57886378b3be7292a505dff6a6c249fdfd87c698a12f81b41da"}
Mar 12 17:08:21 crc kubenswrapper[5184]: I0312 17:08:21.134434 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c9356417-fcb6-4b97-9f16-db63e667d6e8","Type":"ContainerStarted","Data":"a949303e4554d034c5691598c489ab9d590075e25c95269468a8ef22ef860f60"}
Mar 12 17:08:21 crc kubenswrapper[5184]: I0312 17:08:21.134504 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c9356417-fcb6-4b97-9f16-db63e667d6e8" containerName="glance-log" containerID="cri-o://ffe630a7a8a312c739476d9d0a0f4304e1ccee911f46d1c7407ac8521a80f736" gracePeriod=30
Mar 12 17:08:21 crc kubenswrapper[5184]: I0312 17:08:21.134534 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c9356417-fcb6-4b97-9f16-db63e667d6e8" containerName="glance-httpd" containerID="cri-o://a949303e4554d034c5691598c489ab9d590075e25c95269468a8ef22ef860f60" gracePeriod=30
Mar 12 17:08:21 crc kubenswrapper[5184]: I0312 17:08:21.142622 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dplgw" event={"ID":"d3f7d154-f90e-4731-bc04-00b13b3fbfd8","Type":"ContainerStarted","Data":"9aa3cfb372ea4933cb3cc159b27754a0f79722067cdfef5e42fed14c31a0ded5"}
Mar 12 17:08:21 crc kubenswrapper[5184]: I0312 17:08:21.159219 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.159197559 podStartE2EDuration="7.159197559s" podCreationTimestamp="2026-03-12 17:08:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:08:21.149531686 +0000 UTC m=+1043.690843035" watchObservedRunningTime="2026-03-12 17:08:21.159197559 +0000 UTC m=+1043.700508898"
Mar 12 17:08:21 crc kubenswrapper[5184]: I0312 17:08:21.217269 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.217257062 podStartE2EDuration="7.217257062s" podCreationTimestamp="2026-03-12 17:08:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:08:21.215458866 +0000 UTC m=+1043.756770205" watchObservedRunningTime="2026-03-12 17:08:21.217257062 +0000 UTC m=+1043.758568401"
Mar 12 17:08:22 crc kubenswrapper[5184]: I0312 17:08:22.150602 5184 generic.go:358] "Generic (PLEG): container finished" podID="0a764aff-f5dc-49da-a606-84cbabca2db3" containerID="2b059a0618396fd4c44a921ac9e8a2ef55e86f0811d130bfc354b1031772e615" exitCode=0
Mar 12 17:08:22 crc kubenswrapper[5184]: I0312 17:08:22.150731 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5bzsl" event={"ID":"0a764aff-f5dc-49da-a606-84cbabca2db3","Type":"ContainerDied","Data":"2b059a0618396fd4c44a921ac9e8a2ef55e86f0811d130bfc354b1031772e615"}
Mar 12 17:08:22 crc kubenswrapper[5184]: I0312 17:08:22.155271 5184 generic.go:358] "Generic (PLEG): container finished" podID="88e09674-3ed0-4d73-bf8a-18fb1990c892" containerID="d6058a07576db84f11725468049f9596bd9d8583715c0ed9167d4d2834d6e46e" exitCode=0
Mar 12 17:08:22 crc kubenswrapper[5184]: I0312 17:08:22.155290 5184 generic.go:358] "Generic (PLEG): container finished" podID="88e09674-3ed0-4d73-bf8a-18fb1990c892" containerID="402c80b7ce4447bba77c99bd9396e8d000b4925ab7347e51387d1b1717e3ede8" exitCode=143
Mar 12 17:08:22 crc kubenswrapper[5184]: I0312 17:08:22.155413 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"88e09674-3ed0-4d73-bf8a-18fb1990c892","Type":"ContainerDied","Data":"d6058a07576db84f11725468049f9596bd9d8583715c0ed9167d4d2834d6e46e"}
Mar 12 17:08:22 crc kubenswrapper[5184]: I0312 17:08:22.155479 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"88e09674-3ed0-4d73-bf8a-18fb1990c892","Type":"ContainerDied","Data":"402c80b7ce4447bba77c99bd9396e8d000b4925ab7347e51387d1b1717e3ede8"}
Mar 12 17:08:22 crc kubenswrapper[5184]: I0312 17:08:22.157610 5184 generic.go:358] "Generic (PLEG): container finished" podID="3521e399-e317-459a-badc-0b4695197ac0" containerID="450b23bd19acd57886378b3be7292a505dff6a6c249fdfd87c698a12f81b41da" exitCode=0
Mar 12 17:08:22 crc kubenswrapper[5184]: I0312 17:08:22.157824 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z62x6" event={"ID":"3521e399-e317-459a-badc-0b4695197ac0","Type":"ContainerDied","Data":"450b23bd19acd57886378b3be7292a505dff6a6c249fdfd87c698a12f81b41da"}
Mar 12 17:08:22 crc kubenswrapper[5184]: I0312 17:08:22.163513 5184 generic.go:358] "Generic (PLEG): container finished" podID="c9356417-fcb6-4b97-9f16-db63e667d6e8" containerID="a949303e4554d034c5691598c489ab9d590075e25c95269468a8ef22ef860f60" exitCode=0
Mar 12 17:08:22 crc kubenswrapper[5184]: I0312 17:08:22.163541 5184 generic.go:358] "Generic (PLEG): container finished" podID="c9356417-fcb6-4b97-9f16-db63e667d6e8" containerID="ffe630a7a8a312c739476d9d0a0f4304e1ccee911f46d1c7407ac8521a80f736" exitCode=143
Mar 12 17:08:22 crc kubenswrapper[5184]: I0312 17:08:22.163578 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c9356417-fcb6-4b97-9f16-db63e667d6e8","Type":"ContainerDied","Data":"a949303e4554d034c5691598c489ab9d590075e25c95269468a8ef22ef860f60"}
Mar 12 17:08:22 crc kubenswrapper[5184]: I0312 17:08:22.163640 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c9356417-fcb6-4b97-9f16-db63e667d6e8","Type":"ContainerDied","Data":"ffe630a7a8a312c739476d9d0a0f4304e1ccee911f46d1c7407ac8521a80f736"}
Mar 12 17:08:22 crc kubenswrapper[5184]: I0312 17:08:22.173086 5184 generic.go:358] "Generic (PLEG): container finished" podID="d3f7d154-f90e-4731-bc04-00b13b3fbfd8" containerID="9aa3cfb372ea4933cb3cc159b27754a0f79722067cdfef5e42fed14c31a0ded5" exitCode=0
Mar 12 17:08:22 crc kubenswrapper[5184]: I0312 17:08:22.173191 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dplgw" event={"ID":"d3f7d154-f90e-4731-bc04-00b13b3fbfd8","Type":"ContainerDied","Data":"9aa3cfb372ea4933cb3cc159b27754a0f79722067cdfef5e42fed14c31a0ded5"}
Mar 12 17:08:24 crc kubenswrapper[5184]: I0312 17:08:24.040590 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/horizon-85fc5cfbb9-trxq5"]
Mar 12 17:08:24 crc kubenswrapper[5184]: I0312 17:08:24.071485 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/horizon-859ddbd78-2m2xk"]
Mar 12 17:08:24 crc kubenswrapper[5184]: I0312 17:08:24.072557 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="62f66fc0-329e-4a90-92d8-2af29474e66c" containerName="init"
Mar 12 17:08:24 crc kubenswrapper[5184]: I0312 17:08:24.072571 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="62f66fc0-329e-4a90-92d8-2af29474e66c" containerName="init"
Mar 12 17:08:24 crc kubenswrapper[5184]: I0312 17:08:24.072736 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="62f66fc0-329e-4a90-92d8-2af29474e66c" containerName="init"
Mar 12 17:08:25 crc kubenswrapper[5184]: I0312 17:08:25.490487 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-859ddbd78-2m2xk"]
Mar 12 17:08:25 crc kubenswrapper[5184]: I0312 17:08:25.490784 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-859ddbd78-2m2xk"
Mar 12 17:08:25 crc kubenswrapper[5184]: I0312 17:08:25.493619 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-horizon-svc\""
Mar 12 17:08:25 crc kubenswrapper[5184]: I0312 17:08:25.514025 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/horizon-66446fcd8f-lmflm"]
Mar 12 17:08:25 crc kubenswrapper[5184]: I0312 17:08:25.514077 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/horizon-7cd5c99b94-hgvbf"]
Mar 12 17:08:25 crc kubenswrapper[5184]: I0312 17:08:25.629527 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccf562d2-6ce1-4eb6-b27e-679493ce3870-combined-ca-bundle\") pod \"horizon-859ddbd78-2m2xk\" (UID: \"ccf562d2-6ce1-4eb6-b27e-679493ce3870\") " pod="openstack/horizon-859ddbd78-2m2xk"
Mar 12 17:08:25 crc kubenswrapper[5184]: I0312 17:08:25.629774 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccf562d2-6ce1-4eb6-b27e-679493ce3870-logs\") pod \"horizon-859ddbd78-2m2xk\" (UID: \"ccf562d2-6ce1-4eb6-b27e-679493ce3870\") " pod="openstack/horizon-859ddbd78-2m2xk"
Mar 12 17:08:25 crc kubenswrapper[5184]: I0312 17:08:25.629830 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccf562d2-6ce1-4eb6-b27e-679493ce3870-horizon-tls-certs\") pod \"horizon-859ddbd78-2m2xk\" (UID:
\"ccf562d2-6ce1-4eb6-b27e-679493ce3870\") " pod="openstack/horizon-859ddbd78-2m2xk" Mar 12 17:08:25 crc kubenswrapper[5184]: I0312 17:08:25.629898 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ccf562d2-6ce1-4eb6-b27e-679493ce3870-config-data\") pod \"horizon-859ddbd78-2m2xk\" (UID: \"ccf562d2-6ce1-4eb6-b27e-679493ce3870\") " pod="openstack/horizon-859ddbd78-2m2xk" Mar 12 17:08:25 crc kubenswrapper[5184]: I0312 17:08:25.629971 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ccf562d2-6ce1-4eb6-b27e-679493ce3870-scripts\") pod \"horizon-859ddbd78-2m2xk\" (UID: \"ccf562d2-6ce1-4eb6-b27e-679493ce3870\") " pod="openstack/horizon-859ddbd78-2m2xk" Mar 12 17:08:25 crc kubenswrapper[5184]: I0312 17:08:25.630000 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ccf562d2-6ce1-4eb6-b27e-679493ce3870-horizon-secret-key\") pod \"horizon-859ddbd78-2m2xk\" (UID: \"ccf562d2-6ce1-4eb6-b27e-679493ce3870\") " pod="openstack/horizon-859ddbd78-2m2xk" Mar 12 17:08:25 crc kubenswrapper[5184]: I0312 17:08:25.630029 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t49p\" (UniqueName: \"kubernetes.io/projected/ccf562d2-6ce1-4eb6-b27e-679493ce3870-kube-api-access-7t49p\") pod \"horizon-859ddbd78-2m2xk\" (UID: \"ccf562d2-6ce1-4eb6-b27e-679493ce3870\") " pod="openstack/horizon-859ddbd78-2m2xk" Mar 12 17:08:25 crc kubenswrapper[5184]: I0312 17:08:25.731751 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccf562d2-6ce1-4eb6-b27e-679493ce3870-combined-ca-bundle\") pod \"horizon-859ddbd78-2m2xk\" (UID: 
\"ccf562d2-6ce1-4eb6-b27e-679493ce3870\") " pod="openstack/horizon-859ddbd78-2m2xk" Mar 12 17:08:25 crc kubenswrapper[5184]: I0312 17:08:25.731833 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccf562d2-6ce1-4eb6-b27e-679493ce3870-logs\") pod \"horizon-859ddbd78-2m2xk\" (UID: \"ccf562d2-6ce1-4eb6-b27e-679493ce3870\") " pod="openstack/horizon-859ddbd78-2m2xk" Mar 12 17:08:25 crc kubenswrapper[5184]: I0312 17:08:25.731857 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccf562d2-6ce1-4eb6-b27e-679493ce3870-horizon-tls-certs\") pod \"horizon-859ddbd78-2m2xk\" (UID: \"ccf562d2-6ce1-4eb6-b27e-679493ce3870\") " pod="openstack/horizon-859ddbd78-2m2xk" Mar 12 17:08:25 crc kubenswrapper[5184]: I0312 17:08:25.731889 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ccf562d2-6ce1-4eb6-b27e-679493ce3870-config-data\") pod \"horizon-859ddbd78-2m2xk\" (UID: \"ccf562d2-6ce1-4eb6-b27e-679493ce3870\") " pod="openstack/horizon-859ddbd78-2m2xk" Mar 12 17:08:25 crc kubenswrapper[5184]: I0312 17:08:25.731926 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ccf562d2-6ce1-4eb6-b27e-679493ce3870-scripts\") pod \"horizon-859ddbd78-2m2xk\" (UID: \"ccf562d2-6ce1-4eb6-b27e-679493ce3870\") " pod="openstack/horizon-859ddbd78-2m2xk" Mar 12 17:08:25 crc kubenswrapper[5184]: I0312 17:08:25.731946 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ccf562d2-6ce1-4eb6-b27e-679493ce3870-horizon-secret-key\") pod \"horizon-859ddbd78-2m2xk\" (UID: \"ccf562d2-6ce1-4eb6-b27e-679493ce3870\") " pod="openstack/horizon-859ddbd78-2m2xk" Mar 12 17:08:25 crc kubenswrapper[5184]: I0312 
17:08:25.731962 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7t49p\" (UniqueName: \"kubernetes.io/projected/ccf562d2-6ce1-4eb6-b27e-679493ce3870-kube-api-access-7t49p\") pod \"horizon-859ddbd78-2m2xk\" (UID: \"ccf562d2-6ce1-4eb6-b27e-679493ce3870\") " pod="openstack/horizon-859ddbd78-2m2xk" Mar 12 17:08:25 crc kubenswrapper[5184]: I0312 17:08:25.733409 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ccf562d2-6ce1-4eb6-b27e-679493ce3870-scripts\") pod \"horizon-859ddbd78-2m2xk\" (UID: \"ccf562d2-6ce1-4eb6-b27e-679493ce3870\") " pod="openstack/horizon-859ddbd78-2m2xk" Mar 12 17:08:25 crc kubenswrapper[5184]: I0312 17:08:25.733466 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ccf562d2-6ce1-4eb6-b27e-679493ce3870-config-data\") pod \"horizon-859ddbd78-2m2xk\" (UID: \"ccf562d2-6ce1-4eb6-b27e-679493ce3870\") " pod="openstack/horizon-859ddbd78-2m2xk" Mar 12 17:08:25 crc kubenswrapper[5184]: I0312 17:08:25.733700 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccf562d2-6ce1-4eb6-b27e-679493ce3870-logs\") pod \"horizon-859ddbd78-2m2xk\" (UID: \"ccf562d2-6ce1-4eb6-b27e-679493ce3870\") " pod="openstack/horizon-859ddbd78-2m2xk" Mar 12 17:08:25 crc kubenswrapper[5184]: I0312 17:08:25.746925 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccf562d2-6ce1-4eb6-b27e-679493ce3870-horizon-tls-certs\") pod \"horizon-859ddbd78-2m2xk\" (UID: \"ccf562d2-6ce1-4eb6-b27e-679493ce3870\") " pod="openstack/horizon-859ddbd78-2m2xk" Mar 12 17:08:25 crc kubenswrapper[5184]: I0312 17:08:25.748497 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ccf562d2-6ce1-4eb6-b27e-679493ce3870-combined-ca-bundle\") pod \"horizon-859ddbd78-2m2xk\" (UID: \"ccf562d2-6ce1-4eb6-b27e-679493ce3870\") " pod="openstack/horizon-859ddbd78-2m2xk" Mar 12 17:08:25 crc kubenswrapper[5184]: I0312 17:08:25.750831 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ccf562d2-6ce1-4eb6-b27e-679493ce3870-horizon-secret-key\") pod \"horizon-859ddbd78-2m2xk\" (UID: \"ccf562d2-6ce1-4eb6-b27e-679493ce3870\") " pod="openstack/horizon-859ddbd78-2m2xk" Mar 12 17:08:25 crc kubenswrapper[5184]: I0312 17:08:25.752007 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t49p\" (UniqueName: \"kubernetes.io/projected/ccf562d2-6ce1-4eb6-b27e-679493ce3870-kube-api-access-7t49p\") pod \"horizon-859ddbd78-2m2xk\" (UID: \"ccf562d2-6ce1-4eb6-b27e-679493ce3870\") " pod="openstack/horizon-859ddbd78-2m2xk" Mar 12 17:08:25 crc kubenswrapper[5184]: I0312 17:08:25.807953 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-859ddbd78-2m2xk" Mar 12 17:08:27 crc kubenswrapper[5184]: I0312 17:08:27.588825 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7cd5c99b94-hgvbf" Mar 12 17:08:27 crc kubenswrapper[5184]: I0312 17:08:27.604273 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-69f4d98b5f-9pzj8" Mar 12 17:08:27 crc kubenswrapper[5184]: I0312 17:08:27.604322 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7cd5c99b94-hgvbf"] Mar 12 17:08:27 crc kubenswrapper[5184]: I0312 17:08:27.673496 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a97bd24c-a292-45a7-af77-526fb65b807d-combined-ca-bundle\") pod \"horizon-7cd5c99b94-hgvbf\" (UID: \"a97bd24c-a292-45a7-af77-526fb65b807d\") " pod="openstack/horizon-7cd5c99b94-hgvbf" Mar 12 17:08:27 crc kubenswrapper[5184]: I0312 17:08:27.674058 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a97bd24c-a292-45a7-af77-526fb65b807d-horizon-secret-key\") pod \"horizon-7cd5c99b94-hgvbf\" (UID: \"a97bd24c-a292-45a7-af77-526fb65b807d\") " pod="openstack/horizon-7cd5c99b94-hgvbf" Mar 12 17:08:27 crc kubenswrapper[5184]: I0312 17:08:27.675318 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a97bd24c-a292-45a7-af77-526fb65b807d-config-data\") pod \"horizon-7cd5c99b94-hgvbf\" (UID: \"a97bd24c-a292-45a7-af77-526fb65b807d\") " pod="openstack/horizon-7cd5c99b94-hgvbf" Mar 12 17:08:27 crc kubenswrapper[5184]: I0312 17:08:27.693160 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a97bd24c-a292-45a7-af77-526fb65b807d-scripts\") pod \"horizon-7cd5c99b94-hgvbf\" (UID: \"a97bd24c-a292-45a7-af77-526fb65b807d\") " pod="openstack/horizon-7cd5c99b94-hgvbf" 
Mar 12 17:08:27 crc kubenswrapper[5184]: I0312 17:08:27.693217 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjqsf\" (UniqueName: \"kubernetes.io/projected/a97bd24c-a292-45a7-af77-526fb65b807d-kube-api-access-mjqsf\") pod \"horizon-7cd5c99b94-hgvbf\" (UID: \"a97bd24c-a292-45a7-af77-526fb65b807d\") " pod="openstack/horizon-7cd5c99b94-hgvbf" Mar 12 17:08:27 crc kubenswrapper[5184]: I0312 17:08:27.693514 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a97bd24c-a292-45a7-af77-526fb65b807d-horizon-tls-certs\") pod \"horizon-7cd5c99b94-hgvbf\" (UID: \"a97bd24c-a292-45a7-af77-526fb65b807d\") " pod="openstack/horizon-7cd5c99b94-hgvbf" Mar 12 17:08:27 crc kubenswrapper[5184]: I0312 17:08:27.693563 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a97bd24c-a292-45a7-af77-526fb65b807d-logs\") pod \"horizon-7cd5c99b94-hgvbf\" (UID: \"a97bd24c-a292-45a7-af77-526fb65b807d\") " pod="openstack/horizon-7cd5c99b94-hgvbf" Mar 12 17:08:27 crc kubenswrapper[5184]: I0312 17:08:27.697205 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6565fc964f-vn8ss"] Mar 12 17:08:27 crc kubenswrapper[5184]: I0312 17:08:27.698253 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6565fc964f-vn8ss" podUID="6be46ed7-a6b6-4b6e-9934-3540b1867032" containerName="dnsmasq-dns" containerID="cri-o://83b2b1df399f5f8cf5a808c8135835d11fe2ce766d9e130f13711dea7a917a36" gracePeriod=10 Mar 12 17:08:27 crc kubenswrapper[5184]: I0312 17:08:27.795494 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a97bd24c-a292-45a7-af77-526fb65b807d-config-data\") pod 
\"horizon-7cd5c99b94-hgvbf\" (UID: \"a97bd24c-a292-45a7-af77-526fb65b807d\") " pod="openstack/horizon-7cd5c99b94-hgvbf" Mar 12 17:08:27 crc kubenswrapper[5184]: I0312 17:08:27.795552 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a97bd24c-a292-45a7-af77-526fb65b807d-scripts\") pod \"horizon-7cd5c99b94-hgvbf\" (UID: \"a97bd24c-a292-45a7-af77-526fb65b807d\") " pod="openstack/horizon-7cd5c99b94-hgvbf" Mar 12 17:08:27 crc kubenswrapper[5184]: I0312 17:08:27.796629 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mjqsf\" (UniqueName: \"kubernetes.io/projected/a97bd24c-a292-45a7-af77-526fb65b807d-kube-api-access-mjqsf\") pod \"horizon-7cd5c99b94-hgvbf\" (UID: \"a97bd24c-a292-45a7-af77-526fb65b807d\") " pod="openstack/horizon-7cd5c99b94-hgvbf" Mar 12 17:08:27 crc kubenswrapper[5184]: I0312 17:08:27.796718 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a97bd24c-a292-45a7-af77-526fb65b807d-scripts\") pod \"horizon-7cd5c99b94-hgvbf\" (UID: \"a97bd24c-a292-45a7-af77-526fb65b807d\") " pod="openstack/horizon-7cd5c99b94-hgvbf" Mar 12 17:08:27 crc kubenswrapper[5184]: I0312 17:08:27.796821 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a97bd24c-a292-45a7-af77-526fb65b807d-horizon-tls-certs\") pod \"horizon-7cd5c99b94-hgvbf\" (UID: \"a97bd24c-a292-45a7-af77-526fb65b807d\") " pod="openstack/horizon-7cd5c99b94-hgvbf" Mar 12 17:08:27 crc kubenswrapper[5184]: I0312 17:08:27.796850 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a97bd24c-a292-45a7-af77-526fb65b807d-logs\") pod \"horizon-7cd5c99b94-hgvbf\" (UID: \"a97bd24c-a292-45a7-af77-526fb65b807d\") " pod="openstack/horizon-7cd5c99b94-hgvbf" Mar 12 
17:08:27 crc kubenswrapper[5184]: I0312 17:08:27.796891 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a97bd24c-a292-45a7-af77-526fb65b807d-combined-ca-bundle\") pod \"horizon-7cd5c99b94-hgvbf\" (UID: \"a97bd24c-a292-45a7-af77-526fb65b807d\") " pod="openstack/horizon-7cd5c99b94-hgvbf" Mar 12 17:08:27 crc kubenswrapper[5184]: I0312 17:08:27.797022 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a97bd24c-a292-45a7-af77-526fb65b807d-horizon-secret-key\") pod \"horizon-7cd5c99b94-hgvbf\" (UID: \"a97bd24c-a292-45a7-af77-526fb65b807d\") " pod="openstack/horizon-7cd5c99b94-hgvbf" Mar 12 17:08:27 crc kubenswrapper[5184]: I0312 17:08:27.797478 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a97bd24c-a292-45a7-af77-526fb65b807d-logs\") pod \"horizon-7cd5c99b94-hgvbf\" (UID: \"a97bd24c-a292-45a7-af77-526fb65b807d\") " pod="openstack/horizon-7cd5c99b94-hgvbf" Mar 12 17:08:27 crc kubenswrapper[5184]: I0312 17:08:27.797604 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a97bd24c-a292-45a7-af77-526fb65b807d-config-data\") pod \"horizon-7cd5c99b94-hgvbf\" (UID: \"a97bd24c-a292-45a7-af77-526fb65b807d\") " pod="openstack/horizon-7cd5c99b94-hgvbf" Mar 12 17:08:27 crc kubenswrapper[5184]: I0312 17:08:27.803920 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a97bd24c-a292-45a7-af77-526fb65b807d-horizon-tls-certs\") pod \"horizon-7cd5c99b94-hgvbf\" (UID: \"a97bd24c-a292-45a7-af77-526fb65b807d\") " pod="openstack/horizon-7cd5c99b94-hgvbf" Mar 12 17:08:27 crc kubenswrapper[5184]: I0312 17:08:27.806168 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a97bd24c-a292-45a7-af77-526fb65b807d-horizon-secret-key\") pod \"horizon-7cd5c99b94-hgvbf\" (UID: \"a97bd24c-a292-45a7-af77-526fb65b807d\") " pod="openstack/horizon-7cd5c99b94-hgvbf" Mar 12 17:08:27 crc kubenswrapper[5184]: I0312 17:08:27.806502 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a97bd24c-a292-45a7-af77-526fb65b807d-combined-ca-bundle\") pod \"horizon-7cd5c99b94-hgvbf\" (UID: \"a97bd24c-a292-45a7-af77-526fb65b807d\") " pod="openstack/horizon-7cd5c99b94-hgvbf" Mar 12 17:08:27 crc kubenswrapper[5184]: I0312 17:08:27.820579 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjqsf\" (UniqueName: \"kubernetes.io/projected/a97bd24c-a292-45a7-af77-526fb65b807d-kube-api-access-mjqsf\") pod \"horizon-7cd5c99b94-hgvbf\" (UID: \"a97bd24c-a292-45a7-af77-526fb65b807d\") " pod="openstack/horizon-7cd5c99b94-hgvbf" Mar 12 17:08:27 crc kubenswrapper[5184]: I0312 17:08:27.840624 5184 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6565fc964f-vn8ss" podUID="6be46ed7-a6b6-4b6e-9934-3540b1867032" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: connect: connection refused" Mar 12 17:08:27 crc kubenswrapper[5184]: I0312 17:08:27.922689 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7cd5c99b94-hgvbf" Mar 12 17:08:28 crc kubenswrapper[5184]: I0312 17:08:28.240052 5184 generic.go:358] "Generic (PLEG): container finished" podID="6be46ed7-a6b6-4b6e-9934-3540b1867032" containerID="83b2b1df399f5f8cf5a808c8135835d11fe2ce766d9e130f13711dea7a917a36" exitCode=0 Mar 12 17:08:28 crc kubenswrapper[5184]: I0312 17:08:28.240116 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6565fc964f-vn8ss" event={"ID":"6be46ed7-a6b6-4b6e-9934-3540b1867032","Type":"ContainerDied","Data":"83b2b1df399f5f8cf5a808c8135835d11fe2ce766d9e130f13711dea7a917a36"} Mar 12 17:08:31 crc kubenswrapper[5184]: I0312 17:08:31.109291 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5bzsl" Mar 12 17:08:31 crc kubenswrapper[5184]: I0312 17:08:31.163760 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a764aff-f5dc-49da-a606-84cbabca2db3-combined-ca-bundle\") pod \"0a764aff-f5dc-49da-a606-84cbabca2db3\" (UID: \"0a764aff-f5dc-49da-a606-84cbabca2db3\") " Mar 12 17:08:31 crc kubenswrapper[5184]: I0312 17:08:31.163863 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0a764aff-f5dc-49da-a606-84cbabca2db3-credential-keys\") pod \"0a764aff-f5dc-49da-a606-84cbabca2db3\" (UID: \"0a764aff-f5dc-49da-a606-84cbabca2db3\") " Mar 12 17:08:31 crc kubenswrapper[5184]: I0312 17:08:31.163911 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bx775\" (UniqueName: \"kubernetes.io/projected/0a764aff-f5dc-49da-a606-84cbabca2db3-kube-api-access-bx775\") pod \"0a764aff-f5dc-49da-a606-84cbabca2db3\" (UID: \"0a764aff-f5dc-49da-a606-84cbabca2db3\") " Mar 12 17:08:31 crc kubenswrapper[5184]: I0312 17:08:31.164833 5184 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a764aff-f5dc-49da-a606-84cbabca2db3-config-data\") pod \"0a764aff-f5dc-49da-a606-84cbabca2db3\" (UID: \"0a764aff-f5dc-49da-a606-84cbabca2db3\") " Mar 12 17:08:31 crc kubenswrapper[5184]: I0312 17:08:31.164923 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a764aff-f5dc-49da-a606-84cbabca2db3-scripts\") pod \"0a764aff-f5dc-49da-a606-84cbabca2db3\" (UID: \"0a764aff-f5dc-49da-a606-84cbabca2db3\") " Mar 12 17:08:31 crc kubenswrapper[5184]: I0312 17:08:31.165185 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0a764aff-f5dc-49da-a606-84cbabca2db3-fernet-keys\") pod \"0a764aff-f5dc-49da-a606-84cbabca2db3\" (UID: \"0a764aff-f5dc-49da-a606-84cbabca2db3\") " Mar 12 17:08:31 crc kubenswrapper[5184]: I0312 17:08:31.176624 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a764aff-f5dc-49da-a606-84cbabca2db3-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "0a764aff-f5dc-49da-a606-84cbabca2db3" (UID: "0a764aff-f5dc-49da-a606-84cbabca2db3"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:08:31 crc kubenswrapper[5184]: I0312 17:08:31.177148 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a764aff-f5dc-49da-a606-84cbabca2db3-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0a764aff-f5dc-49da-a606-84cbabca2db3" (UID: "0a764aff-f5dc-49da-a606-84cbabca2db3"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:08:31 crc kubenswrapper[5184]: I0312 17:08:31.180680 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a764aff-f5dc-49da-a606-84cbabca2db3-kube-api-access-bx775" (OuterVolumeSpecName: "kube-api-access-bx775") pod "0a764aff-f5dc-49da-a606-84cbabca2db3" (UID: "0a764aff-f5dc-49da-a606-84cbabca2db3"). InnerVolumeSpecName "kube-api-access-bx775". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:08:31 crc kubenswrapper[5184]: I0312 17:08:31.180866 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a764aff-f5dc-49da-a606-84cbabca2db3-scripts" (OuterVolumeSpecName: "scripts") pod "0a764aff-f5dc-49da-a606-84cbabca2db3" (UID: "0a764aff-f5dc-49da-a606-84cbabca2db3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:08:31 crc kubenswrapper[5184]: I0312 17:08:31.212557 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a764aff-f5dc-49da-a606-84cbabca2db3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a764aff-f5dc-49da-a606-84cbabca2db3" (UID: "0a764aff-f5dc-49da-a606-84cbabca2db3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:08:31 crc kubenswrapper[5184]: I0312 17:08:31.216018 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a764aff-f5dc-49da-a606-84cbabca2db3-config-data" (OuterVolumeSpecName: "config-data") pod "0a764aff-f5dc-49da-a606-84cbabca2db3" (UID: "0a764aff-f5dc-49da-a606-84cbabca2db3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:08:31 crc kubenswrapper[5184]: I0312 17:08:31.267216 5184 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a764aff-f5dc-49da-a606-84cbabca2db3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:31 crc kubenswrapper[5184]: I0312 17:08:31.267245 5184 reconciler_common.go:299] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0a764aff-f5dc-49da-a606-84cbabca2db3-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:31 crc kubenswrapper[5184]: I0312 17:08:31.267255 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bx775\" (UniqueName: \"kubernetes.io/projected/0a764aff-f5dc-49da-a606-84cbabca2db3-kube-api-access-bx775\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:31 crc kubenswrapper[5184]: I0312 17:08:31.267267 5184 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a764aff-f5dc-49da-a606-84cbabca2db3-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:31 crc kubenswrapper[5184]: I0312 17:08:31.267274 5184 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a764aff-f5dc-49da-a606-84cbabca2db3-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:31 crc kubenswrapper[5184]: I0312 17:08:31.267282 5184 reconciler_common.go:299] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0a764aff-f5dc-49da-a606-84cbabca2db3-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:31 crc kubenswrapper[5184]: I0312 17:08:31.268031 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5bzsl" event={"ID":"0a764aff-f5dc-49da-a606-84cbabca2db3","Type":"ContainerDied","Data":"815ab43b87cabe34faa30eeb7ed5b4ef343c11259f5ba37b82958a2a7682297b"} Mar 12 17:08:31 crc kubenswrapper[5184]: I0312 
17:08:31.268068 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="815ab43b87cabe34faa30eeb7ed5b4ef343c11259f5ba37b82958a2a7682297b" Mar 12 17:08:31 crc kubenswrapper[5184]: I0312 17:08:31.268132 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5bzsl" Mar 12 17:08:32 crc kubenswrapper[5184]: I0312 17:08:32.298273 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-5bzsl"] Mar 12 17:08:32 crc kubenswrapper[5184]: I0312 17:08:32.307287 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-5bzsl"] Mar 12 17:08:32 crc kubenswrapper[5184]: I0312 17:08:32.390967 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-9n4jh"] Mar 12 17:08:32 crc kubenswrapper[5184]: I0312 17:08:32.392106 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a764aff-f5dc-49da-a606-84cbabca2db3" containerName="keystone-bootstrap" Mar 12 17:08:32 crc kubenswrapper[5184]: I0312 17:08:32.392120 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a764aff-f5dc-49da-a606-84cbabca2db3" containerName="keystone-bootstrap" Mar 12 17:08:32 crc kubenswrapper[5184]: I0312 17:08:32.392336 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a764aff-f5dc-49da-a606-84cbabca2db3" containerName="keystone-bootstrap" Mar 12 17:08:32 crc kubenswrapper[5184]: I0312 17:08:32.436091 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-9n4jh" Mar 12 17:08:32 crc kubenswrapper[5184]: I0312 17:08:32.438839 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"osp-secret\"" Mar 12 17:08:32 crc kubenswrapper[5184]: I0312 17:08:32.439110 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone-keystone-dockercfg-4s8pv\"" Mar 12 17:08:32 crc kubenswrapper[5184]: I0312 17:08:32.439189 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone-scripts\"" Mar 12 17:08:32 crc kubenswrapper[5184]: I0312 17:08:32.439448 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone\"" Mar 12 17:08:32 crc kubenswrapper[5184]: I0312 17:08:32.439755 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone-config-data\"" Mar 12 17:08:32 crc kubenswrapper[5184]: I0312 17:08:32.462711 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a764aff-f5dc-49da-a606-84cbabca2db3" path="/var/lib/kubelet/pods/0a764aff-f5dc-49da-a606-84cbabca2db3/volumes" Mar 12 17:08:32 crc kubenswrapper[5184]: I0312 17:08:32.463587 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-9n4jh"] Mar 12 17:08:32 crc kubenswrapper[5184]: I0312 17:08:32.488134 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmr42\" (UniqueName: \"kubernetes.io/projected/c7d8d368-22c0-41a1-972c-d8f7c14db7b5-kube-api-access-zmr42\") pod \"keystone-bootstrap-9n4jh\" (UID: \"c7d8d368-22c0-41a1-972c-d8f7c14db7b5\") " pod="openstack/keystone-bootstrap-9n4jh" Mar 12 17:08:32 crc kubenswrapper[5184]: I0312 17:08:32.488983 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c7d8d368-22c0-41a1-972c-d8f7c14db7b5-config-data\") pod \"keystone-bootstrap-9n4jh\" (UID: \"c7d8d368-22c0-41a1-972c-d8f7c14db7b5\") " pod="openstack/keystone-bootstrap-9n4jh" Mar 12 17:08:32 crc kubenswrapper[5184]: I0312 17:08:32.489173 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c7d8d368-22c0-41a1-972c-d8f7c14db7b5-fernet-keys\") pod \"keystone-bootstrap-9n4jh\" (UID: \"c7d8d368-22c0-41a1-972c-d8f7c14db7b5\") " pod="openstack/keystone-bootstrap-9n4jh" Mar 12 17:08:32 crc kubenswrapper[5184]: I0312 17:08:32.489204 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7d8d368-22c0-41a1-972c-d8f7c14db7b5-combined-ca-bundle\") pod \"keystone-bootstrap-9n4jh\" (UID: \"c7d8d368-22c0-41a1-972c-d8f7c14db7b5\") " pod="openstack/keystone-bootstrap-9n4jh" Mar 12 17:08:32 crc kubenswrapper[5184]: I0312 17:08:32.489262 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7d8d368-22c0-41a1-972c-d8f7c14db7b5-scripts\") pod \"keystone-bootstrap-9n4jh\" (UID: \"c7d8d368-22c0-41a1-972c-d8f7c14db7b5\") " pod="openstack/keystone-bootstrap-9n4jh" Mar 12 17:08:32 crc kubenswrapper[5184]: I0312 17:08:32.489304 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c7d8d368-22c0-41a1-972c-d8f7c14db7b5-credential-keys\") pod \"keystone-bootstrap-9n4jh\" (UID: \"c7d8d368-22c0-41a1-972c-d8f7c14db7b5\") " pod="openstack/keystone-bootstrap-9n4jh" Mar 12 17:08:32 crc kubenswrapper[5184]: I0312 17:08:32.591208 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zmr42\" (UniqueName: 
\"kubernetes.io/projected/c7d8d368-22c0-41a1-972c-d8f7c14db7b5-kube-api-access-zmr42\") pod \"keystone-bootstrap-9n4jh\" (UID: \"c7d8d368-22c0-41a1-972c-d8f7c14db7b5\") " pod="openstack/keystone-bootstrap-9n4jh" Mar 12 17:08:32 crc kubenswrapper[5184]: I0312 17:08:32.591297 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7d8d368-22c0-41a1-972c-d8f7c14db7b5-config-data\") pod \"keystone-bootstrap-9n4jh\" (UID: \"c7d8d368-22c0-41a1-972c-d8f7c14db7b5\") " pod="openstack/keystone-bootstrap-9n4jh" Mar 12 17:08:32 crc kubenswrapper[5184]: I0312 17:08:32.591352 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c7d8d368-22c0-41a1-972c-d8f7c14db7b5-fernet-keys\") pod \"keystone-bootstrap-9n4jh\" (UID: \"c7d8d368-22c0-41a1-972c-d8f7c14db7b5\") " pod="openstack/keystone-bootstrap-9n4jh" Mar 12 17:08:32 crc kubenswrapper[5184]: I0312 17:08:32.591368 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7d8d368-22c0-41a1-972c-d8f7c14db7b5-combined-ca-bundle\") pod \"keystone-bootstrap-9n4jh\" (UID: \"c7d8d368-22c0-41a1-972c-d8f7c14db7b5\") " pod="openstack/keystone-bootstrap-9n4jh" Mar 12 17:08:32 crc kubenswrapper[5184]: I0312 17:08:32.592085 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7d8d368-22c0-41a1-972c-d8f7c14db7b5-scripts\") pod \"keystone-bootstrap-9n4jh\" (UID: \"c7d8d368-22c0-41a1-972c-d8f7c14db7b5\") " pod="openstack/keystone-bootstrap-9n4jh" Mar 12 17:08:32 crc kubenswrapper[5184]: I0312 17:08:32.592140 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c7d8d368-22c0-41a1-972c-d8f7c14db7b5-credential-keys\") pod \"keystone-bootstrap-9n4jh\" 
(UID: \"c7d8d368-22c0-41a1-972c-d8f7c14db7b5\") " pod="openstack/keystone-bootstrap-9n4jh" Mar 12 17:08:32 crc kubenswrapper[5184]: I0312 17:08:32.596976 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7d8d368-22c0-41a1-972c-d8f7c14db7b5-combined-ca-bundle\") pod \"keystone-bootstrap-9n4jh\" (UID: \"c7d8d368-22c0-41a1-972c-d8f7c14db7b5\") " pod="openstack/keystone-bootstrap-9n4jh" Mar 12 17:08:32 crc kubenswrapper[5184]: I0312 17:08:32.597064 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c7d8d368-22c0-41a1-972c-d8f7c14db7b5-fernet-keys\") pod \"keystone-bootstrap-9n4jh\" (UID: \"c7d8d368-22c0-41a1-972c-d8f7c14db7b5\") " pod="openstack/keystone-bootstrap-9n4jh" Mar 12 17:08:32 crc kubenswrapper[5184]: I0312 17:08:32.597349 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c7d8d368-22c0-41a1-972c-d8f7c14db7b5-credential-keys\") pod \"keystone-bootstrap-9n4jh\" (UID: \"c7d8d368-22c0-41a1-972c-d8f7c14db7b5\") " pod="openstack/keystone-bootstrap-9n4jh" Mar 12 17:08:32 crc kubenswrapper[5184]: I0312 17:08:32.599028 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7d8d368-22c0-41a1-972c-d8f7c14db7b5-config-data\") pod \"keystone-bootstrap-9n4jh\" (UID: \"c7d8d368-22c0-41a1-972c-d8f7c14db7b5\") " pod="openstack/keystone-bootstrap-9n4jh" Mar 12 17:08:32 crc kubenswrapper[5184]: I0312 17:08:32.604813 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7d8d368-22c0-41a1-972c-d8f7c14db7b5-scripts\") pod \"keystone-bootstrap-9n4jh\" (UID: \"c7d8d368-22c0-41a1-972c-d8f7c14db7b5\") " pod="openstack/keystone-bootstrap-9n4jh" Mar 12 17:08:32 crc kubenswrapper[5184]: I0312 17:08:32.608660 5184 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmr42\" (UniqueName: \"kubernetes.io/projected/c7d8d368-22c0-41a1-972c-d8f7c14db7b5-kube-api-access-zmr42\") pod \"keystone-bootstrap-9n4jh\" (UID: \"c7d8d368-22c0-41a1-972c-d8f7c14db7b5\") " pod="openstack/keystone-bootstrap-9n4jh" Mar 12 17:08:32 crc kubenswrapper[5184]: I0312 17:08:32.772719 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-9n4jh" Mar 12 17:08:37 crc kubenswrapper[5184]: I0312 17:08:37.840094 5184 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6565fc964f-vn8ss" podUID="6be46ed7-a6b6-4b6e-9934-3540b1867032" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: i/o timeout" Mar 12 17:08:38 crc kubenswrapper[5184]: I0312 17:08:38.329571 5184 generic.go:358] "Generic (PLEG): container finished" podID="59bc23c2-fe37-43e7-a1a9-2830892902bf" containerID="27fa142bf68698e87c9b60da45f68b7ca5810536c39991c80fd7de3a9b2f6aab" exitCode=0 Mar 12 17:08:38 crc kubenswrapper[5184]: I0312 17:08:38.329698 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vtzf9" event={"ID":"59bc23c2-fe37-43e7-a1a9-2830892902bf","Type":"ContainerDied","Data":"27fa142bf68698e87c9b60da45f68b7ca5810536c39991c80fd7de3a9b2f6aab"} Mar 12 17:08:42 crc kubenswrapper[5184]: I0312 17:08:42.840869 5184 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6565fc964f-vn8ss" podUID="6be46ed7-a6b6-4b6e-9934-3540b1867032" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: i/o timeout" Mar 12 17:08:42 crc kubenswrapper[5184]: I0312 17:08:42.841687 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/dnsmasq-dns-6565fc964f-vn8ss" Mar 12 17:08:43 crc kubenswrapper[5184]: I0312 17:08:43.897271 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 12 17:08:43 crc kubenswrapper[5184]: I0312 17:08:43.910791 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 12 17:08:43 crc kubenswrapper[5184]: I0312 17:08:43.912350 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6565fc964f-vn8ss" Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.015670 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6be46ed7-a6b6-4b6e-9934-3540b1867032-ovsdbserver-nb\") pod \"6be46ed7-a6b6-4b6e-9934-3540b1867032\" (UID: \"6be46ed7-a6b6-4b6e-9934-3540b1867032\") " Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.015726 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6be46ed7-a6b6-4b6e-9934-3540b1867032-ovsdbserver-sb\") pod \"6be46ed7-a6b6-4b6e-9934-3540b1867032\" (UID: \"6be46ed7-a6b6-4b6e-9934-3540b1867032\") " Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.015819 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9356417-fcb6-4b97-9f16-db63e667d6e8-combined-ca-bundle\") pod \"c9356417-fcb6-4b97-9f16-db63e667d6e8\" (UID: \"c9356417-fcb6-4b97-9f16-db63e667d6e8\") " Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.015861 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9356417-fcb6-4b97-9f16-db63e667d6e8-config-data\") pod \"c9356417-fcb6-4b97-9f16-db63e667d6e8\" (UID: \"c9356417-fcb6-4b97-9f16-db63e667d6e8\") " Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.015925 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started 
for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/88e09674-3ed0-4d73-bf8a-18fb1990c892-httpd-run\") pod \"88e09674-3ed0-4d73-bf8a-18fb1990c892\" (UID: \"88e09674-3ed0-4d73-bf8a-18fb1990c892\") " Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.015960 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"88e09674-3ed0-4d73-bf8a-18fb1990c892\" (UID: \"88e09674-3ed0-4d73-bf8a-18fb1990c892\") " Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.015986 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88e09674-3ed0-4d73-bf8a-18fb1990c892-scripts\") pod \"88e09674-3ed0-4d73-bf8a-18fb1990c892\" (UID: \"88e09674-3ed0-4d73-bf8a-18fb1990c892\") " Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.016032 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88e09674-3ed0-4d73-bf8a-18fb1990c892-internal-tls-certs\") pod \"88e09674-3ed0-4d73-bf8a-18fb1990c892\" (UID: \"88e09674-3ed0-4d73-bf8a-18fb1990c892\") " Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.016055 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6be46ed7-a6b6-4b6e-9934-3540b1867032-config\") pod \"6be46ed7-a6b6-4b6e-9934-3540b1867032\" (UID: \"6be46ed7-a6b6-4b6e-9934-3540b1867032\") " Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.016100 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88e09674-3ed0-4d73-bf8a-18fb1990c892-config-data\") pod \"88e09674-3ed0-4d73-bf8a-18fb1990c892\" (UID: \"88e09674-3ed0-4d73-bf8a-18fb1990c892\") " Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.016136 5184 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-wpqmh\" (UniqueName: \"kubernetes.io/projected/88e09674-3ed0-4d73-bf8a-18fb1990c892-kube-api-access-wpqmh\") pod \"88e09674-3ed0-4d73-bf8a-18fb1990c892\" (UID: \"88e09674-3ed0-4d73-bf8a-18fb1990c892\") " Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.016224 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c9356417-fcb6-4b97-9f16-db63e667d6e8-httpd-run\") pod \"c9356417-fcb6-4b97-9f16-db63e667d6e8\" (UID: \"c9356417-fcb6-4b97-9f16-db63e667d6e8\") " Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.016283 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88e09674-3ed0-4d73-bf8a-18fb1990c892-combined-ca-bundle\") pod \"88e09674-3ed0-4d73-bf8a-18fb1990c892\" (UID: \"88e09674-3ed0-4d73-bf8a-18fb1990c892\") " Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.016324 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9356417-fcb6-4b97-9f16-db63e667d6e8-logs\") pod \"c9356417-fcb6-4b97-9f16-db63e667d6e8\" (UID: \"c9356417-fcb6-4b97-9f16-db63e667d6e8\") " Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.016357 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9356417-fcb6-4b97-9f16-db63e667d6e8-public-tls-certs\") pod \"c9356417-fcb6-4b97-9f16-db63e667d6e8\" (UID: \"c9356417-fcb6-4b97-9f16-db63e667d6e8\") " Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.016547 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6be46ed7-a6b6-4b6e-9934-3540b1867032-dns-svc\") pod \"6be46ed7-a6b6-4b6e-9934-3540b1867032\" (UID: 
\"6be46ed7-a6b6-4b6e-9934-3540b1867032\") " Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.016606 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88e09674-3ed0-4d73-bf8a-18fb1990c892-logs\") pod \"88e09674-3ed0-4d73-bf8a-18fb1990c892\" (UID: \"88e09674-3ed0-4d73-bf8a-18fb1990c892\") " Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.016668 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9356417-fcb6-4b97-9f16-db63e667d6e8-scripts\") pod \"c9356417-fcb6-4b97-9f16-db63e667d6e8\" (UID: \"c9356417-fcb6-4b97-9f16-db63e667d6e8\") " Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.016736 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kfqx\" (UniqueName: \"kubernetes.io/projected/6be46ed7-a6b6-4b6e-9934-3540b1867032-kube-api-access-2kfqx\") pod \"6be46ed7-a6b6-4b6e-9934-3540b1867032\" (UID: \"6be46ed7-a6b6-4b6e-9934-3540b1867032\") " Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.016762 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"c9356417-fcb6-4b97-9f16-db63e667d6e8\" (UID: \"c9356417-fcb6-4b97-9f16-db63e667d6e8\") " Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.016789 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pq98p\" (UniqueName: \"kubernetes.io/projected/c9356417-fcb6-4b97-9f16-db63e667d6e8-kube-api-access-pq98p\") pod \"c9356417-fcb6-4b97-9f16-db63e667d6e8\" (UID: \"c9356417-fcb6-4b97-9f16-db63e667d6e8\") " Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.016899 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88e09674-3ed0-4d73-bf8a-18fb1990c892-httpd-run" 
(OuterVolumeSpecName: "httpd-run") pod "88e09674-3ed0-4d73-bf8a-18fb1990c892" (UID: "88e09674-3ed0-4d73-bf8a-18fb1990c892"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.017312 5184 reconciler_common.go:299] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/88e09674-3ed0-4d73-bf8a-18fb1990c892-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.017869 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88e09674-3ed0-4d73-bf8a-18fb1990c892-logs" (OuterVolumeSpecName: "logs") pod "88e09674-3ed0-4d73-bf8a-18fb1990c892" (UID: "88e09674-3ed0-4d73-bf8a-18fb1990c892"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.023370 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88e09674-3ed0-4d73-bf8a-18fb1990c892-kube-api-access-wpqmh" (OuterVolumeSpecName: "kube-api-access-wpqmh") pod "88e09674-3ed0-4d73-bf8a-18fb1990c892" (UID: "88e09674-3ed0-4d73-bf8a-18fb1990c892"). InnerVolumeSpecName "kube-api-access-wpqmh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.025987 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88e09674-3ed0-4d73-bf8a-18fb1990c892-scripts" (OuterVolumeSpecName: "scripts") pod "88e09674-3ed0-4d73-bf8a-18fb1990c892" (UID: "88e09674-3ed0-4d73-bf8a-18fb1990c892"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.026347 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9356417-fcb6-4b97-9f16-db63e667d6e8-logs" (OuterVolumeSpecName: "logs") pod "c9356417-fcb6-4b97-9f16-db63e667d6e8" (UID: "c9356417-fcb6-4b97-9f16-db63e667d6e8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.027406 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9356417-fcb6-4b97-9f16-db63e667d6e8-scripts" (OuterVolumeSpecName: "scripts") pod "c9356417-fcb6-4b97-9f16-db63e667d6e8" (UID: "c9356417-fcb6-4b97-9f16-db63e667d6e8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.032139 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6be46ed7-a6b6-4b6e-9934-3540b1867032-kube-api-access-2kfqx" (OuterVolumeSpecName: "kube-api-access-2kfqx") pod "6be46ed7-a6b6-4b6e-9934-3540b1867032" (UID: "6be46ed7-a6b6-4b6e-9934-3540b1867032"). InnerVolumeSpecName "kube-api-access-2kfqx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.035787 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "88e09674-3ed0-4d73-bf8a-18fb1990c892" (UID: "88e09674-3ed0-4d73-bf8a-18fb1990c892"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGIDValue "" Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.040795 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9356417-fcb6-4b97-9f16-db63e667d6e8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c9356417-fcb6-4b97-9f16-db63e667d6e8" (UID: "c9356417-fcb6-4b97-9f16-db63e667d6e8"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.040803 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "c9356417-fcb6-4b97-9f16-db63e667d6e8" (UID: "c9356417-fcb6-4b97-9f16-db63e667d6e8"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGIDValue "" Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.047908 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9356417-fcb6-4b97-9f16-db63e667d6e8-kube-api-access-pq98p" (OuterVolumeSpecName: "kube-api-access-pq98p") pod "c9356417-fcb6-4b97-9f16-db63e667d6e8" (UID: "c9356417-fcb6-4b97-9f16-db63e667d6e8"). InnerVolumeSpecName "kube-api-access-pq98p". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.075708 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6be46ed7-a6b6-4b6e-9934-3540b1867032-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6be46ed7-a6b6-4b6e-9934-3540b1867032" (UID: "6be46ed7-a6b6-4b6e-9934-3540b1867032"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.079600 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9356417-fcb6-4b97-9f16-db63e667d6e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9356417-fcb6-4b97-9f16-db63e667d6e8" (UID: "c9356417-fcb6-4b97-9f16-db63e667d6e8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.093297 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88e09674-3ed0-4d73-bf8a-18fb1990c892-config-data" (OuterVolumeSpecName: "config-data") pod "88e09674-3ed0-4d73-bf8a-18fb1990c892" (UID: "88e09674-3ed0-4d73-bf8a-18fb1990c892"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.093715 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88e09674-3ed0-4d73-bf8a-18fb1990c892-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "88e09674-3ed0-4d73-bf8a-18fb1990c892" (UID: "88e09674-3ed0-4d73-bf8a-18fb1990c892"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.096563 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6be46ed7-a6b6-4b6e-9934-3540b1867032-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6be46ed7-a6b6-4b6e-9934-3540b1867032" (UID: "6be46ed7-a6b6-4b6e-9934-3540b1867032"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.097130 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88e09674-3ed0-4d73-bf8a-18fb1990c892-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88e09674-3ed0-4d73-bf8a-18fb1990c892" (UID: "88e09674-3ed0-4d73-bf8a-18fb1990c892"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.099678 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6be46ed7-a6b6-4b6e-9934-3540b1867032-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6be46ed7-a6b6-4b6e-9934-3540b1867032" (UID: "6be46ed7-a6b6-4b6e-9934-3540b1867032"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.106401 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6be46ed7-a6b6-4b6e-9934-3540b1867032-config" (OuterVolumeSpecName: "config") pod "6be46ed7-a6b6-4b6e-9934-3540b1867032" (UID: "6be46ed7-a6b6-4b6e-9934-3540b1867032"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.106835 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9356417-fcb6-4b97-9f16-db63e667d6e8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c9356417-fcb6-4b97-9f16-db63e667d6e8" (UID: "c9356417-fcb6-4b97-9f16-db63e667d6e8"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.116791 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9356417-fcb6-4b97-9f16-db63e667d6e8-config-data" (OuterVolumeSpecName: "config-data") pod "c9356417-fcb6-4b97-9f16-db63e667d6e8" (UID: "c9356417-fcb6-4b97-9f16-db63e667d6e8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.119296 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2kfqx\" (UniqueName: \"kubernetes.io/projected/6be46ed7-a6b6-4b6e-9934-3540b1867032-kube-api-access-2kfqx\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.119347 5184 reconciler_common.go:292] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.119358 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pq98p\" (UniqueName: \"kubernetes.io/projected/c9356417-fcb6-4b97-9f16-db63e667d6e8-kube-api-access-pq98p\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.119368 5184 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6be46ed7-a6b6-4b6e-9934-3540b1867032-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.119391 5184 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6be46ed7-a6b6-4b6e-9934-3540b1867032-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.119399 5184 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c9356417-fcb6-4b97-9f16-db63e667d6e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.119407 5184 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9356417-fcb6-4b97-9f16-db63e667d6e8-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.119425 5184 reconciler_common.go:292] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.119434 5184 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88e09674-3ed0-4d73-bf8a-18fb1990c892-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.119468 5184 reconciler_common.go:299] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88e09674-3ed0-4d73-bf8a-18fb1990c892-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.119477 5184 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6be46ed7-a6b6-4b6e-9934-3540b1867032-config\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.119485 5184 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88e09674-3ed0-4d73-bf8a-18fb1990c892-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.119493 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wpqmh\" (UniqueName: \"kubernetes.io/projected/88e09674-3ed0-4d73-bf8a-18fb1990c892-kube-api-access-wpqmh\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 
17:08:44.119501 5184 reconciler_common.go:299] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c9356417-fcb6-4b97-9f16-db63e667d6e8-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.119510 5184 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88e09674-3ed0-4d73-bf8a-18fb1990c892-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.119517 5184 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9356417-fcb6-4b97-9f16-db63e667d6e8-logs\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.119524 5184 reconciler_common.go:299] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9356417-fcb6-4b97-9f16-db63e667d6e8-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.119533 5184 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6be46ed7-a6b6-4b6e-9934-3540b1867032-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.119540 5184 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88e09674-3ed0-4d73-bf8a-18fb1990c892-logs\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.119547 5184 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9356417-fcb6-4b97-9f16-db63e667d6e8-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.137751 5184 operation_generator.go:895] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Mar 12 17:08:44 
crc kubenswrapper[5184]: I0312 17:08:44.137829 5184 operation_generator.go:895] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.220896 5184 reconciler_common.go:299] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\""
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.220926 5184 reconciler_common.go:299] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\""
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.395081 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6565fc964f-vn8ss"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.395189 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6565fc964f-vn8ss" event={"ID":"6be46ed7-a6b6-4b6e-9934-3540b1867032","Type":"ContainerDied","Data":"08890ab80e9826f22128ea18c42d0bf35edc4e124dc38eb581b94dab4860777f"}
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.395293 5184 scope.go:117] "RemoveContainer" containerID="83b2b1df399f5f8cf5a808c8135835d11fe2ce766d9e130f13711dea7a917a36"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.397910 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"88e09674-3ed0-4d73-bf8a-18fb1990c892","Type":"ContainerDied","Data":"8955c3057fe003ce0bb49873dec406ab7ae450a620f92d72ce46efed7c2dcc7d"}
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.398140 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.405562 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.420322 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c9356417-fcb6-4b97-9f16-db63e667d6e8","Type":"ContainerDied","Data":"388bd2c7c9f9ca60bdec7bd29174a48d468c5b7ebc7672fbfab2464949c0c3ba"}
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.441772 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6565fc964f-vn8ss"]
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.454196 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6565fc964f-vn8ss"]
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.464606 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.473672 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.496400 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.509254 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.519964 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.521189 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c9356417-fcb6-4b97-9f16-db63e667d6e8" containerName="glance-log"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.521279 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9356417-fcb6-4b97-9f16-db63e667d6e8" containerName="glance-log"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.521348 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c9356417-fcb6-4b97-9f16-db63e667d6e8" containerName="glance-httpd"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.521432 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9356417-fcb6-4b97-9f16-db63e667d6e8" containerName="glance-httpd"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.521518 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88e09674-3ed0-4d73-bf8a-18fb1990c892" containerName="glance-httpd"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.521604 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="88e09674-3ed0-4d73-bf8a-18fb1990c892" containerName="glance-httpd"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.521718 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6be46ed7-a6b6-4b6e-9934-3540b1867032" containerName="dnsmasq-dns"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.521778 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="6be46ed7-a6b6-4b6e-9934-3540b1867032" containerName="dnsmasq-dns"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.521843 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88e09674-3ed0-4d73-bf8a-18fb1990c892" containerName="glance-log"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.521899 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="88e09674-3ed0-4d73-bf8a-18fb1990c892" containerName="glance-log"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.521957 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6be46ed7-a6b6-4b6e-9934-3540b1867032" containerName="init"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.522014 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="6be46ed7-a6b6-4b6e-9934-3540b1867032" containerName="init"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.522223 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="6be46ed7-a6b6-4b6e-9934-3540b1867032" containerName="dnsmasq-dns"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.522289 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="c9356417-fcb6-4b97-9f16-db63e667d6e8" containerName="glance-log"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.522352 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="88e09674-3ed0-4d73-bf8a-18fb1990c892" containerName="glance-httpd"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.522440 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="88e09674-3ed0-4d73-bf8a-18fb1990c892" containerName="glance-log"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.522503 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="c9356417-fcb6-4b97-9f16-db63e667d6e8" containerName="glance-httpd"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.578362 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.578527 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.578778 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.581697 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"glance-glance-dockercfg-kvq4j\""
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.581999 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"glance-default-external-config-data\""
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.582725 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-glance-default-public-svc\""
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.583733 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"glance-scripts\""
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.586914 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.587090 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.589063 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-glance-default-internal-svc\""
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.589215 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"glance-default-internal-config-data\""
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.620631 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-vtzf9"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.728256 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5qtj\" (UniqueName: \"kubernetes.io/projected/59bc23c2-fe37-43e7-a1a9-2830892902bf-kube-api-access-t5qtj\") pod \"59bc23c2-fe37-43e7-a1a9-2830892902bf\" (UID: \"59bc23c2-fe37-43e7-a1a9-2830892902bf\") "
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.728392 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59bc23c2-fe37-43e7-a1a9-2830892902bf-combined-ca-bundle\") pod \"59bc23c2-fe37-43e7-a1a9-2830892902bf\" (UID: \"59bc23c2-fe37-43e7-a1a9-2830892902bf\") "
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.728512 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/59bc23c2-fe37-43e7-a1a9-2830892902bf-config\") pod \"59bc23c2-fe37-43e7-a1a9-2830892902bf\" (UID: \"59bc23c2-fe37-43e7-a1a9-2830892902bf\") "
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.728766 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0\") " pod="openstack/glance-default-internal-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.728791 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2803560e-0c2e-4d2f-9e3a-76fe1cc629c0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0\") " pod="openstack/glance-default-internal-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.728807 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh2fs\" (UniqueName: \"kubernetes.io/projected/2803560e-0c2e-4d2f-9e3a-76fe1cc629c0-kube-api-access-nh2fs\") pod \"glance-default-internal-api-0\" (UID: \"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0\") " pod="openstack/glance-default-internal-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.728834 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e21b41b-8457-4bae-b2f8-fd29ea43334a-scripts\") pod \"glance-default-external-api-0\" (UID: \"4e21b41b-8457-4bae-b2f8-fd29ea43334a\") " pod="openstack/glance-default-external-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.728870 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e21b41b-8457-4bae-b2f8-fd29ea43334a-logs\") pod \"glance-default-external-api-0\" (UID: \"4e21b41b-8457-4bae-b2f8-fd29ea43334a\") " pod="openstack/glance-default-external-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.728898 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e21b41b-8457-4bae-b2f8-fd29ea43334a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4e21b41b-8457-4bae-b2f8-fd29ea43334a\") " pod="openstack/glance-default-external-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.728921 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2803560e-0c2e-4d2f-9e3a-76fe1cc629c0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0\") " pod="openstack/glance-default-internal-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.728944 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2803560e-0c2e-4d2f-9e3a-76fe1cc629c0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0\") " pod="openstack/glance-default-internal-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.728962 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2803560e-0c2e-4d2f-9e3a-76fe1cc629c0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0\") " pod="openstack/glance-default-internal-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.729022 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2803560e-0c2e-4d2f-9e3a-76fe1cc629c0-logs\") pod \"glance-default-internal-api-0\" (UID: \"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0\") " pod="openstack/glance-default-internal-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.729039 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e21b41b-8457-4bae-b2f8-fd29ea43334a-config-data\") pod \"glance-default-external-api-0\" (UID: \"4e21b41b-8457-4bae-b2f8-fd29ea43334a\") " pod="openstack/glance-default-external-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.729055 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2803560e-0c2e-4d2f-9e3a-76fe1cc629c0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0\") " pod="openstack/glance-default-internal-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.729083 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhb6t\" (UniqueName: \"kubernetes.io/projected/4e21b41b-8457-4bae-b2f8-fd29ea43334a-kube-api-access-jhb6t\") pod \"glance-default-external-api-0\" (UID: \"4e21b41b-8457-4bae-b2f8-fd29ea43334a\") " pod="openstack/glance-default-external-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.729113 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4e21b41b-8457-4bae-b2f8-fd29ea43334a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4e21b41b-8457-4bae-b2f8-fd29ea43334a\") " pod="openstack/glance-default-external-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.729137 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"4e21b41b-8457-4bae-b2f8-fd29ea43334a\") " pod="openstack/glance-default-external-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.729164 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e21b41b-8457-4bae-b2f8-fd29ea43334a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4e21b41b-8457-4bae-b2f8-fd29ea43334a\") " pod="openstack/glance-default-external-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.734394 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59bc23c2-fe37-43e7-a1a9-2830892902bf-kube-api-access-t5qtj" (OuterVolumeSpecName: "kube-api-access-t5qtj") pod "59bc23c2-fe37-43e7-a1a9-2830892902bf" (UID: "59bc23c2-fe37-43e7-a1a9-2830892902bf"). InnerVolumeSpecName "kube-api-access-t5qtj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.752927 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59bc23c2-fe37-43e7-a1a9-2830892902bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "59bc23c2-fe37-43e7-a1a9-2830892902bf" (UID: "59bc23c2-fe37-43e7-a1a9-2830892902bf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.766287 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59bc23c2-fe37-43e7-a1a9-2830892902bf-config" (OuterVolumeSpecName: "config") pod "59bc23c2-fe37-43e7-a1a9-2830892902bf" (UID: "59bc23c2-fe37-43e7-a1a9-2830892902bf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.835111 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0\") " pod="openstack/glance-default-internal-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.835162 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2803560e-0c2e-4d2f-9e3a-76fe1cc629c0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0\") " pod="openstack/glance-default-internal-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.835182 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nh2fs\" (UniqueName: \"kubernetes.io/projected/2803560e-0c2e-4d2f-9e3a-76fe1cc629c0-kube-api-access-nh2fs\") pod \"glance-default-internal-api-0\" (UID: \"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0\") " pod="openstack/glance-default-internal-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.835440 5184 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.835806 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e21b41b-8457-4bae-b2f8-fd29ea43334a-scripts\") pod \"glance-default-external-api-0\" (UID: \"4e21b41b-8457-4bae-b2f8-fd29ea43334a\") " pod="openstack/glance-default-external-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.836037 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e21b41b-8457-4bae-b2f8-fd29ea43334a-logs\") pod \"glance-default-external-api-0\" (UID: \"4e21b41b-8457-4bae-b2f8-fd29ea43334a\") " pod="openstack/glance-default-external-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.836147 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e21b41b-8457-4bae-b2f8-fd29ea43334a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4e21b41b-8457-4bae-b2f8-fd29ea43334a\") " pod="openstack/glance-default-external-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.836222 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2803560e-0c2e-4d2f-9e3a-76fe1cc629c0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0\") " pod="openstack/glance-default-internal-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.836306 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2803560e-0c2e-4d2f-9e3a-76fe1cc629c0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0\") " pod="openstack/glance-default-internal-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.836354 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2803560e-0c2e-4d2f-9e3a-76fe1cc629c0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0\") " pod="openstack/glance-default-internal-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.836599 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2803560e-0c2e-4d2f-9e3a-76fe1cc629c0-logs\") pod \"glance-default-internal-api-0\" (UID: \"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0\") " pod="openstack/glance-default-internal-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.836638 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e21b41b-8457-4bae-b2f8-fd29ea43334a-config-data\") pod \"glance-default-external-api-0\" (UID: \"4e21b41b-8457-4bae-b2f8-fd29ea43334a\") " pod="openstack/glance-default-external-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.836661 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2803560e-0c2e-4d2f-9e3a-76fe1cc629c0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0\") " pod="openstack/glance-default-internal-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.836743 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhb6t\" (UniqueName: \"kubernetes.io/projected/4e21b41b-8457-4bae-b2f8-fd29ea43334a-kube-api-access-jhb6t\") pod \"glance-default-external-api-0\" (UID: \"4e21b41b-8457-4bae-b2f8-fd29ea43334a\") " pod="openstack/glance-default-external-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.836768 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e21b41b-8457-4bae-b2f8-fd29ea43334a-logs\") pod \"glance-default-external-api-0\" (UID: \"4e21b41b-8457-4bae-b2f8-fd29ea43334a\") " pod="openstack/glance-default-external-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.837401 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4e21b41b-8457-4bae-b2f8-fd29ea43334a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4e21b41b-8457-4bae-b2f8-fd29ea43334a\") " pod="openstack/glance-default-external-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.837660 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2803560e-0c2e-4d2f-9e3a-76fe1cc629c0-logs\") pod \"glance-default-internal-api-0\" (UID: \"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0\") " pod="openstack/glance-default-internal-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.837759 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2803560e-0c2e-4d2f-9e3a-76fe1cc629c0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0\") " pod="openstack/glance-default-internal-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.837897 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"4e21b41b-8457-4bae-b2f8-fd29ea43334a\") " pod="openstack/glance-default-external-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.838096 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e21b41b-8457-4bae-b2f8-fd29ea43334a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4e21b41b-8457-4bae-b2f8-fd29ea43334a\") " pod="openstack/glance-default-external-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.838356 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t5qtj\" (UniqueName: \"kubernetes.io/projected/59bc23c2-fe37-43e7-a1a9-2830892902bf-kube-api-access-t5qtj\") on node \"crc\" DevicePath \"\""
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.838453 5184 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59bc23c2-fe37-43e7-a1a9-2830892902bf-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.838840 5184 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/59bc23c2-fe37-43e7-a1a9-2830892902bf-config\") on node \"crc\" DevicePath \"\""
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.838691 5184 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"4e21b41b-8457-4bae-b2f8-fd29ea43334a\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.839305 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4e21b41b-8457-4bae-b2f8-fd29ea43334a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4e21b41b-8457-4bae-b2f8-fd29ea43334a\") " pod="openstack/glance-default-external-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.846235 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e21b41b-8457-4bae-b2f8-fd29ea43334a-scripts\") pod \"glance-default-external-api-0\" (UID: \"4e21b41b-8457-4bae-b2f8-fd29ea43334a\") " pod="openstack/glance-default-external-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.846360 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e21b41b-8457-4bae-b2f8-fd29ea43334a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4e21b41b-8457-4bae-b2f8-fd29ea43334a\") " pod="openstack/glance-default-external-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.846703 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2803560e-0c2e-4d2f-9e3a-76fe1cc629c0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0\") " pod="openstack/glance-default-internal-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.847042 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2803560e-0c2e-4d2f-9e3a-76fe1cc629c0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0\") " pod="openstack/glance-default-internal-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.853445 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e21b41b-8457-4bae-b2f8-fd29ea43334a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4e21b41b-8457-4bae-b2f8-fd29ea43334a\") " pod="openstack/glance-default-external-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.853452 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2803560e-0c2e-4d2f-9e3a-76fe1cc629c0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0\") " pod="openstack/glance-default-internal-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.853858 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e21b41b-8457-4bae-b2f8-fd29ea43334a-config-data\") pod \"glance-default-external-api-0\" (UID: \"4e21b41b-8457-4bae-b2f8-fd29ea43334a\") " pod="openstack/glance-default-external-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.855097 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2803560e-0c2e-4d2f-9e3a-76fe1cc629c0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0\") " pod="openstack/glance-default-internal-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.859999 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhb6t\" (UniqueName: \"kubernetes.io/projected/4e21b41b-8457-4bae-b2f8-fd29ea43334a-kube-api-access-jhb6t\") pod \"glance-default-external-api-0\" (UID: \"4e21b41b-8457-4bae-b2f8-fd29ea43334a\") " pod="openstack/glance-default-external-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.860219 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh2fs\" (UniqueName: \"kubernetes.io/projected/2803560e-0c2e-4d2f-9e3a-76fe1cc629c0-kube-api-access-nh2fs\") pod \"glance-default-internal-api-0\" (UID: \"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0\") " pod="openstack/glance-default-internal-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.879767 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0\") " pod="openstack/glance-default-internal-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.885783 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"4e21b41b-8457-4bae-b2f8-fd29ea43334a\") " pod="openstack/glance-default-external-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.898323 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 12 17:08:44 crc kubenswrapper[5184]: I0312 17:08:44.936282 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 12 17:08:45 crc kubenswrapper[5184]: I0312 17:08:45.438819 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-vtzf9"
Mar 12 17:08:45 crc kubenswrapper[5184]: I0312 17:08:45.438948 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vtzf9" event={"ID":"59bc23c2-fe37-43e7-a1a9-2830892902bf","Type":"ContainerDied","Data":"fe6b77bb72787b75021ac12606ee3a1cb7be757bda61a018f84879b452739ff1"}
Mar 12 17:08:45 crc kubenswrapper[5184]: I0312 17:08:45.439255 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe6b77bb72787b75021ac12606ee3a1cb7be757bda61a018f84879b452739ff1"
Mar 12 17:08:45 crc kubenswrapper[5184]: I0312 17:08:45.940564 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57fc7b664c-5c5pt"]
Mar 12 17:08:45 crc kubenswrapper[5184]: I0312 17:08:45.944905 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59bc23c2-fe37-43e7-a1a9-2830892902bf" containerName="neutron-db-sync"
Mar 12 17:08:45 crc kubenswrapper[5184]: I0312 17:08:45.944940 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="59bc23c2-fe37-43e7-a1a9-2830892902bf" containerName="neutron-db-sync"
Mar 12 17:08:45 crc kubenswrapper[5184]: I0312 17:08:45.945147 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="59bc23c2-fe37-43e7-a1a9-2830892902bf" containerName="neutron-db-sync"
Mar 12 17:08:45 crc kubenswrapper[5184]: I0312 17:08:45.958524 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57fc7b664c-5c5pt"
Mar 12 17:08:45 crc kubenswrapper[5184]: I0312 17:08:45.969689 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57fc7b664c-5c5pt"]
Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.032240 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/neutron-869b7dc84d-67g2c"]
Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.041716 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-869b7dc84d-67g2c"
Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.043713 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-neutron-ovndbs\""
Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.044898 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"neutron-neutron-dockercfg-pd56m\""
Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.044972 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"neutron-config\""
Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.045190 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"neutron-httpd-config\""
Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.061051 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-869b7dc84d-67g2c"]
Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.062005 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/013bb197-eb0e-4632-90f8-547f4452101e-ovsdbserver-nb\") pod \"dnsmasq-dns-57fc7b664c-5c5pt\" (UID: \"013bb197-eb0e-4632-90f8-547f4452101e\") " pod="openstack/dnsmasq-dns-57fc7b664c-5c5pt"
Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.062064 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/013bb197-eb0e-4632-90f8-547f4452101e-dns-svc\") pod \"dnsmasq-dns-57fc7b664c-5c5pt\" (UID: \"013bb197-eb0e-4632-90f8-547f4452101e\") " pod="openstack/dnsmasq-dns-57fc7b664c-5c5pt"
Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.062131 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/013bb197-eb0e-4632-90f8-547f4452101e-config\") pod \"dnsmasq-dns-57fc7b664c-5c5pt\" (UID: \"013bb197-eb0e-4632-90f8-547f4452101e\") " pod="openstack/dnsmasq-dns-57fc7b664c-5c5pt"
Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.062238 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/013bb197-eb0e-4632-90f8-547f4452101e-dns-swift-storage-0\") pod \"dnsmasq-dns-57fc7b664c-5c5pt\" (UID: \"013bb197-eb0e-4632-90f8-547f4452101e\") " pod="openstack/dnsmasq-dns-57fc7b664c-5c5pt"
Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.062276 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/013bb197-eb0e-4632-90f8-547f4452101e-ovsdbserver-sb\") pod \"dnsmasq-dns-57fc7b664c-5c5pt\" (UID: \"013bb197-eb0e-4632-90f8-547f4452101e\") " pod="openstack/dnsmasq-dns-57fc7b664c-5c5pt"
Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.062338 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skt42\" (UniqueName: \"kubernetes.io/projected/013bb197-eb0e-4632-90f8-547f4452101e-kube-api-access-skt42\") pod \"dnsmasq-dns-57fc7b664c-5c5pt\" (UID: \"013bb197-eb0e-4632-90f8-547f4452101e\") " pod="openstack/dnsmasq-dns-57fc7b664c-5c5pt"
Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.164053 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/013bb197-eb0e-4632-90f8-547f4452101e-dns-swift-storage-0\") pod \"dnsmasq-dns-57fc7b664c-5c5pt\" (UID: \"013bb197-eb0e-4632-90f8-547f4452101e\") " pod="openstack/dnsmasq-dns-57fc7b664c-5c5pt"
Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.164109 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5c4af3c8-3189-41d2-9709-336561190b17-config\") pod \"neutron-869b7dc84d-67g2c\" (UID: \"5c4af3c8-3189-41d2-9709-336561190b17\") " pod="openstack/neutron-869b7dc84d-67g2c"
Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.164143 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/013bb197-eb0e-4632-90f8-547f4452101e-ovsdbserver-sb\") pod \"dnsmasq-dns-57fc7b664c-5c5pt\" (UID: \"013bb197-eb0e-4632-90f8-547f4452101e\") " pod="openstack/dnsmasq-dns-57fc7b664c-5c5pt"
Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.164197 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c4af3c8-3189-41d2-9709-336561190b17-ovndb-tls-certs\") pod \"neutron-869b7dc84d-67g2c\" (UID: \"5c4af3c8-3189-41d2-9709-336561190b17\") " pod="openstack/neutron-869b7dc84d-67g2c"
Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.164355 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-skt42\" (UniqueName: \"kubernetes.io/projected/013bb197-eb0e-4632-90f8-547f4452101e-kube-api-access-skt42\") pod \"dnsmasq-dns-57fc7b664c-5c5pt\" (UID: \"013bb197-eb0e-4632-90f8-547f4452101e\") " pod="openstack/dnsmasq-dns-57fc7b664c-5c5pt"
Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.164425 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/013bb197-eb0e-4632-90f8-547f4452101e-ovsdbserver-nb\") pod \"dnsmasq-dns-57fc7b664c-5c5pt\" (UID: \"013bb197-eb0e-4632-90f8-547f4452101e\") " pod="openstack/dnsmasq-dns-57fc7b664c-5c5pt"
Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.164457 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName:
\"kubernetes.io/configmap/013bb197-eb0e-4632-90f8-547f4452101e-dns-svc\") pod \"dnsmasq-dns-57fc7b664c-5c5pt\" (UID: \"013bb197-eb0e-4632-90f8-547f4452101e\") " pod="openstack/dnsmasq-dns-57fc7b664c-5c5pt" Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.164482 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5c4af3c8-3189-41d2-9709-336561190b17-httpd-config\") pod \"neutron-869b7dc84d-67g2c\" (UID: \"5c4af3c8-3189-41d2-9709-336561190b17\") " pod="openstack/neutron-869b7dc84d-67g2c" Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.164507 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c4af3c8-3189-41d2-9709-336561190b17-combined-ca-bundle\") pod \"neutron-869b7dc84d-67g2c\" (UID: \"5c4af3c8-3189-41d2-9709-336561190b17\") " pod="openstack/neutron-869b7dc84d-67g2c" Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.164550 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/013bb197-eb0e-4632-90f8-547f4452101e-config\") pod \"dnsmasq-dns-57fc7b664c-5c5pt\" (UID: \"013bb197-eb0e-4632-90f8-547f4452101e\") " pod="openstack/dnsmasq-dns-57fc7b664c-5c5pt" Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.164635 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzz4j\" (UniqueName: \"kubernetes.io/projected/5c4af3c8-3189-41d2-9709-336561190b17-kube-api-access-kzz4j\") pod \"neutron-869b7dc84d-67g2c\" (UID: \"5c4af3c8-3189-41d2-9709-336561190b17\") " pod="openstack/neutron-869b7dc84d-67g2c" Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.165240 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/013bb197-eb0e-4632-90f8-547f4452101e-ovsdbserver-sb\") pod \"dnsmasq-dns-57fc7b664c-5c5pt\" (UID: \"013bb197-eb0e-4632-90f8-547f4452101e\") " pod="openstack/dnsmasq-dns-57fc7b664c-5c5pt" Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.165923 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/013bb197-eb0e-4632-90f8-547f4452101e-ovsdbserver-nb\") pod \"dnsmasq-dns-57fc7b664c-5c5pt\" (UID: \"013bb197-eb0e-4632-90f8-547f4452101e\") " pod="openstack/dnsmasq-dns-57fc7b664c-5c5pt" Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.166150 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/013bb197-eb0e-4632-90f8-547f4452101e-dns-svc\") pod \"dnsmasq-dns-57fc7b664c-5c5pt\" (UID: \"013bb197-eb0e-4632-90f8-547f4452101e\") " pod="openstack/dnsmasq-dns-57fc7b664c-5c5pt" Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.166770 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/013bb197-eb0e-4632-90f8-547f4452101e-dns-swift-storage-0\") pod \"dnsmasq-dns-57fc7b664c-5c5pt\" (UID: \"013bb197-eb0e-4632-90f8-547f4452101e\") " pod="openstack/dnsmasq-dns-57fc7b664c-5c5pt" Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.168805 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/013bb197-eb0e-4632-90f8-547f4452101e-config\") pod \"dnsmasq-dns-57fc7b664c-5c5pt\" (UID: \"013bb197-eb0e-4632-90f8-547f4452101e\") " pod="openstack/dnsmasq-dns-57fc7b664c-5c5pt" Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.203266 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-skt42\" (UniqueName: \"kubernetes.io/projected/013bb197-eb0e-4632-90f8-547f4452101e-kube-api-access-skt42\") pod 
\"dnsmasq-dns-57fc7b664c-5c5pt\" (UID: \"013bb197-eb0e-4632-90f8-547f4452101e\") " pod="openstack/dnsmasq-dns-57fc7b664c-5c5pt" Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.266035 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c4af3c8-3189-41d2-9709-336561190b17-ovndb-tls-certs\") pod \"neutron-869b7dc84d-67g2c\" (UID: \"5c4af3c8-3189-41d2-9709-336561190b17\") " pod="openstack/neutron-869b7dc84d-67g2c" Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.266140 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5c4af3c8-3189-41d2-9709-336561190b17-httpd-config\") pod \"neutron-869b7dc84d-67g2c\" (UID: \"5c4af3c8-3189-41d2-9709-336561190b17\") " pod="openstack/neutron-869b7dc84d-67g2c" Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.266172 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c4af3c8-3189-41d2-9709-336561190b17-combined-ca-bundle\") pod \"neutron-869b7dc84d-67g2c\" (UID: \"5c4af3c8-3189-41d2-9709-336561190b17\") " pod="openstack/neutron-869b7dc84d-67g2c" Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.266292 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kzz4j\" (UniqueName: \"kubernetes.io/projected/5c4af3c8-3189-41d2-9709-336561190b17-kube-api-access-kzz4j\") pod \"neutron-869b7dc84d-67g2c\" (UID: \"5c4af3c8-3189-41d2-9709-336561190b17\") " pod="openstack/neutron-869b7dc84d-67g2c" Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.266333 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5c4af3c8-3189-41d2-9709-336561190b17-config\") pod \"neutron-869b7dc84d-67g2c\" (UID: \"5c4af3c8-3189-41d2-9709-336561190b17\") " 
pod="openstack/neutron-869b7dc84d-67g2c" Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.277941 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c4af3c8-3189-41d2-9709-336561190b17-ovndb-tls-certs\") pod \"neutron-869b7dc84d-67g2c\" (UID: \"5c4af3c8-3189-41d2-9709-336561190b17\") " pod="openstack/neutron-869b7dc84d-67g2c" Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.278908 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c4af3c8-3189-41d2-9709-336561190b17-combined-ca-bundle\") pod \"neutron-869b7dc84d-67g2c\" (UID: \"5c4af3c8-3189-41d2-9709-336561190b17\") " pod="openstack/neutron-869b7dc84d-67g2c" Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.278951 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5c4af3c8-3189-41d2-9709-336561190b17-httpd-config\") pod \"neutron-869b7dc84d-67g2c\" (UID: \"5c4af3c8-3189-41d2-9709-336561190b17\") " pod="openstack/neutron-869b7dc84d-67g2c" Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.282283 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5c4af3c8-3189-41d2-9709-336561190b17-config\") pod \"neutron-869b7dc84d-67g2c\" (UID: \"5c4af3c8-3189-41d2-9709-336561190b17\") " pod="openstack/neutron-869b7dc84d-67g2c" Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.289090 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57fc7b664c-5c5pt" Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.298682 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzz4j\" (UniqueName: \"kubernetes.io/projected/5c4af3c8-3189-41d2-9709-336561190b17-kube-api-access-kzz4j\") pod \"neutron-869b7dc84d-67g2c\" (UID: \"5c4af3c8-3189-41d2-9709-336561190b17\") " pod="openstack/neutron-869b7dc84d-67g2c" Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.329282 5184 scope.go:117] "RemoveContainer" containerID="b99149c5397d7c2860370daf5a6ba8792284074fb31c5d02e3ebf1998450131b" Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.364548 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-869b7dc84d-67g2c" Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.429298 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6be46ed7-a6b6-4b6e-9934-3540b1867032" path="/var/lib/kubelet/pods/6be46ed7-a6b6-4b6e-9934-3540b1867032/volumes" Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.430658 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88e09674-3ed0-4d73-bf8a-18fb1990c892" path="/var/lib/kubelet/pods/88e09674-3ed0-4d73-bf8a-18fb1990c892/volumes" Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.432098 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9356417-fcb6-4b97-9f16-db63e667d6e8" path="/var/lib/kubelet/pods/c9356417-fcb6-4b97-9f16-db63e667d6e8/volumes" Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.482642 5184 scope.go:117] "RemoveContainer" containerID="d6058a07576db84f11725468049f9596bd9d8583715c0ed9167d4d2834d6e46e" Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.655291 5184 scope.go:117] "RemoveContainer" containerID="402c80b7ce4447bba77c99bd9396e8d000b4925ab7347e51387d1b1717e3ede8" Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.761356 5184 
scope.go:117] "RemoveContainer" containerID="a949303e4554d034c5691598c489ab9d590075e25c95269468a8ef22ef860f60" Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.795697 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-859ddbd78-2m2xk"] Mar 12 17:08:46 crc kubenswrapper[5184]: W0312 17:08:46.832006 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podccf562d2_6ce1_4eb6_b27e_679493ce3870.slice/crio-6ae4351238b6fdee8112497a216a6ee7335c267ada156a2017b6669139243873 WatchSource:0}: Error finding container 6ae4351238b6fdee8112497a216a6ee7335c267ada156a2017b6669139243873: Status 404 returned error can't find the container with id 6ae4351238b6fdee8112497a216a6ee7335c267ada156a2017b6669139243873 Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.851422 5184 scope.go:117] "RemoveContainer" containerID="ffe630a7a8a312c739476d9d0a0f4304e1ccee911f46d1c7407ac8521a80f736" Mar 12 17:08:46 crc kubenswrapper[5184]: I0312 17:08:46.928309 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7cd5c99b94-hgvbf"] Mar 12 17:08:46 crc kubenswrapper[5184]: W0312 17:08:46.968733 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda97bd24c_a292_45a7_af77_526fb65b807d.slice/crio-5e0f6d8637f1822307636666b0442821da0ad3d1d1bfb81f4079abd42a6a3d76 WatchSource:0}: Error finding container 5e0f6d8637f1822307636666b0442821da0ad3d1d1bfb81f4079abd42a6a3d76: Status 404 returned error can't find the container with id 5e0f6d8637f1822307636666b0442821da0ad3d1d1bfb81f4079abd42a6a3d76 Mar 12 17:08:47 crc kubenswrapper[5184]: I0312 17:08:47.072262 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-9n4jh"] Mar 12 17:08:47 crc kubenswrapper[5184]: I0312 17:08:47.177468 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Mar 12 17:08:47 crc kubenswrapper[5184]: I0312 17:08:47.187679 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57fc7b664c-5c5pt"] Mar 12 17:08:47 crc kubenswrapper[5184]: I0312 17:08:47.269561 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 12 17:08:47 crc kubenswrapper[5184]: W0312 17:08:47.318844 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e21b41b_8457_4bae_b2f8_fd29ea43334a.slice/crio-02a52ccd6bdc6a772b9a1149b471146571553e9f7d66b1c58e6f9e7eb5a1d43c WatchSource:0}: Error finding container 02a52ccd6bdc6a772b9a1149b471146571553e9f7d66b1c58e6f9e7eb5a1d43c: Status 404 returned error can't find the container with id 02a52ccd6bdc6a772b9a1149b471146571553e9f7d66b1c58e6f9e7eb5a1d43c Mar 12 17:08:47 crc kubenswrapper[5184]: I0312 17:08:47.455471 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-869b7dc84d-67g2c"] Mar 12 17:08:47 crc kubenswrapper[5184]: I0312 17:08:47.583298 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85fc5cfbb9-trxq5" event={"ID":"646cdc1b-863a-4b58-8869-fcbc386a96e2","Type":"ContainerStarted","Data":"7d3843f64ae7e3280518f6603612e635187d09ba68bd5996d6c0058c860a6cd9"} Mar 12 17:08:47 crc kubenswrapper[5184]: I0312 17:08:47.672683 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-844b468785-jt9ph" event={"ID":"ad645834-0761-42a0-8bf0-dd763b829aac","Type":"ContainerStarted","Data":"047d4ada29c754723425e0d2407189ea2173d2e73428211034634873730ed537"} Mar 12 17:08:47 crc kubenswrapper[5184]: I0312 17:08:47.683044 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-jb66b" 
event={"ID":"d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c","Type":"ContainerStarted","Data":"a4f5d3d4c700f2d9a238c75ab41843bc64c0d8f971ab4ddd7ceced4d12313fac"} Mar 12 17:08:47 crc kubenswrapper[5184]: I0312 17:08:47.701402 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z62x6" event={"ID":"3521e399-e317-459a-badc-0b4695197ac0","Type":"ContainerStarted","Data":"af15e72be38b83f42e895efc6440a892f9566b54625ee6e6698da088011c7906"} Mar 12 17:08:47 crc kubenswrapper[5184]: I0312 17:08:47.716261 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0","Type":"ContainerStarted","Data":"43c61f4aefd4d16bc7578ab5dbebf27ad6a77294d523115e63431684d035e390"} Mar 12 17:08:47 crc kubenswrapper[5184]: I0312 17:08:47.739642 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-859ddbd78-2m2xk" event={"ID":"ccf562d2-6ce1-4eb6-b27e-679493ce3870","Type":"ContainerStarted","Data":"881e288299526eeda5e8bb60032448ab57c594785adad11983b46f43c6e06ae3"} Mar 12 17:08:47 crc kubenswrapper[5184]: I0312 17:08:47.739694 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-859ddbd78-2m2xk" event={"ID":"ccf562d2-6ce1-4eb6-b27e-679493ce3870","Type":"ContainerStarted","Data":"6ae4351238b6fdee8112497a216a6ee7335c267ada156a2017b6669139243873"} Mar 12 17:08:47 crc kubenswrapper[5184]: I0312 17:08:47.760536 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dplgw" event={"ID":"d3f7d154-f90e-4731-bc04-00b13b3fbfd8","Type":"ContainerStarted","Data":"6ca1f0347c3e06245aeafb34315f536429f7af0ae6532d71a76abc294f5798c0"} Mar 12 17:08:47 crc kubenswrapper[5184]: I0312 17:08:47.763635 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-z62x6" podStartSLOduration=30.08892573 podStartE2EDuration="31.763615107s" 
podCreationTimestamp="2026-03-12 17:08:16 +0000 UTC" firstStartedPulling="2026-03-12 17:08:19.034172179 +0000 UTC m=+1041.575483518" lastFinishedPulling="2026-03-12 17:08:20.708861556 +0000 UTC m=+1043.250172895" observedRunningTime="2026-03-12 17:08:47.739311504 +0000 UTC m=+1070.280622843" watchObservedRunningTime="2026-03-12 17:08:47.763615107 +0000 UTC m=+1070.304926446" Mar 12 17:08:47 crc kubenswrapper[5184]: I0312 17:08:47.766588 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-jb66b" podStartSLOduration=4.059606077 podStartE2EDuration="33.76657669s" podCreationTimestamp="2026-03-12 17:08:14 +0000 UTC" firstStartedPulling="2026-03-12 17:08:16.64824694 +0000 UTC m=+1039.189558279" lastFinishedPulling="2026-03-12 17:08:46.355217553 +0000 UTC m=+1068.896528892" observedRunningTime="2026-03-12 17:08:47.713768271 +0000 UTC m=+1070.255079610" watchObservedRunningTime="2026-03-12 17:08:47.76657669 +0000 UTC m=+1070.307888039" Mar 12 17:08:47 crc kubenswrapper[5184]: I0312 17:08:47.769033 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1","Type":"ContainerStarted","Data":"229c265b683e912c620e9e0f730acb9df2d8475e613170a912a122643740e27e"} Mar 12 17:08:47 crc kubenswrapper[5184]: I0312 17:08:47.778737 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57fc7b664c-5c5pt" event={"ID":"013bb197-eb0e-4632-90f8-547f4452101e","Type":"ContainerStarted","Data":"43ec1527ab1c19dd2cd035ec33fb4862fedc21da31b0a83585feada4917d9b7e"} Mar 12 17:08:47 crc kubenswrapper[5184]: I0312 17:08:47.787561 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dplgw" podStartSLOduration=32.275654566 podStartE2EDuration="33.787535339s" podCreationTimestamp="2026-03-12 17:08:14 +0000 UTC" firstStartedPulling="2026-03-12 17:08:19.048457968 +0000 UTC 
m=+1041.589769307" lastFinishedPulling="2026-03-12 17:08:20.560338751 +0000 UTC m=+1043.101650080" observedRunningTime="2026-03-12 17:08:47.780327422 +0000 UTC m=+1070.321638771" watchObservedRunningTime="2026-03-12 17:08:47.787535339 +0000 UTC m=+1070.328846688" Mar 12 17:08:47 crc kubenswrapper[5184]: I0312 17:08:47.806762 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4e21b41b-8457-4bae-b2f8-fd29ea43334a","Type":"ContainerStarted","Data":"02a52ccd6bdc6a772b9a1149b471146571553e9f7d66b1c58e6f9e7eb5a1d43c"} Mar 12 17:08:47 crc kubenswrapper[5184]: I0312 17:08:47.824150 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9n4jh" event={"ID":"c7d8d368-22c0-41a1-972c-d8f7c14db7b5","Type":"ContainerStarted","Data":"15710ab138fee64b5890ce8cb78ef2ea1e3312861df961e0914504080e5cf951"} Mar 12 17:08:47 crc kubenswrapper[5184]: I0312 17:08:47.842167 5184 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6565fc964f-vn8ss" podUID="6be46ed7-a6b6-4b6e-9934-3540b1867032" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: i/o timeout" Mar 12 17:08:47 crc kubenswrapper[5184]: I0312 17:08:47.864363 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66446fcd8f-lmflm" event={"ID":"af22a969-32c4-4628-8667-be2162f7d92d","Type":"ContainerStarted","Data":"12f64b283f8a5bf8be5f912ea3c3eeb46e8266cbb7d4905e54743a6fc4b54364"} Mar 12 17:08:47 crc kubenswrapper[5184]: I0312 17:08:47.866521 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cd5c99b94-hgvbf" event={"ID":"a97bd24c-a292-45a7-af77-526fb65b807d","Type":"ContainerStarted","Data":"5e0f6d8637f1822307636666b0442821da0ad3d1d1bfb81f4079abd42a6a3d76"} Mar 12 17:08:47 crc kubenswrapper[5184]: I0312 17:08:47.868803 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mvj44" 
event={"ID":"a9bd8488-49bb-48df-8f41-f415f71a2834","Type":"ContainerStarted","Data":"c54692a79c5db321094440b5acb8b5c54fd9f4751d54741eb605e33289007cca"} Mar 12 17:08:47 crc kubenswrapper[5184]: I0312 17:08:47.885619 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-mvj44" podStartSLOduration=6.05192159 podStartE2EDuration="33.885598678s" podCreationTimestamp="2026-03-12 17:08:14 +0000 UTC" firstStartedPulling="2026-03-12 17:08:16.669127578 +0000 UTC m=+1039.210438917" lastFinishedPulling="2026-03-12 17:08:44.502804666 +0000 UTC m=+1067.044116005" observedRunningTime="2026-03-12 17:08:47.88470087 +0000 UTC m=+1070.426012229" watchObservedRunningTime="2026-03-12 17:08:47.885598678 +0000 UTC m=+1070.426910017" Mar 12 17:08:48 crc kubenswrapper[5184]: I0312 17:08:48.392731 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/neutron-b69fb49f7-dmgsd"] Mar 12 17:08:48 crc kubenswrapper[5184]: I0312 17:08:48.440573 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b69fb49f7-dmgsd"] Mar 12 17:08:48 crc kubenswrapper[5184]: I0312 17:08:48.441825 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b69fb49f7-dmgsd" Mar 12 17:08:48 crc kubenswrapper[5184]: I0312 17:08:48.448835 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-neutron-internal-svc\"" Mar 12 17:08:48 crc kubenswrapper[5184]: I0312 17:08:48.448947 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-neutron-public-svc\"" Mar 12 17:08:48 crc kubenswrapper[5184]: I0312 17:08:48.554679 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r9wq\" (UniqueName: \"kubernetes.io/projected/0765e7b4-b879-4989-8a31-486408b9cdce-kube-api-access-7r9wq\") pod \"neutron-b69fb49f7-dmgsd\" (UID: \"0765e7b4-b879-4989-8a31-486408b9cdce\") " pod="openstack/neutron-b69fb49f7-dmgsd" Mar 12 17:08:48 crc kubenswrapper[5184]: I0312 17:08:48.555103 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0765e7b4-b879-4989-8a31-486408b9cdce-internal-tls-certs\") pod \"neutron-b69fb49f7-dmgsd\" (UID: \"0765e7b4-b879-4989-8a31-486408b9cdce\") " pod="openstack/neutron-b69fb49f7-dmgsd" Mar 12 17:08:48 crc kubenswrapper[5184]: I0312 17:08:48.555201 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0765e7b4-b879-4989-8a31-486408b9cdce-public-tls-certs\") pod \"neutron-b69fb49f7-dmgsd\" (UID: \"0765e7b4-b879-4989-8a31-486408b9cdce\") " pod="openstack/neutron-b69fb49f7-dmgsd" Mar 12 17:08:48 crc kubenswrapper[5184]: I0312 17:08:48.555286 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0765e7b4-b879-4989-8a31-486408b9cdce-combined-ca-bundle\") pod \"neutron-b69fb49f7-dmgsd\" (UID: 
\"0765e7b4-b879-4989-8a31-486408b9cdce\") " pod="openstack/neutron-b69fb49f7-dmgsd" Mar 12 17:08:48 crc kubenswrapper[5184]: I0312 17:08:48.555408 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0765e7b4-b879-4989-8a31-486408b9cdce-config\") pod \"neutron-b69fb49f7-dmgsd\" (UID: \"0765e7b4-b879-4989-8a31-486408b9cdce\") " pod="openstack/neutron-b69fb49f7-dmgsd" Mar 12 17:08:48 crc kubenswrapper[5184]: I0312 17:08:48.555507 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0765e7b4-b879-4989-8a31-486408b9cdce-ovndb-tls-certs\") pod \"neutron-b69fb49f7-dmgsd\" (UID: \"0765e7b4-b879-4989-8a31-486408b9cdce\") " pod="openstack/neutron-b69fb49f7-dmgsd" Mar 12 17:08:48 crc kubenswrapper[5184]: I0312 17:08:48.555615 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0765e7b4-b879-4989-8a31-486408b9cdce-httpd-config\") pod \"neutron-b69fb49f7-dmgsd\" (UID: \"0765e7b4-b879-4989-8a31-486408b9cdce\") " pod="openstack/neutron-b69fb49f7-dmgsd" Mar 12 17:08:48 crc kubenswrapper[5184]: I0312 17:08:48.658139 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7r9wq\" (UniqueName: \"kubernetes.io/projected/0765e7b4-b879-4989-8a31-486408b9cdce-kube-api-access-7r9wq\") pod \"neutron-b69fb49f7-dmgsd\" (UID: \"0765e7b4-b879-4989-8a31-486408b9cdce\") " pod="openstack/neutron-b69fb49f7-dmgsd" Mar 12 17:08:48 crc kubenswrapper[5184]: I0312 17:08:48.658203 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0765e7b4-b879-4989-8a31-486408b9cdce-internal-tls-certs\") pod \"neutron-b69fb49f7-dmgsd\" (UID: \"0765e7b4-b879-4989-8a31-486408b9cdce\") " 
pod="openstack/neutron-b69fb49f7-dmgsd"
Mar 12 17:08:48 crc kubenswrapper[5184]: I0312 17:08:48.658237 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0765e7b4-b879-4989-8a31-486408b9cdce-public-tls-certs\") pod \"neutron-b69fb49f7-dmgsd\" (UID: \"0765e7b4-b879-4989-8a31-486408b9cdce\") " pod="openstack/neutron-b69fb49f7-dmgsd"
Mar 12 17:08:48 crc kubenswrapper[5184]: I0312 17:08:48.658281 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0765e7b4-b879-4989-8a31-486408b9cdce-combined-ca-bundle\") pod \"neutron-b69fb49f7-dmgsd\" (UID: \"0765e7b4-b879-4989-8a31-486408b9cdce\") " pod="openstack/neutron-b69fb49f7-dmgsd"
Mar 12 17:08:48 crc kubenswrapper[5184]: I0312 17:08:48.658331 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0765e7b4-b879-4989-8a31-486408b9cdce-config\") pod \"neutron-b69fb49f7-dmgsd\" (UID: \"0765e7b4-b879-4989-8a31-486408b9cdce\") " pod="openstack/neutron-b69fb49f7-dmgsd"
Mar 12 17:08:48 crc kubenswrapper[5184]: I0312 17:08:48.658356 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0765e7b4-b879-4989-8a31-486408b9cdce-ovndb-tls-certs\") pod \"neutron-b69fb49f7-dmgsd\" (UID: \"0765e7b4-b879-4989-8a31-486408b9cdce\") " pod="openstack/neutron-b69fb49f7-dmgsd"
Mar 12 17:08:48 crc kubenswrapper[5184]: I0312 17:08:48.658411 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0765e7b4-b879-4989-8a31-486408b9cdce-httpd-config\") pod \"neutron-b69fb49f7-dmgsd\" (UID: \"0765e7b4-b879-4989-8a31-486408b9cdce\") " pod="openstack/neutron-b69fb49f7-dmgsd"
Mar 12 17:08:48 crc kubenswrapper[5184]: I0312 17:08:48.674599 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0765e7b4-b879-4989-8a31-486408b9cdce-httpd-config\") pod \"neutron-b69fb49f7-dmgsd\" (UID: \"0765e7b4-b879-4989-8a31-486408b9cdce\") " pod="openstack/neutron-b69fb49f7-dmgsd"
Mar 12 17:08:48 crc kubenswrapper[5184]: I0312 17:08:48.684997 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r9wq\" (UniqueName: \"kubernetes.io/projected/0765e7b4-b879-4989-8a31-486408b9cdce-kube-api-access-7r9wq\") pod \"neutron-b69fb49f7-dmgsd\" (UID: \"0765e7b4-b879-4989-8a31-486408b9cdce\") " pod="openstack/neutron-b69fb49f7-dmgsd"
Mar 12 17:08:48 crc kubenswrapper[5184]: I0312 17:08:48.696102 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0765e7b4-b879-4989-8a31-486408b9cdce-public-tls-certs\") pod \"neutron-b69fb49f7-dmgsd\" (UID: \"0765e7b4-b879-4989-8a31-486408b9cdce\") " pod="openstack/neutron-b69fb49f7-dmgsd"
Mar 12 17:08:48 crc kubenswrapper[5184]: I0312 17:08:48.698289 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0765e7b4-b879-4989-8a31-486408b9cdce-internal-tls-certs\") pod \"neutron-b69fb49f7-dmgsd\" (UID: \"0765e7b4-b879-4989-8a31-486408b9cdce\") " pod="openstack/neutron-b69fb49f7-dmgsd"
Mar 12 17:08:48 crc kubenswrapper[5184]: I0312 17:08:48.698613 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0765e7b4-b879-4989-8a31-486408b9cdce-config\") pod \"neutron-b69fb49f7-dmgsd\" (UID: \"0765e7b4-b879-4989-8a31-486408b9cdce\") " pod="openstack/neutron-b69fb49f7-dmgsd"
Mar 12 17:08:48 crc kubenswrapper[5184]: I0312 17:08:48.698455 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0765e7b4-b879-4989-8a31-486408b9cdce-combined-ca-bundle\") pod \"neutron-b69fb49f7-dmgsd\" (UID: \"0765e7b4-b879-4989-8a31-486408b9cdce\") " pod="openstack/neutron-b69fb49f7-dmgsd"
Mar 12 17:08:48 crc kubenswrapper[5184]: I0312 17:08:48.728968 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0765e7b4-b879-4989-8a31-486408b9cdce-ovndb-tls-certs\") pod \"neutron-b69fb49f7-dmgsd\" (UID: \"0765e7b4-b879-4989-8a31-486408b9cdce\") " pod="openstack/neutron-b69fb49f7-dmgsd"
Mar 12 17:08:48 crc kubenswrapper[5184]: I0312 17:08:48.858694 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b69fb49f7-dmgsd"
Mar 12 17:08:48 crc kubenswrapper[5184]: I0312 17:08:48.939757 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-859ddbd78-2m2xk" event={"ID":"ccf562d2-6ce1-4eb6-b27e-679493ce3870","Type":"ContainerStarted","Data":"1b15bf9717411285614dfab9d07e6784fa5f11a28ed47e3b1c3e31c203d181c9"}
Mar 12 17:08:48 crc kubenswrapper[5184]: I0312 17:08:48.953426 5184 generic.go:358] "Generic (PLEG): container finished" podID="013bb197-eb0e-4632-90f8-547f4452101e" containerID="fe1ac0d55ed0615152e6b57fc5ecdb053d80f29d233ac256a283434a78ee3f31" exitCode=0
Mar 12 17:08:48 crc kubenswrapper[5184]: I0312 17:08:48.953537 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57fc7b664c-5c5pt" event={"ID":"013bb197-eb0e-4632-90f8-547f4452101e","Type":"ContainerDied","Data":"fe1ac0d55ed0615152e6b57fc5ecdb053d80f29d233ac256a283434a78ee3f31"}
Mar 12 17:08:48 crc kubenswrapper[5184]: I0312 17:08:48.968683 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-859ddbd78-2m2xk" podStartSLOduration=24.968662614 podStartE2EDuration="24.968662614s" podCreationTimestamp="2026-03-12 17:08:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:08:48.960065413 +0000 UTC m=+1071.501376762" watchObservedRunningTime="2026-03-12 17:08:48.968662614 +0000 UTC m=+1071.509973963"
Mar 12 17:08:48 crc kubenswrapper[5184]: I0312 17:08:48.969761 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9n4jh" event={"ID":"c7d8d368-22c0-41a1-972c-d8f7c14db7b5","Type":"ContainerStarted","Data":"3c8cc2178ecb3ec4d8690b45d2bc35f21e9f8b1578a6c5175cf7f61befac352a"}
Mar 12 17:08:48 crc kubenswrapper[5184]: I0312 17:08:48.988279 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66446fcd8f-lmflm" event={"ID":"af22a969-32c4-4628-8667-be2162f7d92d","Type":"ContainerStarted","Data":"e97f3475f0fb80f59b27d2fd2949008caef58ad2a5e4fd52a06908adf8ffe29f"}
Mar 12 17:08:48 crc kubenswrapper[5184]: I0312 17:08:48.988504 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/horizon-66446fcd8f-lmflm" podUID="af22a969-32c4-4628-8667-be2162f7d92d" containerName="horizon-log" containerID="cri-o://12f64b283f8a5bf8be5f912ea3c3eeb46e8266cbb7d4905e54743a6fc4b54364" gracePeriod=30
Mar 12 17:08:48 crc kubenswrapper[5184]: I0312 17:08:48.988854 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/horizon-66446fcd8f-lmflm" podUID="af22a969-32c4-4628-8667-be2162f7d92d" containerName="horizon" containerID="cri-o://e97f3475f0fb80f59b27d2fd2949008caef58ad2a5e4fd52a06908adf8ffe29f" gracePeriod=30
Mar 12 17:08:49 crc kubenswrapper[5184]: I0312 17:08:49.008531 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-9n4jh" podStartSLOduration=17.008508844 podStartE2EDuration="17.008508844s" podCreationTimestamp="2026-03-12 17:08:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:08:48.999816812 +0000 UTC m=+1071.541128151" watchObservedRunningTime="2026-03-12 17:08:49.008508844 +0000 UTC m=+1071.549820183"
Mar 12 17:08:49 crc kubenswrapper[5184]: I0312 17:08:49.028575 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-66446fcd8f-lmflm" podStartSLOduration=4.224151989 podStartE2EDuration="32.028557474s" podCreationTimestamp="2026-03-12 17:08:17 +0000 UTC" firstStartedPulling="2026-03-12 17:08:18.394147468 +0000 UTC m=+1040.935458807" lastFinishedPulling="2026-03-12 17:08:46.198552943 +0000 UTC m=+1068.739864292" observedRunningTime="2026-03-12 17:08:49.028177592 +0000 UTC m=+1071.569488931" watchObservedRunningTime="2026-03-12 17:08:49.028557474 +0000 UTC m=+1071.569868813"
Mar 12 17:08:49 crc kubenswrapper[5184]: I0312 17:08:49.035680 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-869b7dc84d-67g2c" event={"ID":"5c4af3c8-3189-41d2-9709-336561190b17","Type":"ContainerStarted","Data":"98cb3d5f89afecd999fa979da011c3f50f1c506fdb104ae79338eee3eb7a8bbb"}
Mar 12 17:08:49 crc kubenswrapper[5184]: I0312 17:08:49.035725 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-869b7dc84d-67g2c" event={"ID":"5c4af3c8-3189-41d2-9709-336561190b17","Type":"ContainerStarted","Data":"69fa3abf61438d4466c6d5dc0bf7516773400dafab1b9851080e68dedc08a11e"}
Mar 12 17:08:49 crc kubenswrapper[5184]: I0312 17:08:49.045837 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cd5c99b94-hgvbf" event={"ID":"a97bd24c-a292-45a7-af77-526fb65b807d","Type":"ContainerStarted","Data":"ce396d6beda1425dd00c72e6a04c9fcc6785648bedeeaf752d0bd17c96946bc6"}
Mar 12 17:08:49 crc kubenswrapper[5184]: I0312 17:08:49.045956 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cd5c99b94-hgvbf" event={"ID":"a97bd24c-a292-45a7-af77-526fb65b807d","Type":"ContainerStarted","Data":"0a09c1b88427c42f06ac5a716325b6722c0e9aae580cb929d7583bfa1e307f23"}
Mar 12 17:08:49 crc kubenswrapper[5184]: I0312 17:08:49.073811 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7cd5c99b94-hgvbf" podStartSLOduration=25.073794395 podStartE2EDuration="25.073794395s" podCreationTimestamp="2026-03-12 17:08:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:08:49.063800681 +0000 UTC m=+1071.605112020" watchObservedRunningTime="2026-03-12 17:08:49.073794395 +0000 UTC m=+1071.615105734"
Mar 12 17:08:49 crc kubenswrapper[5184]: I0312 17:08:49.078966 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85fc5cfbb9-trxq5" event={"ID":"646cdc1b-863a-4b58-8869-fcbc386a96e2","Type":"ContainerStarted","Data":"030149c8b5637105bbee193dd218339dff5fa4b7dcdea135f43ed6f21bb3cb72"}
Mar 12 17:08:49 crc kubenswrapper[5184]: I0312 17:08:49.079044 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/horizon-85fc5cfbb9-trxq5" podUID="646cdc1b-863a-4b58-8869-fcbc386a96e2" containerName="horizon-log" containerID="cri-o://7d3843f64ae7e3280518f6603612e635187d09ba68bd5996d6c0058c860a6cd9" gracePeriod=30
Mar 12 17:08:49 crc kubenswrapper[5184]: I0312 17:08:49.079166 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/horizon-85fc5cfbb9-trxq5" podUID="646cdc1b-863a-4b58-8869-fcbc386a96e2" containerName="horizon" containerID="cri-o://030149c8b5637105bbee193dd218339dff5fa4b7dcdea135f43ed6f21bb3cb72" gracePeriod=30
Mar 12 17:08:49 crc kubenswrapper[5184]: I0312 17:08:49.114613 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/horizon-844b468785-jt9ph" podUID="ad645834-0761-42a0-8bf0-dd763b829aac" containerName="horizon-log" containerID="cri-o://047d4ada29c754723425e0d2407189ea2173d2e73428211034634873730ed537" gracePeriod=30
Mar 12 17:08:49 crc kubenswrapper[5184]: I0312 17:08:49.114760 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/horizon-844b468785-jt9ph" podUID="ad645834-0761-42a0-8bf0-dd763b829aac" containerName="horizon" containerID="cri-o://322636176df9af1ac8acdec415b4ffd8f4652698259988fc8b6152a07ec3e055" gracePeriod=30
Mar 12 17:08:49 crc kubenswrapper[5184]: I0312 17:08:49.115065 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-844b468785-jt9ph" event={"ID":"ad645834-0761-42a0-8bf0-dd763b829aac","Type":"ContainerStarted","Data":"322636176df9af1ac8acdec415b4ffd8f4652698259988fc8b6152a07ec3e055"}
Mar 12 17:08:49 crc kubenswrapper[5184]: I0312 17:08:49.179201 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-844b468785-jt9ph" podStartSLOduration=5.49662171 podStartE2EDuration="35.179178514s" podCreationTimestamp="2026-03-12 17:08:14 +0000 UTC" firstStartedPulling="2026-03-12 17:08:16.672635549 +0000 UTC m=+1039.213946888" lastFinishedPulling="2026-03-12 17:08:46.355192353 +0000 UTC m=+1068.896503692" observedRunningTime="2026-03-12 17:08:49.17393768 +0000 UTC m=+1071.715249019" watchObservedRunningTime="2026-03-12 17:08:49.179178514 +0000 UTC m=+1071.720489853"
Mar 12 17:08:49 crc kubenswrapper[5184]: I0312 17:08:49.192872 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-85fc5cfbb9-trxq5" podStartSLOduration=4.397198241 podStartE2EDuration="35.192815954s" podCreationTimestamp="2026-03-12 17:08:14 +0000 UTC" firstStartedPulling="2026-03-12 17:08:15.583896614 +0000 UTC m=+1038.125207953" lastFinishedPulling="2026-03-12 17:08:46.379514327 +0000 UTC m=+1068.920825666" observedRunningTime="2026-03-12 17:08:49.123409343 +0000 UTC m=+1071.664720682" watchObservedRunningTime="2026-03-12 17:08:49.192815954 +0000 UTC m=+1071.734127313"
Mar 12 17:08:49 crc kubenswrapper[5184]: I0312 17:08:49.694893 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b69fb49f7-dmgsd"]
Mar 12 17:08:50 crc kubenswrapper[5184]: I0312 17:08:50.183170 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57fc7b664c-5c5pt" event={"ID":"013bb197-eb0e-4632-90f8-547f4452101e","Type":"ContainerStarted","Data":"73563c03de213797146af3ecfeee7554e26f5bda4f7b0baa55a1074fb56d8797"}
Mar 12 17:08:50 crc kubenswrapper[5184]: I0312 17:08:50.183610 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/dnsmasq-dns-57fc7b664c-5c5pt"
Mar 12 17:08:50 crc kubenswrapper[5184]: I0312 17:08:50.234630 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4e21b41b-8457-4bae-b2f8-fd29ea43334a","Type":"ContainerStarted","Data":"e5d8d20981f57a472f11e6505aa691fbe2ff3ee1edb7636736bd11c49d37d8b1"}
Mar 12 17:08:50 crc kubenswrapper[5184]: I0312 17:08:50.270658 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-869b7dc84d-67g2c" event={"ID":"5c4af3c8-3189-41d2-9709-336561190b17","Type":"ContainerStarted","Data":"2f85ea6d5b15c9b9f11ee07607c123482b5a31b6e87f37b2cf8ccf74c187539d"}
Mar 12 17:08:50 crc kubenswrapper[5184]: I0312 17:08:50.271721 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/neutron-869b7dc84d-67g2c"
Mar 12 17:08:50 crc kubenswrapper[5184]: I0312 17:08:50.292176 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57fc7b664c-5c5pt" podStartSLOduration=5.29215577 podStartE2EDuration="5.29215577s" podCreationTimestamp="2026-03-12 17:08:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:08:50.21891749 +0000 UTC m=+1072.760228829" watchObservedRunningTime="2026-03-12 17:08:50.29215577 +0000 UTC m=+1072.833467109"
Mar 12 17:08:50 crc kubenswrapper[5184]: I0312 17:08:50.303550 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-869b7dc84d-67g2c" podStartSLOduration=5.303529177 podStartE2EDuration="5.303529177s" podCreationTimestamp="2026-03-12 17:08:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:08:50.294553005 +0000 UTC m=+1072.835864334" watchObservedRunningTime="2026-03-12 17:08:50.303529177 +0000 UTC m=+1072.844840516"
Mar 12 17:08:50 crc kubenswrapper[5184]: I0312 17:08:50.303781 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b69fb49f7-dmgsd" event={"ID":"0765e7b4-b879-4989-8a31-486408b9cdce","Type":"ContainerStarted","Data":"02651e41906adbf9399ea4315baee55d5d8dc38b6405087e216eadb03fdfa330"}
Mar 12 17:08:50 crc kubenswrapper[5184]: I0312 17:08:50.325551 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0","Type":"ContainerStarted","Data":"18266ffb33d0bae79da2c0ef8d203cdbb98ba1e2876ad73a5200b2db877237a4"}
Mar 12 17:08:50 crc kubenswrapper[5184]: I0312 17:08:50.744262 5184 patch_prober.go:28] interesting pod/machine-config-daemon-cp7pt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 12 17:08:50 crc kubenswrapper[5184]: I0312 17:08:50.744584 5184 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 12 17:08:51 crc kubenswrapper[5184]: I0312 17:08:51.343352 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b69fb49f7-dmgsd" event={"ID":"0765e7b4-b879-4989-8a31-486408b9cdce","Type":"ContainerStarted","Data":"3b99a04cd7775019fa369700f675593f34d3fef966fdf724f5a08caef1917514"}
Mar 12 17:08:51 crc kubenswrapper[5184]: I0312 17:08:51.346448 5184 generic.go:358] "Generic (PLEG): container finished" podID="a9bd8488-49bb-48df-8f41-f415f71a2834" containerID="c54692a79c5db321094440b5acb8b5c54fd9f4751d54741eb605e33289007cca" exitCode=0
Mar 12 17:08:51 crc kubenswrapper[5184]: I0312 17:08:51.346540 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mvj44" event={"ID":"a9bd8488-49bb-48df-8f41-f415f71a2834","Type":"ContainerDied","Data":"c54692a79c5db321094440b5acb8b5c54fd9f4751d54741eb605e33289007cca"}
Mar 12 17:08:51 crc kubenswrapper[5184]: E0312 17:08:51.356772 5184 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1753616 actualBytes=10240
Mar 12 17:08:52 crc kubenswrapper[5184]: I0312 17:08:52.810204 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-mvj44"
Mar 12 17:08:52 crc kubenswrapper[5184]: I0312 17:08:52.888396 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9bd8488-49bb-48df-8f41-f415f71a2834-config-data\") pod \"a9bd8488-49bb-48df-8f41-f415f71a2834\" (UID: \"a9bd8488-49bb-48df-8f41-f415f71a2834\") "
Mar 12 17:08:52 crc kubenswrapper[5184]: I0312 17:08:52.888776 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dgzq\" (UniqueName: \"kubernetes.io/projected/a9bd8488-49bb-48df-8f41-f415f71a2834-kube-api-access-4dgzq\") pod \"a9bd8488-49bb-48df-8f41-f415f71a2834\" (UID: \"a9bd8488-49bb-48df-8f41-f415f71a2834\") "
Mar 12 17:08:52 crc kubenswrapper[5184]: I0312 17:08:52.888805 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9bd8488-49bb-48df-8f41-f415f71a2834-logs\") pod \"a9bd8488-49bb-48df-8f41-f415f71a2834\" (UID: \"a9bd8488-49bb-48df-8f41-f415f71a2834\") "
Mar 12 17:08:52 crc kubenswrapper[5184]: I0312 17:08:52.888930 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9bd8488-49bb-48df-8f41-f415f71a2834-scripts\") pod \"a9bd8488-49bb-48df-8f41-f415f71a2834\" (UID: \"a9bd8488-49bb-48df-8f41-f415f71a2834\") "
Mar 12 17:08:52 crc kubenswrapper[5184]: I0312 17:08:52.889123 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9bd8488-49bb-48df-8f41-f415f71a2834-combined-ca-bundle\") pod \"a9bd8488-49bb-48df-8f41-f415f71a2834\" (UID: \"a9bd8488-49bb-48df-8f41-f415f71a2834\") "
Mar 12 17:08:52 crc kubenswrapper[5184]: I0312 17:08:52.889511 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9bd8488-49bb-48df-8f41-f415f71a2834-logs" (OuterVolumeSpecName: "logs") pod "a9bd8488-49bb-48df-8f41-f415f71a2834" (UID: "a9bd8488-49bb-48df-8f41-f415f71a2834"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 12 17:08:52 crc kubenswrapper[5184]: I0312 17:08:52.889881 5184 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9bd8488-49bb-48df-8f41-f415f71a2834-logs\") on node \"crc\" DevicePath \"\""
Mar 12 17:08:52 crc kubenswrapper[5184]: I0312 17:08:52.898746 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9bd8488-49bb-48df-8f41-f415f71a2834-scripts" (OuterVolumeSpecName: "scripts") pod "a9bd8488-49bb-48df-8f41-f415f71a2834" (UID: "a9bd8488-49bb-48df-8f41-f415f71a2834"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 17:08:52 crc kubenswrapper[5184]: I0312 17:08:52.899603 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9bd8488-49bb-48df-8f41-f415f71a2834-kube-api-access-4dgzq" (OuterVolumeSpecName: "kube-api-access-4dgzq") pod "a9bd8488-49bb-48df-8f41-f415f71a2834" (UID: "a9bd8488-49bb-48df-8f41-f415f71a2834"). InnerVolumeSpecName "kube-api-access-4dgzq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:08:52 crc kubenswrapper[5184]: I0312 17:08:52.935637 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9bd8488-49bb-48df-8f41-f415f71a2834-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9bd8488-49bb-48df-8f41-f415f71a2834" (UID: "a9bd8488-49bb-48df-8f41-f415f71a2834"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 17:08:52 crc kubenswrapper[5184]: I0312 17:08:52.946915 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9bd8488-49bb-48df-8f41-f415f71a2834-config-data" (OuterVolumeSpecName: "config-data") pod "a9bd8488-49bb-48df-8f41-f415f71a2834" (UID: "a9bd8488-49bb-48df-8f41-f415f71a2834"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 17:08:52 crc kubenswrapper[5184]: I0312 17:08:52.991706 5184 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9bd8488-49bb-48df-8f41-f415f71a2834-config-data\") on node \"crc\" DevicePath \"\""
Mar 12 17:08:52 crc kubenswrapper[5184]: I0312 17:08:52.991743 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4dgzq\" (UniqueName: \"kubernetes.io/projected/a9bd8488-49bb-48df-8f41-f415f71a2834-kube-api-access-4dgzq\") on node \"crc\" DevicePath \"\""
Mar 12 17:08:52 crc kubenswrapper[5184]: I0312 17:08:52.991753 5184 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9bd8488-49bb-48df-8f41-f415f71a2834-scripts\") on node \"crc\" DevicePath \"\""
Mar 12 17:08:52 crc kubenswrapper[5184]: I0312 17:08:52.991761 5184 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9bd8488-49bb-48df-8f41-f415f71a2834-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 17:08:53 crc kubenswrapper[5184]: I0312 17:08:53.368592 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-gzjgn" event={"ID":"7ed3fdf4-b869-4ef8-b746-afe1e52fe286","Type":"ContainerStarted","Data":"45049197c84547cb9b08115ad2142cab8b478837d30511a82dba357efd5b8ef0"}
Mar 12 17:08:53 crc kubenswrapper[5184]: I0312 17:08:53.370482 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-mvj44"
Mar 12 17:08:53 crc kubenswrapper[5184]: I0312 17:08:53.370502 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mvj44" event={"ID":"a9bd8488-49bb-48df-8f41-f415f71a2834","Type":"ContainerDied","Data":"a9a8b329023b33530391e4abfb62ee455c571e9095db2e15d8ba23f3dcd5ebca"}
Mar 12 17:08:53 crc kubenswrapper[5184]: I0312 17:08:53.370705 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9a8b329023b33530391e4abfb62ee455c571e9095db2e15d8ba23f3dcd5ebca"
Mar 12 17:08:53 crc kubenswrapper[5184]: I0312 17:08:53.474649 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/placement-f484d5cc6-qld48"]
Mar 12 17:08:53 crc kubenswrapper[5184]: I0312 17:08:53.475637 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a9bd8488-49bb-48df-8f41-f415f71a2834" containerName="placement-db-sync"
Mar 12 17:08:53 crc kubenswrapper[5184]: I0312 17:08:53.475655 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9bd8488-49bb-48df-8f41-f415f71a2834" containerName="placement-db-sync"
Mar 12 17:08:53 crc kubenswrapper[5184]: I0312 17:08:53.475878 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="a9bd8488-49bb-48df-8f41-f415f71a2834" containerName="placement-db-sync"
Mar 12 17:08:56 crc kubenswrapper[5184]: I0312 17:08:56.426661 5184 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-dplgw" podUID="d3f7d154-f90e-4731-bc04-00b13b3fbfd8" containerName="registry-server" probeResult="failure" output=<
Mar 12 17:08:56 crc kubenswrapper[5184]: timeout: failed to connect service ":50051" within 1s
Mar 12 17:08:56 crc kubenswrapper[5184]: >
Mar 12 17:08:56 crc kubenswrapper[5184]: I0312 17:08:56.613399 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f484d5cc6-qld48"
Mar 12 17:08:56 crc kubenswrapper[5184]: I0312 17:08:56.621114 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-placement-public-svc\""
Mar 12 17:08:56 crc kubenswrapper[5184]: I0312 17:08:56.621289 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"placement-placement-dockercfg-rp5f5\""
Mar 12 17:08:56 crc kubenswrapper[5184]: I0312 17:08:56.621420 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-placement-internal-svc\""
Mar 12 17:08:56 crc kubenswrapper[5184]: I0312 17:08:56.621707 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"placement-config-data\""
Mar 12 17:08:56 crc kubenswrapper[5184]: I0312 17:08:56.628315 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/placement-f484d5cc6-qld48"]
Mar 12 17:08:56 crc kubenswrapper[5184]: I0312 17:08:56.628359 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/horizon-85fc5cfbb9-trxq5"
Mar 12 17:08:56 crc kubenswrapper[5184]: I0312 17:08:56.628391 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dplgw"
Mar 12 17:08:56 crc kubenswrapper[5184]: I0312 17:08:56.628404 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/horizon-844b468785-jt9ph"
Mar 12 17:08:56 crc kubenswrapper[5184]: I0312 17:08:56.628458 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/community-operators-dplgw"
Mar 12 17:08:56 crc kubenswrapper[5184]: I0312 17:08:56.628470 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-859ddbd78-2m2xk"
Mar 12 17:08:56 crc kubenswrapper[5184]: I0312 17:08:56.628480 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-z62x6"
Mar 12 17:08:56 crc kubenswrapper[5184]: I0312 17:08:56.628491 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/horizon-859ddbd78-2m2xk"
Mar 12 17:08:56 crc kubenswrapper[5184]: I0312 17:08:56.628528 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57fc7b664c-5c5pt"
Mar 12 17:08:56 crc kubenswrapper[5184]: I0312 17:08:56.628541 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/certified-operators-z62x6"
Mar 12 17:08:56 crc kubenswrapper[5184]: I0312 17:08:56.632730 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"placement-scripts\""
Mar 12 17:08:56 crc kubenswrapper[5184]: I0312 17:08:56.760197 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69f4d98b5f-9pzj8"]
Mar 12 17:08:56 crc kubenswrapper[5184]: I0312 17:08:56.760463 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/dnsmasq-dns-69f4d98b5f-9pzj8" podUID="2729f29f-9520-421a-bb45-917ca4cef6fc" containerName="dnsmasq-dns" containerID="cri-o://ebc1e03ddf55b0d2a0149b1d2ba38453c0e971640228be24b3909b7513f29acb" gracePeriod=10
Mar 12 17:08:56 crc kubenswrapper[5184]: I0312 17:08:56.767161 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9118cf9a-dccb-4e2b-8438-de0d717382a1-internal-tls-certs\") pod \"placement-f484d5cc6-qld48\" (UID: \"9118cf9a-dccb-4e2b-8438-de0d717382a1\") " pod="openstack/placement-f484d5cc6-qld48"
Mar 12 17:08:56 crc kubenswrapper[5184]: I0312 17:08:56.767250 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9118cf9a-dccb-4e2b-8438-de0d717382a1-logs\") pod \"placement-f484d5cc6-qld48\" (UID: \"9118cf9a-dccb-4e2b-8438-de0d717382a1\") " pod="openstack/placement-f484d5cc6-qld48"
Mar 12 17:08:56 crc kubenswrapper[5184]: I0312 17:08:56.768870 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9118cf9a-dccb-4e2b-8438-de0d717382a1-scripts\") pod \"placement-f484d5cc6-qld48\" (UID: \"9118cf9a-dccb-4e2b-8438-de0d717382a1\") " pod="openstack/placement-f484d5cc6-qld48"
Mar 12 17:08:56 crc kubenswrapper[5184]: I0312 17:08:56.768979 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmtxn\" (UniqueName: \"kubernetes.io/projected/9118cf9a-dccb-4e2b-8438-de0d717382a1-kube-api-access-kmtxn\") pod \"placement-f484d5cc6-qld48\" (UID: \"9118cf9a-dccb-4e2b-8438-de0d717382a1\") " pod="openstack/placement-f484d5cc6-qld48"
Mar 12 17:08:56 crc kubenswrapper[5184]: I0312 17:08:56.769089 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9118cf9a-dccb-4e2b-8438-de0d717382a1-combined-ca-bundle\") pod \"placement-f484d5cc6-qld48\" (UID: \"9118cf9a-dccb-4e2b-8438-de0d717382a1\") " pod="openstack/placement-f484d5cc6-qld48"
Mar 12 17:08:56 crc kubenswrapper[5184]: I0312 17:08:56.769625 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9118cf9a-dccb-4e2b-8438-de0d717382a1-public-tls-certs\") pod \"placement-f484d5cc6-qld48\" (UID: \"9118cf9a-dccb-4e2b-8438-de0d717382a1\") " pod="openstack/placement-f484d5cc6-qld48"
Mar 12 17:08:56 crc kubenswrapper[5184]: I0312 17:08:56.769835 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9118cf9a-dccb-4e2b-8438-de0d717382a1-config-data\") pod \"placement-f484d5cc6-qld48\" (UID: \"9118cf9a-dccb-4e2b-8438-de0d717382a1\") " pod="openstack/placement-f484d5cc6-qld48"
Mar 12 17:08:56 crc kubenswrapper[5184]: I0312 17:08:56.873328 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9118cf9a-dccb-4e2b-8438-de0d717382a1-scripts\") pod \"placement-f484d5cc6-qld48\" (UID: \"9118cf9a-dccb-4e2b-8438-de0d717382a1\") " pod="openstack/placement-f484d5cc6-qld48"
Mar 12 17:08:56 crc kubenswrapper[5184]: I0312 17:08:56.873401 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kmtxn\" (UniqueName: \"kubernetes.io/projected/9118cf9a-dccb-4e2b-8438-de0d717382a1-kube-api-access-kmtxn\") pod \"placement-f484d5cc6-qld48\" (UID: \"9118cf9a-dccb-4e2b-8438-de0d717382a1\") " pod="openstack/placement-f484d5cc6-qld48"
Mar 12 17:08:56 crc kubenswrapper[5184]: I0312 17:08:56.873429 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9118cf9a-dccb-4e2b-8438-de0d717382a1-combined-ca-bundle\") pod \"placement-f484d5cc6-qld48\" (UID: \"9118cf9a-dccb-4e2b-8438-de0d717382a1\") " pod="openstack/placement-f484d5cc6-qld48"
Mar 12 17:08:56 crc kubenswrapper[5184]: I0312 17:08:56.873454 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9118cf9a-dccb-4e2b-8438-de0d717382a1-public-tls-certs\") pod \"placement-f484d5cc6-qld48\" (UID: \"9118cf9a-dccb-4e2b-8438-de0d717382a1\") " pod="openstack/placement-f484d5cc6-qld48"
Mar 12 17:08:56 crc kubenswrapper[5184]: I0312 17:08:56.873493 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9118cf9a-dccb-4e2b-8438-de0d717382a1-config-data\") pod \"placement-f484d5cc6-qld48\" (UID: \"9118cf9a-dccb-4e2b-8438-de0d717382a1\") " pod="openstack/placement-f484d5cc6-qld48"
Mar 12 17:08:56 crc kubenswrapper[5184]: I0312 17:08:56.873531 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9118cf9a-dccb-4e2b-8438-de0d717382a1-internal-tls-certs\") pod \"placement-f484d5cc6-qld48\" (UID: \"9118cf9a-dccb-4e2b-8438-de0d717382a1\") " pod="openstack/placement-f484d5cc6-qld48"
Mar 12 17:08:56 crc kubenswrapper[5184]: I0312 17:08:56.873564 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9118cf9a-dccb-4e2b-8438-de0d717382a1-logs\") pod \"placement-f484d5cc6-qld48\" (UID: \"9118cf9a-dccb-4e2b-8438-de0d717382a1\") " pod="openstack/placement-f484d5cc6-qld48"
Mar 12 17:08:56 crc kubenswrapper[5184]: I0312 17:08:56.873955 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9118cf9a-dccb-4e2b-8438-de0d717382a1-logs\") pod \"placement-f484d5cc6-qld48\" (UID: \"9118cf9a-dccb-4e2b-8438-de0d717382a1\") " pod="openstack/placement-f484d5cc6-qld48"
Mar 12 17:08:56 crc kubenswrapper[5184]: I0312 17:08:56.893988 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9118cf9a-dccb-4e2b-8438-de0d717382a1-scripts\") pod \"placement-f484d5cc6-qld48\" (UID: \"9118cf9a-dccb-4e2b-8438-de0d717382a1\") " pod="openstack/placement-f484d5cc6-qld48"
Mar 12 17:08:56 crc kubenswrapper[5184]: I0312 17:08:56.894112 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9118cf9a-dccb-4e2b-8438-de0d717382a1-public-tls-certs\") pod \"placement-f484d5cc6-qld48\" (UID: \"9118cf9a-dccb-4e2b-8438-de0d717382a1\") " pod="openstack/placement-f484d5cc6-qld48"
Mar 12 17:08:56 crc kubenswrapper[5184]: I0312 17:08:56.894549 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9118cf9a-dccb-4e2b-8438-de0d717382a1-combined-ca-bundle\") pod \"placement-f484d5cc6-qld48\" (UID: \"9118cf9a-dccb-4e2b-8438-de0d717382a1\") " pod="openstack/placement-f484d5cc6-qld48"
Mar 12 17:08:56 crc kubenswrapper[5184]: I0312 17:08:56.894660 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9118cf9a-dccb-4e2b-8438-de0d717382a1-internal-tls-certs\") pod \"placement-f484d5cc6-qld48\" (UID: \"9118cf9a-dccb-4e2b-8438-de0d717382a1\") " pod="openstack/placement-f484d5cc6-qld48"
Mar 12 17:08:56 crc kubenswrapper[5184]: I0312 17:08:56.900125 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9118cf9a-dccb-4e2b-8438-de0d717382a1-config-data\") pod \"placement-f484d5cc6-qld48\" (UID: \"9118cf9a-dccb-4e2b-8438-de0d717382a1\") " pod="openstack/placement-f484d5cc6-qld48"
Mar 12 17:08:56 crc kubenswrapper[5184]: I0312 17:08:56.901972 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmtxn\" (UniqueName: \"kubernetes.io/projected/9118cf9a-dccb-4e2b-8438-de0d717382a1-kube-api-access-kmtxn\") pod \"placement-f484d5cc6-qld48\" (UID: \"9118cf9a-dccb-4e2b-8438-de0d717382a1\") " pod="openstack/placement-f484d5cc6-qld48"
Mar 12 17:08:56 crc kubenswrapper[5184]: I0312 17:08:56.938699 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f484d5cc6-qld48"
Mar 12 17:08:57 crc kubenswrapper[5184]: I0312 17:08:57.369490 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69f4d98b5f-9pzj8"
Mar 12 17:08:57 crc kubenswrapper[5184]: I0312 17:08:57.386707 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2729f29f-9520-421a-bb45-917ca4cef6fc-config\") pod \"2729f29f-9520-421a-bb45-917ca4cef6fc\" (UID: \"2729f29f-9520-421a-bb45-917ca4cef6fc\") "
Mar 12 17:08:57 crc kubenswrapper[5184]: I0312 17:08:57.386810 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4vlh\" (UniqueName: \"kubernetes.io/projected/2729f29f-9520-421a-bb45-917ca4cef6fc-kube-api-access-g4vlh\") pod \"2729f29f-9520-421a-bb45-917ca4cef6fc\" (UID: \"2729f29f-9520-421a-bb45-917ca4cef6fc\") "
Mar 12 17:08:57 crc kubenswrapper[5184]: I0312 17:08:57.386857 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2729f29f-9520-421a-bb45-917ca4cef6fc-dns-svc\") pod \"2729f29f-9520-421a-bb45-917ca4cef6fc\" (UID: \"2729f29f-9520-421a-bb45-917ca4cef6fc\") "
Mar 12 17:08:57 crc kubenswrapper[5184]: I0312 17:08:57.386893 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2729f29f-9520-421a-bb45-917ca4cef6fc-dns-swift-storage-0\") pod \"2729f29f-9520-421a-bb45-917ca4cef6fc\" (UID: \"2729f29f-9520-421a-bb45-917ca4cef6fc\") "
Mar 12 17:08:57 crc kubenswrapper[5184]: I0312 17:08:57.386942 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2729f29f-9520-421a-bb45-917ca4cef6fc-ovsdbserver-nb\") pod \"2729f29f-9520-421a-bb45-917ca4cef6fc\" (UID: \"2729f29f-9520-421a-bb45-917ca4cef6fc\") "
Mar 12 17:08:57 crc kubenswrapper[5184]: I0312 17:08:57.387004 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\"
(UniqueName: \"kubernetes.io/configmap/2729f29f-9520-421a-bb45-917ca4cef6fc-ovsdbserver-sb\") pod \"2729f29f-9520-421a-bb45-917ca4cef6fc\" (UID: \"2729f29f-9520-421a-bb45-917ca4cef6fc\") " Mar 12 17:08:57 crc kubenswrapper[5184]: I0312 17:08:57.421223 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2729f29f-9520-421a-bb45-917ca4cef6fc-kube-api-access-g4vlh" (OuterVolumeSpecName: "kube-api-access-g4vlh") pod "2729f29f-9520-421a-bb45-917ca4cef6fc" (UID: "2729f29f-9520-421a-bb45-917ca4cef6fc"). InnerVolumeSpecName "kube-api-access-g4vlh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:08:57 crc kubenswrapper[5184]: I0312 17:08:57.467744 5184 generic.go:358] "Generic (PLEG): container finished" podID="2729f29f-9520-421a-bb45-917ca4cef6fc" containerID="ebc1e03ddf55b0d2a0149b1d2ba38453c0e971640228be24b3909b7513f29acb" exitCode=0 Mar 12 17:08:57 crc kubenswrapper[5184]: I0312 17:08:57.478245 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2729f29f-9520-421a-bb45-917ca4cef6fc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2729f29f-9520-421a-bb45-917ca4cef6fc" (UID: "2729f29f-9520-421a-bb45-917ca4cef6fc"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:08:57 crc kubenswrapper[5184]: I0312 17:08:57.468754 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69f4d98b5f-9pzj8" event={"ID":"2729f29f-9520-421a-bb45-917ca4cef6fc","Type":"ContainerDied","Data":"ebc1e03ddf55b0d2a0149b1d2ba38453c0e971640228be24b3909b7513f29acb"} Mar 12 17:08:57 crc kubenswrapper[5184]: I0312 17:08:57.480153 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69f4d98b5f-9pzj8" event={"ID":"2729f29f-9520-421a-bb45-917ca4cef6fc","Type":"ContainerDied","Data":"df0de2ae8b3120425039869498bfc49ae780249e053bedffacd1f85bbe62c106"} Mar 12 17:08:57 crc kubenswrapper[5184]: I0312 17:08:57.480186 5184 scope.go:117] "RemoveContainer" containerID="ebc1e03ddf55b0d2a0149b1d2ba38453c0e971640228be24b3909b7513f29acb" Mar 12 17:08:57 crc kubenswrapper[5184]: I0312 17:08:57.468721 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69f4d98b5f-9pzj8" Mar 12 17:08:57 crc kubenswrapper[5184]: I0312 17:08:57.497094 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g4vlh\" (UniqueName: \"kubernetes.io/projected/2729f29f-9520-421a-bb45-917ca4cef6fc-kube-api-access-g4vlh\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:57 crc kubenswrapper[5184]: I0312 17:08:57.497134 5184 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2729f29f-9520-421a-bb45-917ca4cef6fc-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:57 crc kubenswrapper[5184]: I0312 17:08:57.551569 5184 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-z62x6" podUID="3521e399-e317-459a-badc-0b4695197ac0" containerName="registry-server" probeResult="failure" output=< Mar 12 17:08:57 crc kubenswrapper[5184]: timeout: failed to connect service ":50051" within 1s Mar 12 17:08:57 crc 
kubenswrapper[5184]: > Mar 12 17:08:57 crc kubenswrapper[5184]: I0312 17:08:57.552021 5184 scope.go:117] "RemoveContainer" containerID="97ae6c7c22c9a5a5c095a703d03f1be1a16a02d9a3e6765624efce7a8a873494" Mar 12 17:08:57 crc kubenswrapper[5184]: I0312 17:08:57.553005 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-gzjgn" podStartSLOduration=12.936039381 podStartE2EDuration="43.552995759s" podCreationTimestamp="2026-03-12 17:08:14 +0000 UTC" firstStartedPulling="2026-03-12 17:08:15.903634599 +0000 UTC m=+1038.444945938" lastFinishedPulling="2026-03-12 17:08:46.520590987 +0000 UTC m=+1069.061902316" observedRunningTime="2026-03-12 17:08:57.508536393 +0000 UTC m=+1080.049847742" watchObservedRunningTime="2026-03-12 17:08:57.552995759 +0000 UTC m=+1080.094307098" Mar 12 17:08:57 crc kubenswrapper[5184]: I0312 17:08:57.556231 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2729f29f-9520-421a-bb45-917ca4cef6fc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2729f29f-9520-421a-bb45-917ca4cef6fc" (UID: "2729f29f-9520-421a-bb45-917ca4cef6fc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:08:57 crc kubenswrapper[5184]: I0312 17:08:57.574477 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2729f29f-9520-421a-bb45-917ca4cef6fc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2729f29f-9520-421a-bb45-917ca4cef6fc" (UID: "2729f29f-9520-421a-bb45-917ca4cef6fc"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:08:57 crc kubenswrapper[5184]: I0312 17:08:57.577681 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2729f29f-9520-421a-bb45-917ca4cef6fc-config" (OuterVolumeSpecName: "config") pod "2729f29f-9520-421a-bb45-917ca4cef6fc" (UID: "2729f29f-9520-421a-bb45-917ca4cef6fc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:08:57 crc kubenswrapper[5184]: I0312 17:08:57.582725 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2729f29f-9520-421a-bb45-917ca4cef6fc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2729f29f-9520-421a-bb45-917ca4cef6fc" (UID: "2729f29f-9520-421a-bb45-917ca4cef6fc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:08:57 crc kubenswrapper[5184]: I0312 17:08:57.595260 5184 scope.go:117] "RemoveContainer" containerID="ebc1e03ddf55b0d2a0149b1d2ba38453c0e971640228be24b3909b7513f29acb" Mar 12 17:08:57 crc kubenswrapper[5184]: E0312 17:08:57.596937 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebc1e03ddf55b0d2a0149b1d2ba38453c0e971640228be24b3909b7513f29acb\": container with ID starting with ebc1e03ddf55b0d2a0149b1d2ba38453c0e971640228be24b3909b7513f29acb not found: ID does not exist" containerID="ebc1e03ddf55b0d2a0149b1d2ba38453c0e971640228be24b3909b7513f29acb" Mar 12 17:08:57 crc kubenswrapper[5184]: I0312 17:08:57.596963 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebc1e03ddf55b0d2a0149b1d2ba38453c0e971640228be24b3909b7513f29acb"} err="failed to get container status \"ebc1e03ddf55b0d2a0149b1d2ba38453c0e971640228be24b3909b7513f29acb\": rpc error: code = NotFound desc = could not find container 
\"ebc1e03ddf55b0d2a0149b1d2ba38453c0e971640228be24b3909b7513f29acb\": container with ID starting with ebc1e03ddf55b0d2a0149b1d2ba38453c0e971640228be24b3909b7513f29acb not found: ID does not exist" Mar 12 17:08:57 crc kubenswrapper[5184]: I0312 17:08:57.596992 5184 scope.go:117] "RemoveContainer" containerID="97ae6c7c22c9a5a5c095a703d03f1be1a16a02d9a3e6765624efce7a8a873494" Mar 12 17:08:57 crc kubenswrapper[5184]: E0312 17:08:57.599164 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97ae6c7c22c9a5a5c095a703d03f1be1a16a02d9a3e6765624efce7a8a873494\": container with ID starting with 97ae6c7c22c9a5a5c095a703d03f1be1a16a02d9a3e6765624efce7a8a873494 not found: ID does not exist" containerID="97ae6c7c22c9a5a5c095a703d03f1be1a16a02d9a3e6765624efce7a8a873494" Mar 12 17:08:57 crc kubenswrapper[5184]: I0312 17:08:57.599187 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97ae6c7c22c9a5a5c095a703d03f1be1a16a02d9a3e6765624efce7a8a873494"} err="failed to get container status \"97ae6c7c22c9a5a5c095a703d03f1be1a16a02d9a3e6765624efce7a8a873494\": rpc error: code = NotFound desc = could not find container \"97ae6c7c22c9a5a5c095a703d03f1be1a16a02d9a3e6765624efce7a8a873494\": container with ID starting with 97ae6c7c22c9a5a5c095a703d03f1be1a16a02d9a3e6765624efce7a8a873494 not found: ID does not exist" Mar 12 17:08:57 crc kubenswrapper[5184]: I0312 17:08:57.599369 5184 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2729f29f-9520-421a-bb45-917ca4cef6fc-config\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:57 crc kubenswrapper[5184]: I0312 17:08:57.599448 5184 reconciler_common.go:299] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2729f29f-9520-421a-bb45-917ca4cef6fc-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:57 crc 
kubenswrapper[5184]: I0312 17:08:57.599459 5184 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2729f29f-9520-421a-bb45-917ca4cef6fc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:57 crc kubenswrapper[5184]: I0312 17:08:57.599469 5184 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2729f29f-9520-421a-bb45-917ca4cef6fc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 12 17:08:57 crc kubenswrapper[5184]: I0312 17:08:57.637282 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/placement-f484d5cc6-qld48"] Mar 12 17:08:57 crc kubenswrapper[5184]: I0312 17:08:57.655562 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/horizon-66446fcd8f-lmflm" Mar 12 17:08:57 crc kubenswrapper[5184]: I0312 17:08:57.810597 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69f4d98b5f-9pzj8"] Mar 12 17:08:57 crc kubenswrapper[5184]: I0312 17:08:57.817798 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-69f4d98b5f-9pzj8"] Mar 12 17:08:57 crc kubenswrapper[5184]: I0312 17:08:57.923159 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7cd5c99b94-hgvbf" Mar 12 17:08:57 crc kubenswrapper[5184]: I0312 17:08:57.923402 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/horizon-7cd5c99b94-hgvbf" Mar 12 17:08:58 crc kubenswrapper[5184]: I0312 17:08:58.431135 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2729f29f-9520-421a-bb45-917ca4cef6fc" path="/var/lib/kubelet/pods/2729f29f-9520-421a-bb45-917ca4cef6fc/volumes" Mar 12 17:08:58 crc kubenswrapper[5184]: I0312 17:08:58.489057 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0","Type":"ContainerStarted","Data":"45116000968cef2b42520a056c24889472f5edb9976e94851ac0e2090c7465e3"} Mar 12 17:08:58 crc kubenswrapper[5184]: I0312 17:08:58.500595 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f484d5cc6-qld48" event={"ID":"9118cf9a-dccb-4e2b-8438-de0d717382a1","Type":"ContainerStarted","Data":"8f196bf1e6f8f9e6a0cecfff73c46a82c3e10879c280a61ac0709a7b2882a57c"} Mar 12 17:08:58 crc kubenswrapper[5184]: I0312 17:08:58.522976 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4e21b41b-8457-4bae-b2f8-fd29ea43334a","Type":"ContainerStarted","Data":"dbb55fb8a743e8dff6d965deea47c49db6ab448f11a8c0b6d42c46024a99beaa"} Mar 12 17:08:58 crc kubenswrapper[5184]: I0312 17:08:58.527336 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=14.527316149 podStartE2EDuration="14.527316149s" podCreationTimestamp="2026-03-12 17:08:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:08:58.514082174 +0000 UTC m=+1081.055393513" watchObservedRunningTime="2026-03-12 17:08:58.527316149 +0000 UTC m=+1081.068627488" Mar 12 17:08:58 crc kubenswrapper[5184]: I0312 17:08:58.531952 5184 generic.go:358] "Generic (PLEG): container finished" podID="c7d8d368-22c0-41a1-972c-d8f7c14db7b5" containerID="3c8cc2178ecb3ec4d8690b45d2bc35f21e9f8b1578a6c5175cf7f61befac352a" exitCode=0 Mar 12 17:08:58 crc kubenswrapper[5184]: I0312 17:08:58.532115 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9n4jh" event={"ID":"c7d8d368-22c0-41a1-972c-d8f7c14db7b5","Type":"ContainerDied","Data":"3c8cc2178ecb3ec4d8690b45d2bc35f21e9f8b1578a6c5175cf7f61befac352a"} Mar 12 17:08:58 crc kubenswrapper[5184]: I0312 17:08:58.534695 5184 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b69fb49f7-dmgsd" event={"ID":"0765e7b4-b879-4989-8a31-486408b9cdce","Type":"ContainerStarted","Data":"57ce9289747a13102e2b8b4676fe53e2205ceb948f0161b7a051a28d12aeb302"} Mar 12 17:08:58 crc kubenswrapper[5184]: I0312 17:08:58.535656 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/neutron-b69fb49f7-dmgsd" Mar 12 17:08:58 crc kubenswrapper[5184]: I0312 17:08:58.547067 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=14.547046189 podStartE2EDuration="14.547046189s" podCreationTimestamp="2026-03-12 17:08:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:08:58.540763382 +0000 UTC m=+1081.082074721" watchObservedRunningTime="2026-03-12 17:08:58.547046189 +0000 UTC m=+1081.088357528" Mar 12 17:08:58 crc kubenswrapper[5184]: I0312 17:08:58.592807 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-b69fb49f7-dmgsd" podStartSLOduration=10.592783836 podStartE2EDuration="10.592783836s" podCreationTimestamp="2026-03-12 17:08:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:08:58.573248732 +0000 UTC m=+1081.114560081" watchObservedRunningTime="2026-03-12 17:08:58.592783836 +0000 UTC m=+1081.134095175" Mar 12 17:08:59 crc kubenswrapper[5184]: I0312 17:08:59.547871 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f484d5cc6-qld48" event={"ID":"9118cf9a-dccb-4e2b-8438-de0d717382a1","Type":"ContainerStarted","Data":"f6620ab7ebe09dc3a4be223e1ac6ce9064ba8097dc3bd65a22531a714f802fab"} Mar 12 17:08:59 crc kubenswrapper[5184]: I0312 17:08:59.548340 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-f484d5cc6-qld48" event={"ID":"9118cf9a-dccb-4e2b-8438-de0d717382a1","Type":"ContainerStarted","Data":"9eb64916596a65399b125bdddf16d978075ffdcc78ce9370906ae117e986ae01"} Mar 12 17:08:59 crc kubenswrapper[5184]: I0312 17:08:59.548358 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/placement-f484d5cc6-qld48" Mar 12 17:08:59 crc kubenswrapper[5184]: I0312 17:08:59.548369 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/placement-f484d5cc6-qld48" Mar 12 17:08:59 crc kubenswrapper[5184]: I0312 17:08:59.550975 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1","Type":"ContainerStarted","Data":"943e06caa8e5d76068f507ba5cf5ebb9180ad92638b32bfce25c70ab2ff245f1"} Mar 12 17:08:59 crc kubenswrapper[5184]: I0312 17:08:59.579942 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-f484d5cc6-qld48" podStartSLOduration=6.5799161680000005 podStartE2EDuration="6.579916168s" podCreationTimestamp="2026-03-12 17:08:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:08:59.570981378 +0000 UTC m=+1082.112292737" watchObservedRunningTime="2026-03-12 17:08:59.579916168 +0000 UTC m=+1082.121227517" Mar 12 17:08:59 crc kubenswrapper[5184]: I0312 17:08:59.873287 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-9n4jh" Mar 12 17:08:59 crc kubenswrapper[5184]: I0312 17:08:59.950563 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7d8d368-22c0-41a1-972c-d8f7c14db7b5-scripts\") pod \"c7d8d368-22c0-41a1-972c-d8f7c14db7b5\" (UID: \"c7d8d368-22c0-41a1-972c-d8f7c14db7b5\") " Mar 12 17:08:59 crc kubenswrapper[5184]: I0312 17:08:59.950730 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7d8d368-22c0-41a1-972c-d8f7c14db7b5-combined-ca-bundle\") pod \"c7d8d368-22c0-41a1-972c-d8f7c14db7b5\" (UID: \"c7d8d368-22c0-41a1-972c-d8f7c14db7b5\") " Mar 12 17:08:59 crc kubenswrapper[5184]: I0312 17:08:59.950856 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c7d8d368-22c0-41a1-972c-d8f7c14db7b5-fernet-keys\") pod \"c7d8d368-22c0-41a1-972c-d8f7c14db7b5\" (UID: \"c7d8d368-22c0-41a1-972c-d8f7c14db7b5\") " Mar 12 17:08:59 crc kubenswrapper[5184]: I0312 17:08:59.952323 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c7d8d368-22c0-41a1-972c-d8f7c14db7b5-credential-keys\") pod \"c7d8d368-22c0-41a1-972c-d8f7c14db7b5\" (UID: \"c7d8d368-22c0-41a1-972c-d8f7c14db7b5\") " Mar 12 17:08:59 crc kubenswrapper[5184]: I0312 17:08:59.952468 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmr42\" (UniqueName: \"kubernetes.io/projected/c7d8d368-22c0-41a1-972c-d8f7c14db7b5-kube-api-access-zmr42\") pod \"c7d8d368-22c0-41a1-972c-d8f7c14db7b5\" (UID: \"c7d8d368-22c0-41a1-972c-d8f7c14db7b5\") " Mar 12 17:08:59 crc kubenswrapper[5184]: I0312 17:08:59.952528 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/c7d8d368-22c0-41a1-972c-d8f7c14db7b5-config-data\") pod \"c7d8d368-22c0-41a1-972c-d8f7c14db7b5\" (UID: \"c7d8d368-22c0-41a1-972c-d8f7c14db7b5\") " Mar 12 17:08:59 crc kubenswrapper[5184]: I0312 17:08:59.959829 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7d8d368-22c0-41a1-972c-d8f7c14db7b5-scripts" (OuterVolumeSpecName: "scripts") pod "c7d8d368-22c0-41a1-972c-d8f7c14db7b5" (UID: "c7d8d368-22c0-41a1-972c-d8f7c14db7b5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:08:59 crc kubenswrapper[5184]: I0312 17:08:59.962836 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7d8d368-22c0-41a1-972c-d8f7c14db7b5-kube-api-access-zmr42" (OuterVolumeSpecName: "kube-api-access-zmr42") pod "c7d8d368-22c0-41a1-972c-d8f7c14db7b5" (UID: "c7d8d368-22c0-41a1-972c-d8f7c14db7b5"). InnerVolumeSpecName "kube-api-access-zmr42". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:08:59 crc kubenswrapper[5184]: I0312 17:08:59.965295 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7d8d368-22c0-41a1-972c-d8f7c14db7b5-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "c7d8d368-22c0-41a1-972c-d8f7c14db7b5" (UID: "c7d8d368-22c0-41a1-972c-d8f7c14db7b5"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:08:59 crc kubenswrapper[5184]: I0312 17:08:59.974436 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7d8d368-22c0-41a1-972c-d8f7c14db7b5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c7d8d368-22c0-41a1-972c-d8f7c14db7b5" (UID: "c7d8d368-22c0-41a1-972c-d8f7c14db7b5"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:08:59 crc kubenswrapper[5184]: I0312 17:08:59.983544 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7d8d368-22c0-41a1-972c-d8f7c14db7b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7d8d368-22c0-41a1-972c-d8f7c14db7b5" (UID: "c7d8d368-22c0-41a1-972c-d8f7c14db7b5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.006472 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7d8d368-22c0-41a1-972c-d8f7c14db7b5-config-data" (OuterVolumeSpecName: "config-data") pod "c7d8d368-22c0-41a1-972c-d8f7c14db7b5" (UID: "c7d8d368-22c0-41a1-972c-d8f7c14db7b5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.058546 5184 reconciler_common.go:299] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c7d8d368-22c0-41a1-972c-d8f7c14db7b5-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.058865 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zmr42\" (UniqueName: \"kubernetes.io/projected/c7d8d368-22c0-41a1-972c-d8f7c14db7b5-kube-api-access-zmr42\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.058882 5184 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7d8d368-22c0-41a1-972c-d8f7c14db7b5-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.058893 5184 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7d8d368-22c0-41a1-972c-d8f7c14db7b5-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 
17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.058904 5184 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7d8d368-22c0-41a1-972c-d8f7c14db7b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.058915 5184 reconciler_common.go:299] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c7d8d368-22c0-41a1-972c-d8f7c14db7b5-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.568750 5184 generic.go:358] "Generic (PLEG): container finished" podID="d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c" containerID="a4f5d3d4c700f2d9a238c75ab41843bc64c0d8f971ab4ddd7ceced4d12313fac" exitCode=0 Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.568978 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-jb66b" event={"ID":"d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c","Type":"ContainerDied","Data":"a4f5d3d4c700f2d9a238c75ab41843bc64c0d8f971ab4ddd7ceced4d12313fac"} Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.573824 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9n4jh" event={"ID":"c7d8d368-22c0-41a1-972c-d8f7c14db7b5","Type":"ContainerDied","Data":"15710ab138fee64b5890ce8cb78ef2ea1e3312861df961e0914504080e5cf951"} Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.573904 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15710ab138fee64b5890ce8cb78ef2ea1e3312861df961e0914504080e5cf951" Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.574692 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-9n4jh" Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.697706 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/keystone-698bbfd847-dsp75"] Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.699020 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7d8d368-22c0-41a1-972c-d8f7c14db7b5" containerName="keystone-bootstrap" Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.699049 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7d8d368-22c0-41a1-972c-d8f7c14db7b5" containerName="keystone-bootstrap" Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.699075 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2729f29f-9520-421a-bb45-917ca4cef6fc" containerName="dnsmasq-dns" Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.699084 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="2729f29f-9520-421a-bb45-917ca4cef6fc" containerName="dnsmasq-dns" Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.699143 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2729f29f-9520-421a-bb45-917ca4cef6fc" containerName="init" Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.699153 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="2729f29f-9520-421a-bb45-917ca4cef6fc" containerName="init" Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.699370 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="c7d8d368-22c0-41a1-972c-d8f7c14db7b5" containerName="keystone-bootstrap" Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.699414 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="2729f29f-9520-421a-bb45-917ca4cef6fc" containerName="dnsmasq-dns" Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.771866 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-698bbfd847-dsp75"] 
Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.772018 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-698bbfd847-dsp75"
Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.775322 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone-scripts\""
Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.778462 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-keystone-public-svc\""
Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.778635 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone-keystone-dockercfg-4s8pv\""
Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.778775 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone-config-data\""
Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.778909 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-keystone-internal-svc\""
Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.779019 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"keystone\""
Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.872683 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f2c8323-78ff-4f44-9741-b5564424b6c2-combined-ca-bundle\") pod \"keystone-698bbfd847-dsp75\" (UID: \"3f2c8323-78ff-4f44-9741-b5564424b6c2\") " pod="openstack/keystone-698bbfd847-dsp75"
Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.872930 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3f2c8323-78ff-4f44-9741-b5564424b6c2-credential-keys\") pod \"keystone-698bbfd847-dsp75\" (UID: \"3f2c8323-78ff-4f44-9741-b5564424b6c2\") " pod="openstack/keystone-698bbfd847-dsp75"
Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.873048 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f2c8323-78ff-4f44-9741-b5564424b6c2-config-data\") pod \"keystone-698bbfd847-dsp75\" (UID: \"3f2c8323-78ff-4f44-9741-b5564424b6c2\") " pod="openstack/keystone-698bbfd847-dsp75"
Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.873128 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3f2c8323-78ff-4f44-9741-b5564424b6c2-fernet-keys\") pod \"keystone-698bbfd847-dsp75\" (UID: \"3f2c8323-78ff-4f44-9741-b5564424b6c2\") " pod="openstack/keystone-698bbfd847-dsp75"
Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.873281 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szgq7\" (UniqueName: \"kubernetes.io/projected/3f2c8323-78ff-4f44-9741-b5564424b6c2-kube-api-access-szgq7\") pod \"keystone-698bbfd847-dsp75\" (UID: \"3f2c8323-78ff-4f44-9741-b5564424b6c2\") " pod="openstack/keystone-698bbfd847-dsp75"
Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.873357 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f2c8323-78ff-4f44-9741-b5564424b6c2-public-tls-certs\") pod \"keystone-698bbfd847-dsp75\" (UID: \"3f2c8323-78ff-4f44-9741-b5564424b6c2\") " pod="openstack/keystone-698bbfd847-dsp75"
Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.873455 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f2c8323-78ff-4f44-9741-b5564424b6c2-internal-tls-certs\") pod \"keystone-698bbfd847-dsp75\" (UID: \"3f2c8323-78ff-4f44-9741-b5564424b6c2\") " pod="openstack/keystone-698bbfd847-dsp75"
Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.873585 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f2c8323-78ff-4f44-9741-b5564424b6c2-scripts\") pod \"keystone-698bbfd847-dsp75\" (UID: \"3f2c8323-78ff-4f44-9741-b5564424b6c2\") " pod="openstack/keystone-698bbfd847-dsp75"
Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.975450 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f2c8323-78ff-4f44-9741-b5564424b6c2-scripts\") pod \"keystone-698bbfd847-dsp75\" (UID: \"3f2c8323-78ff-4f44-9741-b5564424b6c2\") " pod="openstack/keystone-698bbfd847-dsp75"
Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.975828 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f2c8323-78ff-4f44-9741-b5564424b6c2-combined-ca-bundle\") pod \"keystone-698bbfd847-dsp75\" (UID: \"3f2c8323-78ff-4f44-9741-b5564424b6c2\") " pod="openstack/keystone-698bbfd847-dsp75"
Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.975859 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3f2c8323-78ff-4f44-9741-b5564424b6c2-credential-keys\") pod \"keystone-698bbfd847-dsp75\" (UID: \"3f2c8323-78ff-4f44-9741-b5564424b6c2\") " pod="openstack/keystone-698bbfd847-dsp75"
Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.975883 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f2c8323-78ff-4f44-9741-b5564424b6c2-config-data\") pod \"keystone-698bbfd847-dsp75\" (UID: \"3f2c8323-78ff-4f44-9741-b5564424b6c2\") " pod="openstack/keystone-698bbfd847-dsp75"
Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.975897 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3f2c8323-78ff-4f44-9741-b5564424b6c2-fernet-keys\") pod \"keystone-698bbfd847-dsp75\" (UID: \"3f2c8323-78ff-4f44-9741-b5564424b6c2\") " pod="openstack/keystone-698bbfd847-dsp75"
Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.976432 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-szgq7\" (UniqueName: \"kubernetes.io/projected/3f2c8323-78ff-4f44-9741-b5564424b6c2-kube-api-access-szgq7\") pod \"keystone-698bbfd847-dsp75\" (UID: \"3f2c8323-78ff-4f44-9741-b5564424b6c2\") " pod="openstack/keystone-698bbfd847-dsp75"
Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.976803 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f2c8323-78ff-4f44-9741-b5564424b6c2-public-tls-certs\") pod \"keystone-698bbfd847-dsp75\" (UID: \"3f2c8323-78ff-4f44-9741-b5564424b6c2\") " pod="openstack/keystone-698bbfd847-dsp75"
Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.976827 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f2c8323-78ff-4f44-9741-b5564424b6c2-internal-tls-certs\") pod \"keystone-698bbfd847-dsp75\" (UID: \"3f2c8323-78ff-4f44-9741-b5564424b6c2\") " pod="openstack/keystone-698bbfd847-dsp75"
Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.983118 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3f2c8323-78ff-4f44-9741-b5564424b6c2-fernet-keys\") pod \"keystone-698bbfd847-dsp75\" (UID: \"3f2c8323-78ff-4f44-9741-b5564424b6c2\") " pod="openstack/keystone-698bbfd847-dsp75"
Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.985618 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f2c8323-78ff-4f44-9741-b5564424b6c2-internal-tls-certs\") pod \"keystone-698bbfd847-dsp75\" (UID: \"3f2c8323-78ff-4f44-9741-b5564424b6c2\") " pod="openstack/keystone-698bbfd847-dsp75"
Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.987001 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f2c8323-78ff-4f44-9741-b5564424b6c2-scripts\") pod \"keystone-698bbfd847-dsp75\" (UID: \"3f2c8323-78ff-4f44-9741-b5564424b6c2\") " pod="openstack/keystone-698bbfd847-dsp75"
Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.987020 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f2c8323-78ff-4f44-9741-b5564424b6c2-combined-ca-bundle\") pod \"keystone-698bbfd847-dsp75\" (UID: \"3f2c8323-78ff-4f44-9741-b5564424b6c2\") " pod="openstack/keystone-698bbfd847-dsp75"
Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.995100 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3f2c8323-78ff-4f44-9741-b5564424b6c2-credential-keys\") pod \"keystone-698bbfd847-dsp75\" (UID: \"3f2c8323-78ff-4f44-9741-b5564424b6c2\") " pod="openstack/keystone-698bbfd847-dsp75"
Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.995645 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f2c8323-78ff-4f44-9741-b5564424b6c2-config-data\") pod \"keystone-698bbfd847-dsp75\" (UID: \"3f2c8323-78ff-4f44-9741-b5564424b6c2\") " pod="openstack/keystone-698bbfd847-dsp75"
Mar 12 17:09:00 crc kubenswrapper[5184]: I0312 17:09:00.998906 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f2c8323-78ff-4f44-9741-b5564424b6c2-public-tls-certs\") pod \"keystone-698bbfd847-dsp75\" (UID: \"3f2c8323-78ff-4f44-9741-b5564424b6c2\") " pod="openstack/keystone-698bbfd847-dsp75"
Mar 12 17:09:01 crc kubenswrapper[5184]: I0312 17:09:01.003141 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-szgq7\" (UniqueName: \"kubernetes.io/projected/3f2c8323-78ff-4f44-9741-b5564424b6c2-kube-api-access-szgq7\") pod \"keystone-698bbfd847-dsp75\" (UID: \"3f2c8323-78ff-4f44-9741-b5564424b6c2\") " pod="openstack/keystone-698bbfd847-dsp75"
Mar 12 17:09:01 crc kubenswrapper[5184]: I0312 17:09:01.181545 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-698bbfd847-dsp75"
Mar 12 17:09:02 crc kubenswrapper[5184]: I0312 17:09:02.044739 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-698bbfd847-dsp75"]
Mar 12 17:09:02 crc kubenswrapper[5184]: I0312 17:09:02.259583 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-jb66b"
Mar 12 17:09:02 crc kubenswrapper[5184]: I0312 17:09:02.449834 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c-db-sync-config-data\") pod \"d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c\" (UID: \"d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c\") "
Mar 12 17:09:02 crc kubenswrapper[5184]: I0312 17:09:02.450235 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wntmd\" (UniqueName: \"kubernetes.io/projected/d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c-kube-api-access-wntmd\") pod \"d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c\" (UID: \"d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c\") "
Mar 12 17:09:02 crc kubenswrapper[5184]: I0312 17:09:02.450699 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c-combined-ca-bundle\") pod \"d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c\" (UID: \"d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c\") "
Mar 12 17:09:02 crc kubenswrapper[5184]: I0312 17:09:02.674856 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-jb66b" event={"ID":"d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c","Type":"ContainerDied","Data":"392ea1029e9ff9d30e6ffb02723494350983ddbdde6b1005a78691790fee798e"}
Mar 12 17:09:02 crc kubenswrapper[5184]: I0312 17:09:02.674888 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="392ea1029e9ff9d30e6ffb02723494350983ddbdde6b1005a78691790fee798e"
Mar 12 17:09:02 crc kubenswrapper[5184]: I0312 17:09:02.674948 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-jb66b"
Mar 12 17:09:02 crc kubenswrapper[5184]: I0312 17:09:02.676018 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-698bbfd847-dsp75" event={"ID":"3f2c8323-78ff-4f44-9741-b5564424b6c2","Type":"ContainerStarted","Data":"32309a6b963f6c21d848165d0d17194e6d5acd31a182634024ac68f430268d39"}
Mar 12 17:09:02 crc kubenswrapper[5184]: I0312 17:09:02.719428 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c-kube-api-access-wntmd" (OuterVolumeSpecName: "kube-api-access-wntmd") pod "d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c" (UID: "d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c"). InnerVolumeSpecName "kube-api-access-wntmd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:09:02 crc kubenswrapper[5184]: I0312 17:09:02.725209 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c" (UID: "d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 17:09:02 crc kubenswrapper[5184]: I0312 17:09:02.725521 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c" (UID: "d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 17:09:02 crc kubenswrapper[5184]: I0312 17:09:02.821567 5184 reconciler_common.go:299] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 12 17:09:02 crc kubenswrapper[5184]: I0312 17:09:02.821610 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wntmd\" (UniqueName: \"kubernetes.io/projected/d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c-kube-api-access-wntmd\") on node \"crc\" DevicePath \"\""
Mar 12 17:09:02 crc kubenswrapper[5184]: I0312 17:09:02.821625 5184 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 17:09:02 crc kubenswrapper[5184]: I0312 17:09:02.988708 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-84f8b689b8-ptbnb"]
Mar 12 17:09:02 crc kubenswrapper[5184]: I0312 17:09:02.990114 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c" containerName="barbican-db-sync"
Mar 12 17:09:02 crc kubenswrapper[5184]: I0312 17:09:02.990132 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c" containerName="barbican-db-sync"
Mar 12 17:09:02 crc kubenswrapper[5184]: I0312 17:09:02.990332 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c" containerName="barbican-db-sync"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.016366 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-84f8b689b8-ptbnb"]
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.018040 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-84f8b689b8-ptbnb"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.022112 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"barbican-keystone-listener-config-data\""
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.022901 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"barbican-barbican-dockercfg-gsb8b\""
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.023040 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"barbican-config-data\""
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.024864 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfe68657-8277-4927-9fef-88807e21461b-config-data\") pod \"barbican-keystone-listener-84f8b689b8-ptbnb\" (UID: \"dfe68657-8277-4927-9fef-88807e21461b\") " pod="openstack/barbican-keystone-listener-84f8b689b8-ptbnb"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.024909 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f5g9\" (UniqueName: \"kubernetes.io/projected/dfe68657-8277-4927-9fef-88807e21461b-kube-api-access-9f5g9\") pod \"barbican-keystone-listener-84f8b689b8-ptbnb\" (UID: \"dfe68657-8277-4927-9fef-88807e21461b\") " pod="openstack/barbican-keystone-listener-84f8b689b8-ptbnb"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.024990 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfe68657-8277-4927-9fef-88807e21461b-logs\") pod \"barbican-keystone-listener-84f8b689b8-ptbnb\" (UID: \"dfe68657-8277-4927-9fef-88807e21461b\") " pod="openstack/barbican-keystone-listener-84f8b689b8-ptbnb"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.025048 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfe68657-8277-4927-9fef-88807e21461b-combined-ca-bundle\") pod \"barbican-keystone-listener-84f8b689b8-ptbnb\" (UID: \"dfe68657-8277-4927-9fef-88807e21461b\") " pod="openstack/barbican-keystone-listener-84f8b689b8-ptbnb"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.025077 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfe68657-8277-4927-9fef-88807e21461b-config-data-custom\") pod \"barbican-keystone-listener-84f8b689b8-ptbnb\" (UID: \"dfe68657-8277-4927-9fef-88807e21461b\") " pod="openstack/barbican-keystone-listener-84f8b689b8-ptbnb"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.080297 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-574685bb47-lxzsh"]
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.087101 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-574685bb47-lxzsh"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.089749 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"barbican-worker-config-data\""
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.114852 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-574685bb47-lxzsh"]
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.143176 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55402375-344d-4d12-b750-7a3e784b3886-config-data\") pod \"barbican-worker-574685bb47-lxzsh\" (UID: \"55402375-344d-4d12-b750-7a3e784b3886\") " pod="openstack/barbican-worker-574685bb47-lxzsh"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.143256 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55402375-344d-4d12-b750-7a3e784b3886-combined-ca-bundle\") pod \"barbican-worker-574685bb47-lxzsh\" (UID: \"55402375-344d-4d12-b750-7a3e784b3886\") " pod="openstack/barbican-worker-574685bb47-lxzsh"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.143343 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfe68657-8277-4927-9fef-88807e21461b-combined-ca-bundle\") pod \"barbican-keystone-listener-84f8b689b8-ptbnb\" (UID: \"dfe68657-8277-4927-9fef-88807e21461b\") " pod="openstack/barbican-keystone-listener-84f8b689b8-ptbnb"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.143447 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfe68657-8277-4927-9fef-88807e21461b-config-data-custom\") pod \"barbican-keystone-listener-84f8b689b8-ptbnb\" (UID: \"dfe68657-8277-4927-9fef-88807e21461b\") " pod="openstack/barbican-keystone-listener-84f8b689b8-ptbnb"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.143519 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfe68657-8277-4927-9fef-88807e21461b-config-data\") pod \"barbican-keystone-listener-84f8b689b8-ptbnb\" (UID: \"dfe68657-8277-4927-9fef-88807e21461b\") " pod="openstack/barbican-keystone-listener-84f8b689b8-ptbnb"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.144196 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55402375-344d-4d12-b750-7a3e784b3886-logs\") pod \"barbican-worker-574685bb47-lxzsh\" (UID: \"55402375-344d-4d12-b750-7a3e784b3886\") " pod="openstack/barbican-worker-574685bb47-lxzsh"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.144308 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9f5g9\" (UniqueName: \"kubernetes.io/projected/dfe68657-8277-4927-9fef-88807e21461b-kube-api-access-9f5g9\") pod \"barbican-keystone-listener-84f8b689b8-ptbnb\" (UID: \"dfe68657-8277-4927-9fef-88807e21461b\") " pod="openstack/barbican-keystone-listener-84f8b689b8-ptbnb"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.144425 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55402375-344d-4d12-b750-7a3e784b3886-config-data-custom\") pod \"barbican-worker-574685bb47-lxzsh\" (UID: \"55402375-344d-4d12-b750-7a3e784b3886\") " pod="openstack/barbican-worker-574685bb47-lxzsh"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.145194 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfe68657-8277-4927-9fef-88807e21461b-logs\") pod \"barbican-keystone-listener-84f8b689b8-ptbnb\" (UID: \"dfe68657-8277-4927-9fef-88807e21461b\") " pod="openstack/barbican-keystone-listener-84f8b689b8-ptbnb"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.145233 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkb2t\" (UniqueName: \"kubernetes.io/projected/55402375-344d-4d12-b750-7a3e784b3886-kube-api-access-xkb2t\") pod \"barbican-worker-574685bb47-lxzsh\" (UID: \"55402375-344d-4d12-b750-7a3e784b3886\") " pod="openstack/barbican-worker-574685bb47-lxzsh"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.150988 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfe68657-8277-4927-9fef-88807e21461b-logs\") pod \"barbican-keystone-listener-84f8b689b8-ptbnb\" (UID: \"dfe68657-8277-4927-9fef-88807e21461b\") " pod="openstack/barbican-keystone-listener-84f8b689b8-ptbnb"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.178436 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfe68657-8277-4927-9fef-88807e21461b-config-data-custom\") pod \"barbican-keystone-listener-84f8b689b8-ptbnb\" (UID: \"dfe68657-8277-4927-9fef-88807e21461b\") " pod="openstack/barbican-keystone-listener-84f8b689b8-ptbnb"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.179604 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f5g9\" (UniqueName: \"kubernetes.io/projected/dfe68657-8277-4927-9fef-88807e21461b-kube-api-access-9f5g9\") pod \"barbican-keystone-listener-84f8b689b8-ptbnb\" (UID: \"dfe68657-8277-4927-9fef-88807e21461b\") " pod="openstack/barbican-keystone-listener-84f8b689b8-ptbnb"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.219323 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfe68657-8277-4927-9fef-88807e21461b-config-data\") pod \"barbican-keystone-listener-84f8b689b8-ptbnb\" (UID: \"dfe68657-8277-4927-9fef-88807e21461b\") " pod="openstack/barbican-keystone-listener-84f8b689b8-ptbnb"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.226491 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77cdfb9675-lcv79"]
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.244018 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfe68657-8277-4927-9fef-88807e21461b-combined-ca-bundle\") pod \"barbican-keystone-listener-84f8b689b8-ptbnb\" (UID: \"dfe68657-8277-4927-9fef-88807e21461b\") " pod="openstack/barbican-keystone-listener-84f8b689b8-ptbnb"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.252315 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55402375-344d-4d12-b750-7a3e784b3886-config-data-custom\") pod \"barbican-worker-574685bb47-lxzsh\" (UID: \"55402375-344d-4d12-b750-7a3e784b3886\") " pod="openstack/barbican-worker-574685bb47-lxzsh"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.254812 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xkb2t\" (UniqueName: \"kubernetes.io/projected/55402375-344d-4d12-b750-7a3e784b3886-kube-api-access-xkb2t\") pod \"barbican-worker-574685bb47-lxzsh\" (UID: \"55402375-344d-4d12-b750-7a3e784b3886\") " pod="openstack/barbican-worker-574685bb47-lxzsh"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.255081 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55402375-344d-4d12-b750-7a3e784b3886-config-data\") pod \"barbican-worker-574685bb47-lxzsh\" (UID: \"55402375-344d-4d12-b750-7a3e784b3886\") " pod="openstack/barbican-worker-574685bb47-lxzsh"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.255216 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55402375-344d-4d12-b750-7a3e784b3886-combined-ca-bundle\") pod \"barbican-worker-574685bb47-lxzsh\" (UID: \"55402375-344d-4d12-b750-7a3e784b3886\") " pod="openstack/barbican-worker-574685bb47-lxzsh"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.255499 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55402375-344d-4d12-b750-7a3e784b3886-logs\") pod \"barbican-worker-574685bb47-lxzsh\" (UID: \"55402375-344d-4d12-b750-7a3e784b3886\") " pod="openstack/barbican-worker-574685bb47-lxzsh"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.256155 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55402375-344d-4d12-b750-7a3e784b3886-logs\") pod \"barbican-worker-574685bb47-lxzsh\" (UID: \"55402375-344d-4d12-b750-7a3e784b3886\") " pod="openstack/barbican-worker-574685bb47-lxzsh"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.273150 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55402375-344d-4d12-b750-7a3e784b3886-config-data\") pod \"barbican-worker-574685bb47-lxzsh\" (UID: \"55402375-344d-4d12-b750-7a3e784b3886\") " pod="openstack/barbican-worker-574685bb47-lxzsh"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.283779 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55402375-344d-4d12-b750-7a3e784b3886-config-data-custom\") pod \"barbican-worker-574685bb47-lxzsh\" (UID: \"55402375-344d-4d12-b750-7a3e784b3886\") " pod="openstack/barbican-worker-574685bb47-lxzsh"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.287162 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55402375-344d-4d12-b750-7a3e784b3886-combined-ca-bundle\") pod \"barbican-worker-574685bb47-lxzsh\" (UID: \"55402375-344d-4d12-b750-7a3e784b3886\") " pod="openstack/barbican-worker-574685bb47-lxzsh"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.300972 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkb2t\" (UniqueName: \"kubernetes.io/projected/55402375-344d-4d12-b750-7a3e784b3886-kube-api-access-xkb2t\") pod \"barbican-worker-574685bb47-lxzsh\" (UID: \"55402375-344d-4d12-b750-7a3e784b3886\") " pod="openstack/barbican-worker-574685bb47-lxzsh"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.301191 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77cdfb9675-lcv79"]
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.302910 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77cdfb9675-lcv79"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.331225 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5865f6c4b6-r8frb"]
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.367917 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da615b33-3bf7-4b28-b95f-5c45f00679cb-config\") pod \"dnsmasq-dns-77cdfb9675-lcv79\" (UID: \"da615b33-3bf7-4b28-b95f-5c45f00679cb\") " pod="openstack/dnsmasq-dns-77cdfb9675-lcv79"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.368006 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da615b33-3bf7-4b28-b95f-5c45f00679cb-ovsdbserver-sb\") pod \"dnsmasq-dns-77cdfb9675-lcv79\" (UID: \"da615b33-3bf7-4b28-b95f-5c45f00679cb\") " pod="openstack/dnsmasq-dns-77cdfb9675-lcv79"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.368050 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da615b33-3bf7-4b28-b95f-5c45f00679cb-ovsdbserver-nb\") pod \"dnsmasq-dns-77cdfb9675-lcv79\" (UID: \"da615b33-3bf7-4b28-b95f-5c45f00679cb\") " pod="openstack/dnsmasq-dns-77cdfb9675-lcv79"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.368081 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2h5j\" (UniqueName: \"kubernetes.io/projected/da615b33-3bf7-4b28-b95f-5c45f00679cb-kube-api-access-l2h5j\") pod \"dnsmasq-dns-77cdfb9675-lcv79\" (UID: \"da615b33-3bf7-4b28-b95f-5c45f00679cb\") " pod="openstack/dnsmasq-dns-77cdfb9675-lcv79"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.368125 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da615b33-3bf7-4b28-b95f-5c45f00679cb-dns-swift-storage-0\") pod \"dnsmasq-dns-77cdfb9675-lcv79\" (UID: \"da615b33-3bf7-4b28-b95f-5c45f00679cb\") " pod="openstack/dnsmasq-dns-77cdfb9675-lcv79"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.368196 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da615b33-3bf7-4b28-b95f-5c45f00679cb-dns-svc\") pod \"dnsmasq-dns-77cdfb9675-lcv79\" (UID: \"da615b33-3bf7-4b28-b95f-5c45f00679cb\") " pod="openstack/dnsmasq-dns-77cdfb9675-lcv79"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.378856 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5865f6c4b6-r8frb"]
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.379050 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5865f6c4b6-r8frb"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.386083 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"barbican-api-config-data\""
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.395472 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-84f8b689b8-ptbnb"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.409771 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-574685bb47-lxzsh"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.472681 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da615b33-3bf7-4b28-b95f-5c45f00679cb-config\") pod \"dnsmasq-dns-77cdfb9675-lcv79\" (UID: \"da615b33-3bf7-4b28-b95f-5c45f00679cb\") " pod="openstack/dnsmasq-dns-77cdfb9675-lcv79"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.473248 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0ce447b-3a61-4c39-9f58-12985dbdb754-combined-ca-bundle\") pod \"barbican-api-5865f6c4b6-r8frb\" (UID: \"f0ce447b-3a61-4c39-9f58-12985dbdb754\") " pod="openstack/barbican-api-5865f6c4b6-r8frb"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.473329 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da615b33-3bf7-4b28-b95f-5c45f00679cb-ovsdbserver-sb\") pod \"dnsmasq-dns-77cdfb9675-lcv79\" (UID: \"da615b33-3bf7-4b28-b95f-5c45f00679cb\") " pod="openstack/dnsmasq-dns-77cdfb9675-lcv79"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.473363 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da615b33-3bf7-4b28-b95f-5c45f00679cb-ovsdbserver-nb\") pod \"dnsmasq-dns-77cdfb9675-lcv79\" (UID: \"da615b33-3bf7-4b28-b95f-5c45f00679cb\") " pod="openstack/dnsmasq-dns-77cdfb9675-lcv79"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.473396 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l2h5j\" (UniqueName: \"kubernetes.io/projected/da615b33-3bf7-4b28-b95f-5c45f00679cb-kube-api-access-l2h5j\") pod \"dnsmasq-dns-77cdfb9675-lcv79\" (UID: \"da615b33-3bf7-4b28-b95f-5c45f00679cb\") " pod="openstack/dnsmasq-dns-77cdfb9675-lcv79"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.473419 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0ce447b-3a61-4c39-9f58-12985dbdb754-logs\") pod \"barbican-api-5865f6c4b6-r8frb\" (UID: \"f0ce447b-3a61-4c39-9f58-12985dbdb754\") " pod="openstack/barbican-api-5865f6c4b6-r8frb"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.473455 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da615b33-3bf7-4b28-b95f-5c45f00679cb-dns-swift-storage-0\") pod \"dnsmasq-dns-77cdfb9675-lcv79\" (UID: \"da615b33-3bf7-4b28-b95f-5c45f00679cb\") " pod="openstack/dnsmasq-dns-77cdfb9675-lcv79"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.473485 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0ce447b-3a61-4c39-9f58-12985dbdb754-config-data\") pod \"barbican-api-5865f6c4b6-r8frb\" (UID: \"f0ce447b-3a61-4c39-9f58-12985dbdb754\") " pod="openstack/barbican-api-5865f6c4b6-r8frb"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.473517 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgt9x\" (UniqueName: \"kubernetes.io/projected/f0ce447b-3a61-4c39-9f58-12985dbdb754-kube-api-access-bgt9x\") pod \"barbican-api-5865f6c4b6-r8frb\" (UID: \"f0ce447b-3a61-4c39-9f58-12985dbdb754\") " pod="openstack/barbican-api-5865f6c4b6-r8frb"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.474230 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da615b33-3bf7-4b28-b95f-5c45f00679cb-config\") pod \"dnsmasq-dns-77cdfb9675-lcv79\" (UID: \"da615b33-3bf7-4b28-b95f-5c45f00679cb\") " pod="openstack/dnsmasq-dns-77cdfb9675-lcv79"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.474444 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da615b33-3bf7-4b28-b95f-5c45f00679cb-dns-swift-storage-0\") pod \"dnsmasq-dns-77cdfb9675-lcv79\" (UID: \"da615b33-3bf7-4b28-b95f-5c45f00679cb\") " pod="openstack/dnsmasq-dns-77cdfb9675-lcv79"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.474684 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da615b33-3bf7-4b28-b95f-5c45f00679cb-dns-svc\") pod \"dnsmasq-dns-77cdfb9675-lcv79\" (UID: \"da615b33-3bf7-4b28-b95f-5c45f00679cb\") " pod="openstack/dnsmasq-dns-77cdfb9675-lcv79"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.474704 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f0ce447b-3a61-4c39-9f58-12985dbdb754-config-data-custom\") pod \"barbican-api-5865f6c4b6-r8frb\" (UID: \"f0ce447b-3a61-4c39-9f58-12985dbdb754\") " pod="openstack/barbican-api-5865f6c4b6-r8frb"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.474880 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da615b33-3bf7-4b28-b95f-5c45f00679cb-ovsdbserver-sb\") pod \"dnsmasq-dns-77cdfb9675-lcv79\" (UID: \"da615b33-3bf7-4b28-b95f-5c45f00679cb\") " pod="openstack/dnsmasq-dns-77cdfb9675-lcv79"
Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.476128 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da615b33-3bf7-4b28-b95f-5c45f00679cb-ovsdbserver-nb\") pod \"dnsmasq-dns-77cdfb9675-lcv79\" (UID: \"da615b33-3bf7-4b28-b95f-5c45f00679cb\") " pod="openstack/dnsmasq-dns-77cdfb9675-lcv79"
Mar 12
17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.487646 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da615b33-3bf7-4b28-b95f-5c45f00679cb-dns-svc\") pod \"dnsmasq-dns-77cdfb9675-lcv79\" (UID: \"da615b33-3bf7-4b28-b95f-5c45f00679cb\") " pod="openstack/dnsmasq-dns-77cdfb9675-lcv79" Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.512053 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2h5j\" (UniqueName: \"kubernetes.io/projected/da615b33-3bf7-4b28-b95f-5c45f00679cb-kube-api-access-l2h5j\") pod \"dnsmasq-dns-77cdfb9675-lcv79\" (UID: \"da615b33-3bf7-4b28-b95f-5c45f00679cb\") " pod="openstack/dnsmasq-dns-77cdfb9675-lcv79" Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.576904 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0ce447b-3a61-4c39-9f58-12985dbdb754-combined-ca-bundle\") pod \"barbican-api-5865f6c4b6-r8frb\" (UID: \"f0ce447b-3a61-4c39-9f58-12985dbdb754\") " pod="openstack/barbican-api-5865f6c4b6-r8frb" Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.577043 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0ce447b-3a61-4c39-9f58-12985dbdb754-logs\") pod \"barbican-api-5865f6c4b6-r8frb\" (UID: \"f0ce447b-3a61-4c39-9f58-12985dbdb754\") " pod="openstack/barbican-api-5865f6c4b6-r8frb" Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.577083 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0ce447b-3a61-4c39-9f58-12985dbdb754-config-data\") pod \"barbican-api-5865f6c4b6-r8frb\" (UID: \"f0ce447b-3a61-4c39-9f58-12985dbdb754\") " pod="openstack/barbican-api-5865f6c4b6-r8frb" Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.577123 5184 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bgt9x\" (UniqueName: \"kubernetes.io/projected/f0ce447b-3a61-4c39-9f58-12985dbdb754-kube-api-access-bgt9x\") pod \"barbican-api-5865f6c4b6-r8frb\" (UID: \"f0ce447b-3a61-4c39-9f58-12985dbdb754\") " pod="openstack/barbican-api-5865f6c4b6-r8frb" Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.577157 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f0ce447b-3a61-4c39-9f58-12985dbdb754-config-data-custom\") pod \"barbican-api-5865f6c4b6-r8frb\" (UID: \"f0ce447b-3a61-4c39-9f58-12985dbdb754\") " pod="openstack/barbican-api-5865f6c4b6-r8frb" Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.578265 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0ce447b-3a61-4c39-9f58-12985dbdb754-logs\") pod \"barbican-api-5865f6c4b6-r8frb\" (UID: \"f0ce447b-3a61-4c39-9f58-12985dbdb754\") " pod="openstack/barbican-api-5865f6c4b6-r8frb" Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.582587 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0ce447b-3a61-4c39-9f58-12985dbdb754-config-data\") pod \"barbican-api-5865f6c4b6-r8frb\" (UID: \"f0ce447b-3a61-4c39-9f58-12985dbdb754\") " pod="openstack/barbican-api-5865f6c4b6-r8frb" Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.594608 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f0ce447b-3a61-4c39-9f58-12985dbdb754-config-data-custom\") pod \"barbican-api-5865f6c4b6-r8frb\" (UID: \"f0ce447b-3a61-4c39-9f58-12985dbdb754\") " pod="openstack/barbican-api-5865f6c4b6-r8frb" Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.601008 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/f0ce447b-3a61-4c39-9f58-12985dbdb754-combined-ca-bundle\") pod \"barbican-api-5865f6c4b6-r8frb\" (UID: \"f0ce447b-3a61-4c39-9f58-12985dbdb754\") " pod="openstack/barbican-api-5865f6c4b6-r8frb" Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.619220 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgt9x\" (UniqueName: \"kubernetes.io/projected/f0ce447b-3a61-4c39-9f58-12985dbdb754-kube-api-access-bgt9x\") pod \"barbican-api-5865f6c4b6-r8frb\" (UID: \"f0ce447b-3a61-4c39-9f58-12985dbdb754\") " pod="openstack/barbican-api-5865f6c4b6-r8frb" Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.731042 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77cdfb9675-lcv79" Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.733828 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-698bbfd847-dsp75" event={"ID":"3f2c8323-78ff-4f44-9741-b5564424b6c2","Type":"ContainerStarted","Data":"98caf84b33f5b867f9730822572387c2f8ec747b56a9373999012f4b2c706888"} Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.739495 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/keystone-698bbfd847-dsp75" Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.741307 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5865f6c4b6-r8frb" Mar 12 17:09:03 crc kubenswrapper[5184]: I0312 17:09:03.788248 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-698bbfd847-dsp75" podStartSLOduration=3.788225067 podStartE2EDuration="3.788225067s" podCreationTimestamp="2026-03-12 17:09:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:09:03.759869796 +0000 UTC m=+1086.301181145" watchObservedRunningTime="2026-03-12 17:09:03.788225067 +0000 UTC m=+1086.329536406" Mar 12 17:09:04 crc kubenswrapper[5184]: I0312 17:09:04.196507 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-574685bb47-lxzsh"] Mar 12 17:09:04 crc kubenswrapper[5184]: I0312 17:09:04.217998 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-84f8b689b8-ptbnb"] Mar 12 17:09:04 crc kubenswrapper[5184]: I0312 17:09:04.657188 5184 scope.go:117] "RemoveContainer" containerID="af482bb2174d12ea7babb868743ac582cb8e3bc181395e34bde3fe0d1712213f" Mar 12 17:09:05 crc kubenswrapper[5184]: I0312 17:09:05.009183 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 12 17:09:05 crc kubenswrapper[5184]: I0312 17:09:05.010504 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 12 17:09:05 crc kubenswrapper[5184]: I0312 17:09:05.010618 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 12 17:09:05 crc kubenswrapper[5184]: I0312 17:09:05.011568 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 12 17:09:05 crc kubenswrapper[5184]: I0312 17:09:05.076486 5184 kubelet.go:2658] "SyncLoop 
(probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 12 17:09:05 crc kubenswrapper[5184]: I0312 17:09:05.077735 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 12 17:09:05 crc kubenswrapper[5184]: I0312 17:09:05.079438 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 12 17:09:05 crc kubenswrapper[5184]: I0312 17:09:05.106485 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 12 17:09:05 crc kubenswrapper[5184]: I0312 17:09:05.787660 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/glance-default-external-api-0" Mar 12 17:09:05 crc kubenswrapper[5184]: I0312 17:09:05.788003 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/glance-default-internal-api-0" Mar 12 17:09:05 crc kubenswrapper[5184]: I0312 17:09:05.788016 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/glance-default-internal-api-0" Mar 12 17:09:05 crc kubenswrapper[5184]: I0312 17:09:05.788026 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/glance-default-external-api-0" Mar 12 17:09:05 crc kubenswrapper[5184]: I0312 17:09:05.812176 5184 prober.go:120] "Probe failed" probeType="Startup" pod="openstack/horizon-859ddbd78-2m2xk" podUID="ccf562d2-6ce1-4eb6-b27e-679493ce3870" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Mar 12 17:09:06 crc kubenswrapper[5184]: I0312 17:09:06.234535 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-66c64686b6-kwvcj"] Mar 12 17:09:06 crc kubenswrapper[5184]: I0312 17:09:06.253662 
5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-66c64686b6-kwvcj" Mar 12 17:09:06 crc kubenswrapper[5184]: I0312 17:09:06.259814 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-barbican-internal-svc\"" Mar 12 17:09:06 crc kubenswrapper[5184]: I0312 17:09:06.259873 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-barbican-public-svc\"" Mar 12 17:09:06 crc kubenswrapper[5184]: I0312 17:09:06.312130 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-66c64686b6-kwvcj"] Mar 12 17:09:06 crc kubenswrapper[5184]: I0312 17:09:06.352431 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/916be5af-0ed0-4d16-a8b8-2d01b7b81dab-config-data\") pod \"barbican-api-66c64686b6-kwvcj\" (UID: \"916be5af-0ed0-4d16-a8b8-2d01b7b81dab\") " pod="openstack/barbican-api-66c64686b6-kwvcj" Mar 12 17:09:06 crc kubenswrapper[5184]: I0312 17:09:06.352515 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/916be5af-0ed0-4d16-a8b8-2d01b7b81dab-internal-tls-certs\") pod \"barbican-api-66c64686b6-kwvcj\" (UID: \"916be5af-0ed0-4d16-a8b8-2d01b7b81dab\") " pod="openstack/barbican-api-66c64686b6-kwvcj" Mar 12 17:09:06 crc kubenswrapper[5184]: I0312 17:09:06.352641 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/916be5af-0ed0-4d16-a8b8-2d01b7b81dab-config-data-custom\") pod \"barbican-api-66c64686b6-kwvcj\" (UID: \"916be5af-0ed0-4d16-a8b8-2d01b7b81dab\") " pod="openstack/barbican-api-66c64686b6-kwvcj" Mar 12 17:09:06 crc kubenswrapper[5184]: I0312 17:09:06.352775 5184 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/916be5af-0ed0-4d16-a8b8-2d01b7b81dab-combined-ca-bundle\") pod \"barbican-api-66c64686b6-kwvcj\" (UID: \"916be5af-0ed0-4d16-a8b8-2d01b7b81dab\") " pod="openstack/barbican-api-66c64686b6-kwvcj" Mar 12 17:09:06 crc kubenswrapper[5184]: I0312 17:09:06.352866 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmt5t\" (UniqueName: \"kubernetes.io/projected/916be5af-0ed0-4d16-a8b8-2d01b7b81dab-kube-api-access-pmt5t\") pod \"barbican-api-66c64686b6-kwvcj\" (UID: \"916be5af-0ed0-4d16-a8b8-2d01b7b81dab\") " pod="openstack/barbican-api-66c64686b6-kwvcj" Mar 12 17:09:06 crc kubenswrapper[5184]: I0312 17:09:06.353008 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/916be5af-0ed0-4d16-a8b8-2d01b7b81dab-logs\") pod \"barbican-api-66c64686b6-kwvcj\" (UID: \"916be5af-0ed0-4d16-a8b8-2d01b7b81dab\") " pod="openstack/barbican-api-66c64686b6-kwvcj" Mar 12 17:09:06 crc kubenswrapper[5184]: I0312 17:09:06.353040 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/916be5af-0ed0-4d16-a8b8-2d01b7b81dab-public-tls-certs\") pod \"barbican-api-66c64686b6-kwvcj\" (UID: \"916be5af-0ed0-4d16-a8b8-2d01b7b81dab\") " pod="openstack/barbican-api-66c64686b6-kwvcj" Mar 12 17:09:06 crc kubenswrapper[5184]: I0312 17:09:06.442012 5184 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-dplgw" podUID="d3f7d154-f90e-4731-bc04-00b13b3fbfd8" containerName="registry-server" probeResult="failure" output=< Mar 12 17:09:06 crc kubenswrapper[5184]: timeout: failed to connect service ":50051" within 1s Mar 12 17:09:06 crc kubenswrapper[5184]: > Mar 12 17:09:06 crc 
kubenswrapper[5184]: I0312 17:09:06.455636 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/916be5af-0ed0-4d16-a8b8-2d01b7b81dab-combined-ca-bundle\") pod \"barbican-api-66c64686b6-kwvcj\" (UID: \"916be5af-0ed0-4d16-a8b8-2d01b7b81dab\") " pod="openstack/barbican-api-66c64686b6-kwvcj" Mar 12 17:09:06 crc kubenswrapper[5184]: I0312 17:09:06.457055 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pmt5t\" (UniqueName: \"kubernetes.io/projected/916be5af-0ed0-4d16-a8b8-2d01b7b81dab-kube-api-access-pmt5t\") pod \"barbican-api-66c64686b6-kwvcj\" (UID: \"916be5af-0ed0-4d16-a8b8-2d01b7b81dab\") " pod="openstack/barbican-api-66c64686b6-kwvcj" Mar 12 17:09:06 crc kubenswrapper[5184]: I0312 17:09:06.457235 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/916be5af-0ed0-4d16-a8b8-2d01b7b81dab-logs\") pod \"barbican-api-66c64686b6-kwvcj\" (UID: \"916be5af-0ed0-4d16-a8b8-2d01b7b81dab\") " pod="openstack/barbican-api-66c64686b6-kwvcj" Mar 12 17:09:06 crc kubenswrapper[5184]: I0312 17:09:06.457268 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/916be5af-0ed0-4d16-a8b8-2d01b7b81dab-public-tls-certs\") pod \"barbican-api-66c64686b6-kwvcj\" (UID: \"916be5af-0ed0-4d16-a8b8-2d01b7b81dab\") " pod="openstack/barbican-api-66c64686b6-kwvcj" Mar 12 17:09:06 crc kubenswrapper[5184]: I0312 17:09:06.457444 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/916be5af-0ed0-4d16-a8b8-2d01b7b81dab-config-data\") pod \"barbican-api-66c64686b6-kwvcj\" (UID: \"916be5af-0ed0-4d16-a8b8-2d01b7b81dab\") " pod="openstack/barbican-api-66c64686b6-kwvcj" Mar 12 17:09:06 crc kubenswrapper[5184]: I0312 17:09:06.457465 5184 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/916be5af-0ed0-4d16-a8b8-2d01b7b81dab-internal-tls-certs\") pod \"barbican-api-66c64686b6-kwvcj\" (UID: \"916be5af-0ed0-4d16-a8b8-2d01b7b81dab\") " pod="openstack/barbican-api-66c64686b6-kwvcj" Mar 12 17:09:06 crc kubenswrapper[5184]: I0312 17:09:06.457560 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/916be5af-0ed0-4d16-a8b8-2d01b7b81dab-config-data-custom\") pod \"barbican-api-66c64686b6-kwvcj\" (UID: \"916be5af-0ed0-4d16-a8b8-2d01b7b81dab\") " pod="openstack/barbican-api-66c64686b6-kwvcj" Mar 12 17:09:06 crc kubenswrapper[5184]: I0312 17:09:06.459790 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/916be5af-0ed0-4d16-a8b8-2d01b7b81dab-logs\") pod \"barbican-api-66c64686b6-kwvcj\" (UID: \"916be5af-0ed0-4d16-a8b8-2d01b7b81dab\") " pod="openstack/barbican-api-66c64686b6-kwvcj" Mar 12 17:09:06 crc kubenswrapper[5184]: I0312 17:09:06.468564 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/916be5af-0ed0-4d16-a8b8-2d01b7b81dab-internal-tls-certs\") pod \"barbican-api-66c64686b6-kwvcj\" (UID: \"916be5af-0ed0-4d16-a8b8-2d01b7b81dab\") " pod="openstack/barbican-api-66c64686b6-kwvcj" Mar 12 17:09:06 crc kubenswrapper[5184]: I0312 17:09:06.473709 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/916be5af-0ed0-4d16-a8b8-2d01b7b81dab-combined-ca-bundle\") pod \"barbican-api-66c64686b6-kwvcj\" (UID: \"916be5af-0ed0-4d16-a8b8-2d01b7b81dab\") " pod="openstack/barbican-api-66c64686b6-kwvcj" Mar 12 17:09:06 crc kubenswrapper[5184]: I0312 17:09:06.475077 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-pmt5t\" (UniqueName: \"kubernetes.io/projected/916be5af-0ed0-4d16-a8b8-2d01b7b81dab-kube-api-access-pmt5t\") pod \"barbican-api-66c64686b6-kwvcj\" (UID: \"916be5af-0ed0-4d16-a8b8-2d01b7b81dab\") " pod="openstack/barbican-api-66c64686b6-kwvcj" Mar 12 17:09:06 crc kubenswrapper[5184]: I0312 17:09:06.475332 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/916be5af-0ed0-4d16-a8b8-2d01b7b81dab-config-data-custom\") pod \"barbican-api-66c64686b6-kwvcj\" (UID: \"916be5af-0ed0-4d16-a8b8-2d01b7b81dab\") " pod="openstack/barbican-api-66c64686b6-kwvcj" Mar 12 17:09:06 crc kubenswrapper[5184]: I0312 17:09:06.476936 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/916be5af-0ed0-4d16-a8b8-2d01b7b81dab-public-tls-certs\") pod \"barbican-api-66c64686b6-kwvcj\" (UID: \"916be5af-0ed0-4d16-a8b8-2d01b7b81dab\") " pod="openstack/barbican-api-66c64686b6-kwvcj" Mar 12 17:09:06 crc kubenswrapper[5184]: I0312 17:09:06.487348 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/916be5af-0ed0-4d16-a8b8-2d01b7b81dab-config-data\") pod \"barbican-api-66c64686b6-kwvcj\" (UID: \"916be5af-0ed0-4d16-a8b8-2d01b7b81dab\") " pod="openstack/barbican-api-66c64686b6-kwvcj" Mar 12 17:09:06 crc kubenswrapper[5184]: I0312 17:09:06.612129 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-66c64686b6-kwvcj" Mar 12 17:09:06 crc kubenswrapper[5184]: I0312 17:09:06.798477 5184 generic.go:358] "Generic (PLEG): container finished" podID="7ed3fdf4-b869-4ef8-b746-afe1e52fe286" containerID="45049197c84547cb9b08115ad2142cab8b478837d30511a82dba357efd5b8ef0" exitCode=0 Mar 12 17:09:06 crc kubenswrapper[5184]: I0312 17:09:06.798572 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-gzjgn" event={"ID":"7ed3fdf4-b869-4ef8-b746-afe1e52fe286","Type":"ContainerDied","Data":"45049197c84547cb9b08115ad2142cab8b478837d30511a82dba357efd5b8ef0"} Mar 12 17:09:07 crc kubenswrapper[5184]: I0312 17:09:07.549267 5184 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-z62x6" podUID="3521e399-e317-459a-badc-0b4695197ac0" containerName="registry-server" probeResult="failure" output=< Mar 12 17:09:07 crc kubenswrapper[5184]: timeout: failed to connect service ":50051" within 1s Mar 12 17:09:07 crc kubenswrapper[5184]: > Mar 12 17:09:07 crc kubenswrapper[5184]: I0312 17:09:07.808962 5184 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 17:09:07 crc kubenswrapper[5184]: I0312 17:09:07.809246 5184 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 17:09:09 crc kubenswrapper[5184]: I0312 17:09:09.759760 5184 prober.go:120] "Probe failed" probeType="Startup" pod="openstack/horizon-7cd5c99b94-hgvbf" podUID="a97bd24c-a292-45a7-af77-526fb65b807d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 12 17:09:11 crc kubenswrapper[5184]: W0312 17:09:11.798980 5184 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfe68657_8277_4927_9fef_88807e21461b.slice/crio-39a82098da5cee069fe65175427cdaf94662cff56a6530b433025c8a63fb4a18 WatchSource:0}: Error finding container 39a82098da5cee069fe65175427cdaf94662cff56a6530b433025c8a63fb4a18: Status 404 returned error can't find the container with id 39a82098da5cee069fe65175427cdaf94662cff56a6530b433025c8a63fb4a18 Mar 12 17:09:11 crc kubenswrapper[5184]: W0312 17:09:11.818531 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55402375_344d_4d12_b750_7a3e784b3886.slice/crio-6e51823eeed79578b50bd1f746be77a50d39c95f73fa6e827dd5de261fae2096 WatchSource:0}: Error finding container 6e51823eeed79578b50bd1f746be77a50d39c95f73fa6e827dd5de261fae2096: Status 404 returned error can't find the container with id 6e51823eeed79578b50bd1f746be77a50d39c95f73fa6e827dd5de261fae2096 Mar 12 17:09:11 crc kubenswrapper[5184]: I0312 17:09:11.838912 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 12 17:09:11 crc kubenswrapper[5184]: I0312 17:09:11.839268 5184 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 17:09:11 crc kubenswrapper[5184]: I0312 17:09:11.845073 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 12 17:09:11 crc kubenswrapper[5184]: I0312 17:09:11.881983 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 12 17:09:11 crc kubenswrapper[5184]: I0312 17:09:11.882119 5184 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 17:09:11 crc kubenswrapper[5184]: I0312 17:09:11.882672 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 12 17:09:11 crc 
kubenswrapper[5184]: I0312 17:09:11.885905 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-84f8b689b8-ptbnb" event={"ID":"dfe68657-8277-4927-9fef-88807e21461b","Type":"ContainerStarted","Data":"39a82098da5cee069fe65175427cdaf94662cff56a6530b433025c8a63fb4a18"} Mar 12 17:09:11 crc kubenswrapper[5184]: I0312 17:09:11.892790 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-574685bb47-lxzsh" event={"ID":"55402375-344d-4d12-b750-7a3e784b3886","Type":"ContainerStarted","Data":"6e51823eeed79578b50bd1f746be77a50d39c95f73fa6e827dd5de261fae2096"} Mar 12 17:09:11 crc kubenswrapper[5184]: I0312 17:09:11.939865 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-gzjgn" Mar 12 17:09:11 crc kubenswrapper[5184]: I0312 17:09:11.947649 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-gzjgn" event={"ID":"7ed3fdf4-b869-4ef8-b746-afe1e52fe286","Type":"ContainerDied","Data":"9e1f2f35c99cea5d95fefacc3828695859439226470ffb365ced413e6f507687"} Mar 12 17:09:11 crc kubenswrapper[5184]: I0312 17:09:11.947685 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e1f2f35c99cea5d95fefacc3828695859439226470ffb365ced413e6f507687" Mar 12 17:09:11 crc kubenswrapper[5184]: I0312 17:09:11.986208 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ed3fdf4-b869-4ef8-b746-afe1e52fe286-config-data\") pod \"7ed3fdf4-b869-4ef8-b746-afe1e52fe286\" (UID: \"7ed3fdf4-b869-4ef8-b746-afe1e52fe286\") " Mar 12 17:09:11 crc kubenswrapper[5184]: I0312 17:09:11.986259 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed3fdf4-b869-4ef8-b746-afe1e52fe286-combined-ca-bundle\") pod \"7ed3fdf4-b869-4ef8-b746-afe1e52fe286\" (UID: 
\"7ed3fdf4-b869-4ef8-b746-afe1e52fe286\") " Mar 12 17:09:11 crc kubenswrapper[5184]: I0312 17:09:11.986446 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ed3fdf4-b869-4ef8-b746-afe1e52fe286-etc-machine-id\") pod \"7ed3fdf4-b869-4ef8-b746-afe1e52fe286\" (UID: \"7ed3fdf4-b869-4ef8-b746-afe1e52fe286\") " Mar 12 17:09:11 crc kubenswrapper[5184]: I0312 17:09:11.986487 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ed3fdf4-b869-4ef8-b746-afe1e52fe286-scripts\") pod \"7ed3fdf4-b869-4ef8-b746-afe1e52fe286\" (UID: \"7ed3fdf4-b869-4ef8-b746-afe1e52fe286\") " Mar 12 17:09:11 crc kubenswrapper[5184]: I0312 17:09:11.986667 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8mz5\" (UniqueName: \"kubernetes.io/projected/7ed3fdf4-b869-4ef8-b746-afe1e52fe286-kube-api-access-s8mz5\") pod \"7ed3fdf4-b869-4ef8-b746-afe1e52fe286\" (UID: \"7ed3fdf4-b869-4ef8-b746-afe1e52fe286\") " Mar 12 17:09:11 crc kubenswrapper[5184]: I0312 17:09:11.986740 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7ed3fdf4-b869-4ef8-b746-afe1e52fe286-db-sync-config-data\") pod \"7ed3fdf4-b869-4ef8-b746-afe1e52fe286\" (UID: \"7ed3fdf4-b869-4ef8-b746-afe1e52fe286\") " Mar 12 17:09:11 crc kubenswrapper[5184]: I0312 17:09:11.996956 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ed3fdf4-b869-4ef8-b746-afe1e52fe286-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7ed3fdf4-b869-4ef8-b746-afe1e52fe286" (UID: "7ed3fdf4-b869-4ef8-b746-afe1e52fe286"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 12 17:09:12 crc kubenswrapper[5184]: I0312 17:09:12.016021 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ed3fdf4-b869-4ef8-b746-afe1e52fe286-scripts" (OuterVolumeSpecName: "scripts") pod "7ed3fdf4-b869-4ef8-b746-afe1e52fe286" (UID: "7ed3fdf4-b869-4ef8-b746-afe1e52fe286"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:12 crc kubenswrapper[5184]: I0312 17:09:12.026053 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ed3fdf4-b869-4ef8-b746-afe1e52fe286-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7ed3fdf4-b869-4ef8-b746-afe1e52fe286" (UID: "7ed3fdf4-b869-4ef8-b746-afe1e52fe286"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:12 crc kubenswrapper[5184]: I0312 17:09:12.026226 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ed3fdf4-b869-4ef8-b746-afe1e52fe286-kube-api-access-s8mz5" (OuterVolumeSpecName: "kube-api-access-s8mz5") pod "7ed3fdf4-b869-4ef8-b746-afe1e52fe286" (UID: "7ed3fdf4-b869-4ef8-b746-afe1e52fe286"). InnerVolumeSpecName "kube-api-access-s8mz5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:09:12 crc kubenswrapper[5184]: I0312 17:09:12.046572 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ed3fdf4-b869-4ef8-b746-afe1e52fe286-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ed3fdf4-b869-4ef8-b746-afe1e52fe286" (UID: "7ed3fdf4-b869-4ef8-b746-afe1e52fe286"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:12 crc kubenswrapper[5184]: I0312 17:09:12.089409 5184 reconciler_common.go:299] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ed3fdf4-b869-4ef8-b746-afe1e52fe286-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:12 crc kubenswrapper[5184]: I0312 17:09:12.089473 5184 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ed3fdf4-b869-4ef8-b746-afe1e52fe286-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:12 crc kubenswrapper[5184]: I0312 17:09:12.089487 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s8mz5\" (UniqueName: \"kubernetes.io/projected/7ed3fdf4-b869-4ef8-b746-afe1e52fe286-kube-api-access-s8mz5\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:12 crc kubenswrapper[5184]: I0312 17:09:12.089500 5184 reconciler_common.go:299] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7ed3fdf4-b869-4ef8-b746-afe1e52fe286-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:12 crc kubenswrapper[5184]: I0312 17:09:12.089509 5184 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed3fdf4-b869-4ef8-b746-afe1e52fe286-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:12 crc kubenswrapper[5184]: I0312 17:09:12.128504 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ed3fdf4-b869-4ef8-b746-afe1e52fe286-config-data" (OuterVolumeSpecName: "config-data") pod "7ed3fdf4-b869-4ef8-b746-afe1e52fe286" (UID: "7ed3fdf4-b869-4ef8-b746-afe1e52fe286"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:12 crc kubenswrapper[5184]: I0312 17:09:12.191651 5184 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ed3fdf4-b869-4ef8-b746-afe1e52fe286-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:12 crc kubenswrapper[5184]: I0312 17:09:12.955617 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-gzjgn" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.219466 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.220524 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7ed3fdf4-b869-4ef8-b746-afe1e52fe286" containerName="cinder-db-sync" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.220539 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ed3fdf4-b869-4ef8-b746-afe1e52fe286" containerName="cinder-db-sync" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.220746 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="7ed3fdf4-b869-4ef8-b746-afe1e52fe286" containerName="cinder-db-sync" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.236602 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.236809 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.243830 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cinder-cinder-dockercfg-h494p\"" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.244112 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cinder-config-data\"" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.244268 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cinder-scripts\"" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.244450 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cinder-scheduler-config-data\"" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.320044 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/001b7f9f-058c-4037-af26-b94505164a68-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"001b7f9f-058c-4037-af26-b94505164a68\") " pod="openstack/cinder-scheduler-0" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.320168 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/001b7f9f-058c-4037-af26-b94505164a68-config-data\") pod \"cinder-scheduler-0\" (UID: \"001b7f9f-058c-4037-af26-b94505164a68\") " pod="openstack/cinder-scheduler-0" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.320220 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/001b7f9f-058c-4037-af26-b94505164a68-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"001b7f9f-058c-4037-af26-b94505164a68\") " pod="openstack/cinder-scheduler-0" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 
17:09:13.320243 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/001b7f9f-058c-4037-af26-b94505164a68-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"001b7f9f-058c-4037-af26-b94505164a68\") " pod="openstack/cinder-scheduler-0" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.320266 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/001b7f9f-058c-4037-af26-b94505164a68-scripts\") pod \"cinder-scheduler-0\" (UID: \"001b7f9f-058c-4037-af26-b94505164a68\") " pod="openstack/cinder-scheduler-0" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.320291 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkrp4\" (UniqueName: \"kubernetes.io/projected/001b7f9f-058c-4037-af26-b94505164a68-kube-api-access-kkrp4\") pod \"cinder-scheduler-0\" (UID: \"001b7f9f-058c-4037-af26-b94505164a68\") " pod="openstack/cinder-scheduler-0" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.339118 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77cdfb9675-lcv79"] Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.382459 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9fc87f-mb797"] Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.423442 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/001b7f9f-058c-4037-af26-b94505164a68-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"001b7f9f-058c-4037-af26-b94505164a68\") " pod="openstack/cinder-scheduler-0" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.423496 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/001b7f9f-058c-4037-af26-b94505164a68-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"001b7f9f-058c-4037-af26-b94505164a68\") " pod="openstack/cinder-scheduler-0" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.423527 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/001b7f9f-058c-4037-af26-b94505164a68-scripts\") pod \"cinder-scheduler-0\" (UID: \"001b7f9f-058c-4037-af26-b94505164a68\") " pod="openstack/cinder-scheduler-0" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.423554 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kkrp4\" (UniqueName: \"kubernetes.io/projected/001b7f9f-058c-4037-af26-b94505164a68-kube-api-access-kkrp4\") pod \"cinder-scheduler-0\" (UID: \"001b7f9f-058c-4037-af26-b94505164a68\") " pod="openstack/cinder-scheduler-0" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.423594 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/001b7f9f-058c-4037-af26-b94505164a68-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"001b7f9f-058c-4037-af26-b94505164a68\") " pod="openstack/cinder-scheduler-0" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.423698 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/001b7f9f-058c-4037-af26-b94505164a68-config-data\") pod \"cinder-scheduler-0\" (UID: \"001b7f9f-058c-4037-af26-b94505164a68\") " pod="openstack/cinder-scheduler-0" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.424660 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/001b7f9f-058c-4037-af26-b94505164a68-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"001b7f9f-058c-4037-af26-b94505164a68\") " 
pod="openstack/cinder-scheduler-0" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.428570 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.429436 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9fc87f-mb797" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.433102 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/001b7f9f-058c-4037-af26-b94505164a68-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"001b7f9f-058c-4037-af26-b94505164a68\") " pod="openstack/cinder-scheduler-0" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.441418 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/001b7f9f-058c-4037-af26-b94505164a68-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"001b7f9f-058c-4037-af26-b94505164a68\") " pod="openstack/cinder-scheduler-0" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.475272 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/001b7f9f-058c-4037-af26-b94505164a68-scripts\") pod \"cinder-scheduler-0\" (UID: \"001b7f9f-058c-4037-af26-b94505164a68\") " pod="openstack/cinder-scheduler-0" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.481086 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9fc87f-mb797"] Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.481138 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.481618 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.483535 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/001b7f9f-058c-4037-af26-b94505164a68-config-data\") pod \"cinder-scheduler-0\" (UID: \"001b7f9f-058c-4037-af26-b94505164a68\") " pod="openstack/cinder-scheduler-0" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.494200 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkrp4\" (UniqueName: \"kubernetes.io/projected/001b7f9f-058c-4037-af26-b94505164a68-kube-api-access-kkrp4\") pod \"cinder-scheduler-0\" (UID: \"001b7f9f-058c-4037-af26-b94505164a68\") " pod="openstack/cinder-scheduler-0" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.494389 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cinder-api-config-data\"" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.525533 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/246d17d3-b07a-4fe4-8165-711bcd72517f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"246d17d3-b07a-4fe4-8165-711bcd72517f\") " pod="openstack/cinder-api-0" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.525606 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/246d17d3-b07a-4fe4-8165-711bcd72517f-config-data\") pod \"cinder-api-0\" (UID: \"246d17d3-b07a-4fe4-8165-711bcd72517f\") " pod="openstack/cinder-api-0" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.525652 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/246d17d3-b07a-4fe4-8165-711bcd72517f-logs\") pod \"cinder-api-0\" (UID: 
\"246d17d3-b07a-4fe4-8165-711bcd72517f\") " pod="openstack/cinder-api-0" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.525707 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d33c92d-847e-48b5-bb1b-a8defe0756f7-dns-svc\") pod \"dnsmasq-dns-9fc87f-mb797\" (UID: \"7d33c92d-847e-48b5-bb1b-a8defe0756f7\") " pod="openstack/dnsmasq-dns-9fc87f-mb797" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.525736 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d33c92d-847e-48b5-bb1b-a8defe0756f7-dns-swift-storage-0\") pod \"dnsmasq-dns-9fc87f-mb797\" (UID: \"7d33c92d-847e-48b5-bb1b-a8defe0756f7\") " pod="openstack/dnsmasq-dns-9fc87f-mb797" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.525750 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/246d17d3-b07a-4fe4-8165-711bcd72517f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"246d17d3-b07a-4fe4-8165-711bcd72517f\") " pod="openstack/cinder-api-0" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.525766 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcnq2\" (UniqueName: \"kubernetes.io/projected/246d17d3-b07a-4fe4-8165-711bcd72517f-kube-api-access-rcnq2\") pod \"cinder-api-0\" (UID: \"246d17d3-b07a-4fe4-8165-711bcd72517f\") " pod="openstack/cinder-api-0" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.525805 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d33c92d-847e-48b5-bb1b-a8defe0756f7-ovsdbserver-sb\") pod \"dnsmasq-dns-9fc87f-mb797\" (UID: \"7d33c92d-847e-48b5-bb1b-a8defe0756f7\") " 
pod="openstack/dnsmasq-dns-9fc87f-mb797" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.525839 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/246d17d3-b07a-4fe4-8165-711bcd72517f-scripts\") pod \"cinder-api-0\" (UID: \"246d17d3-b07a-4fe4-8165-711bcd72517f\") " pod="openstack/cinder-api-0" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.525863 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d33c92d-847e-48b5-bb1b-a8defe0756f7-ovsdbserver-nb\") pod \"dnsmasq-dns-9fc87f-mb797\" (UID: \"7d33c92d-847e-48b5-bb1b-a8defe0756f7\") " pod="openstack/dnsmasq-dns-9fc87f-mb797" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.525894 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/246d17d3-b07a-4fe4-8165-711bcd72517f-config-data-custom\") pod \"cinder-api-0\" (UID: \"246d17d3-b07a-4fe4-8165-711bcd72517f\") " pod="openstack/cinder-api-0" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.525937 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnt7x\" (UniqueName: \"kubernetes.io/projected/7d33c92d-847e-48b5-bb1b-a8defe0756f7-kube-api-access-bnt7x\") pod \"dnsmasq-dns-9fc87f-mb797\" (UID: \"7d33c92d-847e-48b5-bb1b-a8defe0756f7\") " pod="openstack/dnsmasq-dns-9fc87f-mb797" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.526003 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d33c92d-847e-48b5-bb1b-a8defe0756f7-config\") pod \"dnsmasq-dns-9fc87f-mb797\" (UID: \"7d33c92d-847e-48b5-bb1b-a8defe0756f7\") " pod="openstack/dnsmasq-dns-9fc87f-mb797" Mar 12 17:09:13 crc 
kubenswrapper[5184]: I0312 17:09:13.571919 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.627397 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/246d17d3-b07a-4fe4-8165-711bcd72517f-scripts\") pod \"cinder-api-0\" (UID: \"246d17d3-b07a-4fe4-8165-711bcd72517f\") " pod="openstack/cinder-api-0" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.627445 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d33c92d-847e-48b5-bb1b-a8defe0756f7-ovsdbserver-nb\") pod \"dnsmasq-dns-9fc87f-mb797\" (UID: \"7d33c92d-847e-48b5-bb1b-a8defe0756f7\") " pod="openstack/dnsmasq-dns-9fc87f-mb797" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.627472 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/246d17d3-b07a-4fe4-8165-711bcd72517f-config-data-custom\") pod \"cinder-api-0\" (UID: \"246d17d3-b07a-4fe4-8165-711bcd72517f\") " pod="openstack/cinder-api-0" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.627497 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bnt7x\" (UniqueName: \"kubernetes.io/projected/7d33c92d-847e-48b5-bb1b-a8defe0756f7-kube-api-access-bnt7x\") pod \"dnsmasq-dns-9fc87f-mb797\" (UID: \"7d33c92d-847e-48b5-bb1b-a8defe0756f7\") " pod="openstack/dnsmasq-dns-9fc87f-mb797" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.627534 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d33c92d-847e-48b5-bb1b-a8defe0756f7-config\") pod \"dnsmasq-dns-9fc87f-mb797\" (UID: \"7d33c92d-847e-48b5-bb1b-a8defe0756f7\") " pod="openstack/dnsmasq-dns-9fc87f-mb797" 
Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.627586 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/246d17d3-b07a-4fe4-8165-711bcd72517f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"246d17d3-b07a-4fe4-8165-711bcd72517f\") " pod="openstack/cinder-api-0" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.627604 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/246d17d3-b07a-4fe4-8165-711bcd72517f-config-data\") pod \"cinder-api-0\" (UID: \"246d17d3-b07a-4fe4-8165-711bcd72517f\") " pod="openstack/cinder-api-0" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.627632 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/246d17d3-b07a-4fe4-8165-711bcd72517f-logs\") pod \"cinder-api-0\" (UID: \"246d17d3-b07a-4fe4-8165-711bcd72517f\") " pod="openstack/cinder-api-0" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.627676 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d33c92d-847e-48b5-bb1b-a8defe0756f7-dns-svc\") pod \"dnsmasq-dns-9fc87f-mb797\" (UID: \"7d33c92d-847e-48b5-bb1b-a8defe0756f7\") " pod="openstack/dnsmasq-dns-9fc87f-mb797" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.627700 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d33c92d-847e-48b5-bb1b-a8defe0756f7-dns-swift-storage-0\") pod \"dnsmasq-dns-9fc87f-mb797\" (UID: \"7d33c92d-847e-48b5-bb1b-a8defe0756f7\") " pod="openstack/dnsmasq-dns-9fc87f-mb797" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.627714 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/246d17d3-b07a-4fe4-8165-711bcd72517f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"246d17d3-b07a-4fe4-8165-711bcd72517f\") " pod="openstack/cinder-api-0" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.627756 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rcnq2\" (UniqueName: \"kubernetes.io/projected/246d17d3-b07a-4fe4-8165-711bcd72517f-kube-api-access-rcnq2\") pod \"cinder-api-0\" (UID: \"246d17d3-b07a-4fe4-8165-711bcd72517f\") " pod="openstack/cinder-api-0" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.627799 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d33c92d-847e-48b5-bb1b-a8defe0756f7-ovsdbserver-sb\") pod \"dnsmasq-dns-9fc87f-mb797\" (UID: \"7d33c92d-847e-48b5-bb1b-a8defe0756f7\") " pod="openstack/dnsmasq-dns-9fc87f-mb797" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.628958 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d33c92d-847e-48b5-bb1b-a8defe0756f7-ovsdbserver-sb\") pod \"dnsmasq-dns-9fc87f-mb797\" (UID: \"7d33c92d-847e-48b5-bb1b-a8defe0756f7\") " pod="openstack/dnsmasq-dns-9fc87f-mb797" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.634474 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/246d17d3-b07a-4fe4-8165-711bcd72517f-scripts\") pod \"cinder-api-0\" (UID: \"246d17d3-b07a-4fe4-8165-711bcd72517f\") " pod="openstack/cinder-api-0" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.634515 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d33c92d-847e-48b5-bb1b-a8defe0756f7-ovsdbserver-nb\") pod \"dnsmasq-dns-9fc87f-mb797\" (UID: \"7d33c92d-847e-48b5-bb1b-a8defe0756f7\") " 
pod="openstack/dnsmasq-dns-9fc87f-mb797" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.634989 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/246d17d3-b07a-4fe4-8165-711bcd72517f-logs\") pod \"cinder-api-0\" (UID: \"246d17d3-b07a-4fe4-8165-711bcd72517f\") " pod="openstack/cinder-api-0" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.635099 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d33c92d-847e-48b5-bb1b-a8defe0756f7-config\") pod \"dnsmasq-dns-9fc87f-mb797\" (UID: \"7d33c92d-847e-48b5-bb1b-a8defe0756f7\") " pod="openstack/dnsmasq-dns-9fc87f-mb797" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.635424 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/246d17d3-b07a-4fe4-8165-711bcd72517f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"246d17d3-b07a-4fe4-8165-711bcd72517f\") " pod="openstack/cinder-api-0" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.636347 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d33c92d-847e-48b5-bb1b-a8defe0756f7-dns-swift-storage-0\") pod \"dnsmasq-dns-9fc87f-mb797\" (UID: \"7d33c92d-847e-48b5-bb1b-a8defe0756f7\") " pod="openstack/dnsmasq-dns-9fc87f-mb797" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.640780 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/246d17d3-b07a-4fe4-8165-711bcd72517f-config-data-custom\") pod \"cinder-api-0\" (UID: \"246d17d3-b07a-4fe4-8165-711bcd72517f\") " pod="openstack/cinder-api-0" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.642302 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/7d33c92d-847e-48b5-bb1b-a8defe0756f7-dns-svc\") pod \"dnsmasq-dns-9fc87f-mb797\" (UID: \"7d33c92d-847e-48b5-bb1b-a8defe0756f7\") " pod="openstack/dnsmasq-dns-9fc87f-mb797" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.644639 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/246d17d3-b07a-4fe4-8165-711bcd72517f-config-data\") pod \"cinder-api-0\" (UID: \"246d17d3-b07a-4fe4-8165-711bcd72517f\") " pod="openstack/cinder-api-0" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.651117 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/246d17d3-b07a-4fe4-8165-711bcd72517f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"246d17d3-b07a-4fe4-8165-711bcd72517f\") " pod="openstack/cinder-api-0" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.657905 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcnq2\" (UniqueName: \"kubernetes.io/projected/246d17d3-b07a-4fe4-8165-711bcd72517f-kube-api-access-rcnq2\") pod \"cinder-api-0\" (UID: \"246d17d3-b07a-4fe4-8165-711bcd72517f\") " pod="openstack/cinder-api-0" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.661239 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnt7x\" (UniqueName: \"kubernetes.io/projected/7d33c92d-847e-48b5-bb1b-a8defe0756f7-kube-api-access-bnt7x\") pod \"dnsmasq-dns-9fc87f-mb797\" (UID: \"7d33c92d-847e-48b5-bb1b-a8defe0756f7\") " pod="openstack/dnsmasq-dns-9fc87f-mb797" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.876794 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9fc87f-mb797" Mar 12 17:09:13 crc kubenswrapper[5184]: I0312 17:09:13.893013 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 12 17:09:15 crc kubenswrapper[5184]: I0312 17:09:15.808969 5184 prober.go:120] "Probe failed" probeType="Startup" pod="openstack/horizon-859ddbd78-2m2xk" podUID="ccf562d2-6ce1-4eb6-b27e-679493ce3870" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Mar 12 17:09:16 crc kubenswrapper[5184]: I0312 17:09:16.149455 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 12 17:09:16 crc kubenswrapper[5184]: I0312 17:09:16.432108 5184 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-dplgw" podUID="d3f7d154-f90e-4731-bc04-00b13b3fbfd8" containerName="registry-server" probeResult="failure" output=< Mar 12 17:09:16 crc kubenswrapper[5184]: timeout: failed to connect service ":50051" within 1s Mar 12 17:09:16 crc kubenswrapper[5184]: > Mar 12 17:09:16 crc kubenswrapper[5184]: I0312 17:09:16.460095 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-66c64686b6-kwvcj"] Mar 12 17:09:16 crc kubenswrapper[5184]: I0312 17:09:16.741682 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5865f6c4b6-r8frb"] Mar 12 17:09:16 crc kubenswrapper[5184]: I0312 17:09:16.937697 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77cdfb9675-lcv79"] Mar 12 17:09:16 crc kubenswrapper[5184]: I0312 17:09:16.995597 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1","Type":"ContainerStarted","Data":"95f938870d1a66767fdd5c7147c287517381b4b5d229780159a9e039c953ef07"} Mar 12 17:09:16 crc kubenswrapper[5184]: I0312 17:09:16.996891 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5865f6c4b6-r8frb" 
event={"ID":"f0ce447b-3a61-4c39-9f58-12985dbdb754","Type":"ContainerStarted","Data":"6d31915f799213be7395d1bcdcd590a621310c15db9b6ec5a08aad01b18b8a17"} Mar 12 17:09:16 crc kubenswrapper[5184]: I0312 17:09:16.997971 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-66c64686b6-kwvcj" event={"ID":"916be5af-0ed0-4d16-a8b8-2d01b7b81dab","Type":"ContainerStarted","Data":"bd23fef0fa1c382fb0c12ab3e489e6d7871adacce7cfeeca1694e704c6b21e89"} Mar 12 17:09:16 crc kubenswrapper[5184]: I0312 17:09:16.997991 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-66c64686b6-kwvcj" event={"ID":"916be5af-0ed0-4d16-a8b8-2d01b7b81dab","Type":"ContainerStarted","Data":"7c660ab08c2443ac51ec2acdba548511c80dd99234da3344844f1d9c8ee10c7c"} Mar 12 17:09:17 crc kubenswrapper[5184]: I0312 17:09:17.215092 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 12 17:09:17 crc kubenswrapper[5184]: I0312 17:09:17.261302 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9fc87f-mb797"] Mar 12 17:09:17 crc kubenswrapper[5184]: I0312 17:09:17.410653 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 12 17:09:17 crc kubenswrapper[5184]: I0312 17:09:17.545517 5184 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-z62x6" podUID="3521e399-e317-459a-badc-0b4695197ac0" containerName="registry-server" probeResult="failure" output=< Mar 12 17:09:17 crc kubenswrapper[5184]: timeout: failed to connect service ":50051" within 1s Mar 12 17:09:17 crc kubenswrapper[5184]: > Mar 12 17:09:19 crc kubenswrapper[5184]: I0312 17:09:19.025024 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"001b7f9f-058c-4037-af26-b94505164a68","Type":"ContainerStarted","Data":"658ac3911f7f4392f677acb46f061cdbdbba7cb6b828648abc255de57688b3be"} Mar 12 17:09:19 crc 
kubenswrapper[5184]: I0312 17:09:19.031586 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-574685bb47-lxzsh" event={"ID":"55402375-344d-4d12-b750-7a3e784b3886","Type":"ContainerStarted","Data":"88b6aa4a6f1bb7569a18f90c826fc688f092d320b27e2d69594251215d18aa96"} Mar 12 17:09:19 crc kubenswrapper[5184]: I0312 17:09:19.035493 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"246d17d3-b07a-4fe4-8165-711bcd72517f","Type":"ContainerStarted","Data":"c2bc4fa4369b96c3b7ec978f87793e86b1dce86caad031faa407b88f5f1f60f3"} Mar 12 17:09:19 crc kubenswrapper[5184]: I0312 17:09:19.038349 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77cdfb9675-lcv79" event={"ID":"da615b33-3bf7-4b28-b95f-5c45f00679cb","Type":"ContainerStarted","Data":"699870c0956c723f907930eb010819f3e84bcd9207534ce43cdd1449d839cdcf"} Mar 12 17:09:19 crc kubenswrapper[5184]: I0312 17:09:19.038610 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77cdfb9675-lcv79" event={"ID":"da615b33-3bf7-4b28-b95f-5c45f00679cb","Type":"ContainerStarted","Data":"2913d1ae42fa93d9db23f7e84112804fbb04dacff5d8083808a00a1c849ce13b"} Mar 12 17:09:19 crc kubenswrapper[5184]: I0312 17:09:19.038814 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77cdfb9675-lcv79" podUID="da615b33-3bf7-4b28-b95f-5c45f00679cb" containerName="init" containerID="cri-o://699870c0956c723f907930eb010819f3e84bcd9207534ce43cdd1449d839cdcf" gracePeriod=10 Mar 12 17:09:19 crc kubenswrapper[5184]: I0312 17:09:19.043594 5184 generic.go:358] "Generic (PLEG): container finished" podID="7d33c92d-847e-48b5-bb1b-a8defe0756f7" containerID="26910bc0aabc5a2a8df0698e2a81e502134cee7215f2181eaa42086f4f0b5a54" exitCode=0 Mar 12 17:09:19 crc kubenswrapper[5184]: I0312 17:09:19.043695 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9fc87f-mb797" 
event={"ID":"7d33c92d-847e-48b5-bb1b-a8defe0756f7","Type":"ContainerDied","Data":"26910bc0aabc5a2a8df0698e2a81e502134cee7215f2181eaa42086f4f0b5a54"} Mar 12 17:09:19 crc kubenswrapper[5184]: I0312 17:09:19.043728 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9fc87f-mb797" event={"ID":"7d33c92d-847e-48b5-bb1b-a8defe0756f7","Type":"ContainerStarted","Data":"7f3cfe57d238fc767c2be16e2b9f20a80cb15cd0418b8f26048d3fb0b2127c52"} Mar 12 17:09:19 crc kubenswrapper[5184]: I0312 17:09:19.061465 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5865f6c4b6-r8frb" event={"ID":"f0ce447b-3a61-4c39-9f58-12985dbdb754","Type":"ContainerStarted","Data":"1536506db9dbcf87c5a05b0c49a09247ce101e7244e37a69f1b6dc35a4b9d026"} Mar 12 17:09:19 crc kubenswrapper[5184]: I0312 17:09:19.061513 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5865f6c4b6-r8frb" event={"ID":"f0ce447b-3a61-4c39-9f58-12985dbdb754","Type":"ContainerStarted","Data":"46acf120c50da21a921741f68d8f5af275ca388bec96793cfb62f30cf1f9541e"} Mar 12 17:09:19 crc kubenswrapper[5184]: I0312 17:09:19.062407 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/barbican-api-5865f6c4b6-r8frb" Mar 12 17:09:19 crc kubenswrapper[5184]: I0312 17:09:19.062933 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/barbican-api-5865f6c4b6-r8frb" Mar 12 17:09:19 crc kubenswrapper[5184]: I0312 17:09:19.071678 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-66c64686b6-kwvcj" event={"ID":"916be5af-0ed0-4d16-a8b8-2d01b7b81dab","Type":"ContainerStarted","Data":"4283802248ef32252f0e28ddd8c5fb837bc90059f4feaf46f79bbf912622358b"} Mar 12 17:09:19 crc kubenswrapper[5184]: I0312 17:09:19.071736 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/barbican-api-66c64686b6-kwvcj" Mar 12 17:09:19 crc 
kubenswrapper[5184]: I0312 17:09:19.071748 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/barbican-api-66c64686b6-kwvcj" Mar 12 17:09:19 crc kubenswrapper[5184]: I0312 17:09:19.127008 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5865f6c4b6-r8frb" podStartSLOduration=16.126988688 podStartE2EDuration="16.126988688s" podCreationTimestamp="2026-03-12 17:09:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:09:19.097196662 +0000 UTC m=+1101.638508001" watchObservedRunningTime="2026-03-12 17:09:19.126988688 +0000 UTC m=+1101.668300027" Mar 12 17:09:19 crc kubenswrapper[5184]: I0312 17:09:19.130753 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-66c64686b6-kwvcj" podStartSLOduration=13.130741996 podStartE2EDuration="13.130741996s" podCreationTimestamp="2026-03-12 17:09:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:09:19.120001749 +0000 UTC m=+1101.661313098" watchObservedRunningTime="2026-03-12 17:09:19.130741996 +0000 UTC m=+1101.672053335" Mar 12 17:09:19 crc kubenswrapper[5184]: I0312 17:09:19.557028 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77cdfb9675-lcv79" Mar 12 17:09:19 crc kubenswrapper[5184]: I0312 17:09:19.695189 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da615b33-3bf7-4b28-b95f-5c45f00679cb-ovsdbserver-nb\") pod \"da615b33-3bf7-4b28-b95f-5c45f00679cb\" (UID: \"da615b33-3bf7-4b28-b95f-5c45f00679cb\") " Mar 12 17:09:19 crc kubenswrapper[5184]: I0312 17:09:19.695303 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da615b33-3bf7-4b28-b95f-5c45f00679cb-ovsdbserver-sb\") pod \"da615b33-3bf7-4b28-b95f-5c45f00679cb\" (UID: \"da615b33-3bf7-4b28-b95f-5c45f00679cb\") " Mar 12 17:09:19 crc kubenswrapper[5184]: I0312 17:09:19.695428 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da615b33-3bf7-4b28-b95f-5c45f00679cb-dns-swift-storage-0\") pod \"da615b33-3bf7-4b28-b95f-5c45f00679cb\" (UID: \"da615b33-3bf7-4b28-b95f-5c45f00679cb\") " Mar 12 17:09:19 crc kubenswrapper[5184]: I0312 17:09:19.695497 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da615b33-3bf7-4b28-b95f-5c45f00679cb-config\") pod \"da615b33-3bf7-4b28-b95f-5c45f00679cb\" (UID: \"da615b33-3bf7-4b28-b95f-5c45f00679cb\") " Mar 12 17:09:19 crc kubenswrapper[5184]: I0312 17:09:19.695657 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2h5j\" (UniqueName: \"kubernetes.io/projected/da615b33-3bf7-4b28-b95f-5c45f00679cb-kube-api-access-l2h5j\") pod \"da615b33-3bf7-4b28-b95f-5c45f00679cb\" (UID: \"da615b33-3bf7-4b28-b95f-5c45f00679cb\") " Mar 12 17:09:19 crc kubenswrapper[5184]: I0312 17:09:19.695742 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da615b33-3bf7-4b28-b95f-5c45f00679cb-dns-svc\") pod \"da615b33-3bf7-4b28-b95f-5c45f00679cb\" (UID: \"da615b33-3bf7-4b28-b95f-5c45f00679cb\") " Mar 12 17:09:19 crc kubenswrapper[5184]: I0312 17:09:19.721663 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da615b33-3bf7-4b28-b95f-5c45f00679cb-kube-api-access-l2h5j" (OuterVolumeSpecName: "kube-api-access-l2h5j") pod "da615b33-3bf7-4b28-b95f-5c45f00679cb" (UID: "da615b33-3bf7-4b28-b95f-5c45f00679cb"). InnerVolumeSpecName "kube-api-access-l2h5j". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:09:19 crc kubenswrapper[5184]: I0312 17:09:19.798145 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l2h5j\" (UniqueName: \"kubernetes.io/projected/da615b33-3bf7-4b28-b95f-5c45f00679cb-kube-api-access-l2h5j\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:19 crc kubenswrapper[5184]: I0312 17:09:19.823105 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-85fc5cfbb9-trxq5" Mar 12 17:09:19 crc kubenswrapper[5184]: I0312 17:09:19.840163 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-66446fcd8f-lmflm" Mar 12 17:09:19 crc kubenswrapper[5184]: I0312 17:09:19.842193 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da615b33-3bf7-4b28-b95f-5c45f00679cb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "da615b33-3bf7-4b28-b95f-5c45f00679cb" (UID: "da615b33-3bf7-4b28-b95f-5c45f00679cb"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:09:19 crc kubenswrapper[5184]: I0312 17:09:19.873779 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da615b33-3bf7-4b28-b95f-5c45f00679cb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "da615b33-3bf7-4b28-b95f-5c45f00679cb" (UID: "da615b33-3bf7-4b28-b95f-5c45f00679cb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:09:19 crc kubenswrapper[5184]: I0312 17:09:19.874946 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da615b33-3bf7-4b28-b95f-5c45f00679cb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "da615b33-3bf7-4b28-b95f-5c45f00679cb" (UID: "da615b33-3bf7-4b28-b95f-5c45f00679cb"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:09:19 crc kubenswrapper[5184]: I0312 17:09:19.899557 5184 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da615b33-3bf7-4b28-b95f-5c45f00679cb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:19 crc kubenswrapper[5184]: I0312 17:09:19.899582 5184 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da615b33-3bf7-4b28-b95f-5c45f00679cb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:19 crc kubenswrapper[5184]: I0312 17:09:19.899591 5184 reconciler_common.go:299] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da615b33-3bf7-4b28-b95f-5c45f00679cb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:19 crc kubenswrapper[5184]: I0312 17:09:19.900992 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da615b33-3bf7-4b28-b95f-5c45f00679cb-config" 
(OuterVolumeSpecName: "config") pod "da615b33-3bf7-4b28-b95f-5c45f00679cb" (UID: "da615b33-3bf7-4b28-b95f-5c45f00679cb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:09:19 crc kubenswrapper[5184]: I0312 17:09:19.905516 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da615b33-3bf7-4b28-b95f-5c45f00679cb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "da615b33-3bf7-4b28-b95f-5c45f00679cb" (UID: "da615b33-3bf7-4b28-b95f-5c45f00679cb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:09:19 crc kubenswrapper[5184]: I0312 17:09:19.922016 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-844b468785-jt9ph" Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.001157 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/af22a969-32c4-4628-8667-be2162f7d92d-horizon-secret-key\") pod \"af22a969-32c4-4628-8667-be2162f7d92d\" (UID: \"af22a969-32c4-4628-8667-be2162f7d92d\") " Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.001285 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cp67s\" (UniqueName: \"kubernetes.io/projected/646cdc1b-863a-4b58-8869-fcbc386a96e2-kube-api-access-cp67s\") pod \"646cdc1b-863a-4b58-8869-fcbc386a96e2\" (UID: \"646cdc1b-863a-4b58-8869-fcbc386a96e2\") " Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.001322 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af22a969-32c4-4628-8667-be2162f7d92d-scripts\") pod \"af22a969-32c4-4628-8667-be2162f7d92d\" (UID: \"af22a969-32c4-4628-8667-be2162f7d92d\") " Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.001341 5184 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/af22a969-32c4-4628-8667-be2162f7d92d-config-data\") pod \"af22a969-32c4-4628-8667-be2162f7d92d\" (UID: \"af22a969-32c4-4628-8667-be2162f7d92d\") " Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.001398 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/646cdc1b-863a-4b58-8869-fcbc386a96e2-scripts\") pod \"646cdc1b-863a-4b58-8869-fcbc386a96e2\" (UID: \"646cdc1b-863a-4b58-8869-fcbc386a96e2\") " Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.001490 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/646cdc1b-863a-4b58-8869-fcbc386a96e2-horizon-secret-key\") pod \"646cdc1b-863a-4b58-8869-fcbc386a96e2\" (UID: \"646cdc1b-863a-4b58-8869-fcbc386a96e2\") " Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.001518 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/646cdc1b-863a-4b58-8869-fcbc386a96e2-config-data\") pod \"646cdc1b-863a-4b58-8869-fcbc386a96e2\" (UID: \"646cdc1b-863a-4b58-8869-fcbc386a96e2\") " Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.001541 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lz69\" (UniqueName: \"kubernetes.io/projected/af22a969-32c4-4628-8667-be2162f7d92d-kube-api-access-6lz69\") pod \"af22a969-32c4-4628-8667-be2162f7d92d\" (UID: \"af22a969-32c4-4628-8667-be2162f7d92d\") " Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.001667 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af22a969-32c4-4628-8667-be2162f7d92d-logs\") pod \"af22a969-32c4-4628-8667-be2162f7d92d\" (UID: 
\"af22a969-32c4-4628-8667-be2162f7d92d\") " Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.001782 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/646cdc1b-863a-4b58-8869-fcbc386a96e2-logs\") pod \"646cdc1b-863a-4b58-8869-fcbc386a96e2\" (UID: \"646cdc1b-863a-4b58-8869-fcbc386a96e2\") " Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.002282 5184 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da615b33-3bf7-4b28-b95f-5c45f00679cb-config\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.002310 5184 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da615b33-3bf7-4b28-b95f-5c45f00679cb-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.002714 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/646cdc1b-863a-4b58-8869-fcbc386a96e2-logs" (OuterVolumeSpecName: "logs") pod "646cdc1b-863a-4b58-8869-fcbc386a96e2" (UID: "646cdc1b-863a-4b58-8869-fcbc386a96e2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.003958 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af22a969-32c4-4628-8667-be2162f7d92d-logs" (OuterVolumeSpecName: "logs") pod "af22a969-32c4-4628-8667-be2162f7d92d" (UID: "af22a969-32c4-4628-8667-be2162f7d92d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.007344 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af22a969-32c4-4628-8667-be2162f7d92d-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "af22a969-32c4-4628-8667-be2162f7d92d" (UID: "af22a969-32c4-4628-8667-be2162f7d92d"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.007334 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/646cdc1b-863a-4b58-8869-fcbc386a96e2-kube-api-access-cp67s" (OuterVolumeSpecName: "kube-api-access-cp67s") pod "646cdc1b-863a-4b58-8869-fcbc386a96e2" (UID: "646cdc1b-863a-4b58-8869-fcbc386a96e2"). InnerVolumeSpecName "kube-api-access-cp67s". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.009333 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af22a969-32c4-4628-8667-be2162f7d92d-kube-api-access-6lz69" (OuterVolumeSpecName: "kube-api-access-6lz69") pod "af22a969-32c4-4628-8667-be2162f7d92d" (UID: "af22a969-32c4-4628-8667-be2162f7d92d"). InnerVolumeSpecName "kube-api-access-6lz69". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.013048 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/646cdc1b-863a-4b58-8869-fcbc386a96e2-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "646cdc1b-863a-4b58-8869-fcbc386a96e2" (UID: "646cdc1b-863a-4b58-8869-fcbc386a96e2"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.028274 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/646cdc1b-863a-4b58-8869-fcbc386a96e2-config-data" (OuterVolumeSpecName: "config-data") pod "646cdc1b-863a-4b58-8869-fcbc386a96e2" (UID: "646cdc1b-863a-4b58-8869-fcbc386a96e2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.031576 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/646cdc1b-863a-4b58-8869-fcbc386a96e2-scripts" (OuterVolumeSpecName: "scripts") pod "646cdc1b-863a-4b58-8869-fcbc386a96e2" (UID: "646cdc1b-863a-4b58-8869-fcbc386a96e2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.036864 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af22a969-32c4-4628-8667-be2162f7d92d-scripts" (OuterVolumeSpecName: "scripts") pod "af22a969-32c4-4628-8667-be2162f7d92d" (UID: "af22a969-32c4-4628-8667-be2162f7d92d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.047861 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af22a969-32c4-4628-8667-be2162f7d92d-config-data" (OuterVolumeSpecName: "config-data") pod "af22a969-32c4-4628-8667-be2162f7d92d" (UID: "af22a969-32c4-4628-8667-be2162f7d92d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.089252 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-84f8b689b8-ptbnb" event={"ID":"dfe68657-8277-4927-9fef-88807e21461b","Type":"ContainerStarted","Data":"dda890076e7d667d4ca7718455bd3de46f017a4370bb6250d83c6516e9ab839b"} Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.089292 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-84f8b689b8-ptbnb" event={"ID":"dfe68657-8277-4927-9fef-88807e21461b","Type":"ContainerStarted","Data":"86cf478a25ee94c8fe8db259f19eb92c4c4fd0b298522713fac1fe2e266ae46b"} Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.095635 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-574685bb47-lxzsh" event={"ID":"55402375-344d-4d12-b750-7a3e784b3886","Type":"ContainerStarted","Data":"476ad9347a18f05d59435f51d840c692ae3d4cb8905a5e69744c8da3cb458b2c"} Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.100589 5184 generic.go:358] "Generic (PLEG): container finished" podID="af22a969-32c4-4628-8667-be2162f7d92d" containerID="e97f3475f0fb80f59b27d2fd2949008caef58ad2a5e4fd52a06908adf8ffe29f" exitCode=137 Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.100614 5184 generic.go:358] "Generic (PLEG): container finished" podID="af22a969-32c4-4628-8667-be2162f7d92d" containerID="12f64b283f8a5bf8be5f912ea3c3eeb46e8266cbb7d4905e54743a6fc4b54364" exitCode=137 Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.100620 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66446fcd8f-lmflm" event={"ID":"af22a969-32c4-4628-8667-be2162f7d92d","Type":"ContainerDied","Data":"e97f3475f0fb80f59b27d2fd2949008caef58ad2a5e4fd52a06908adf8ffe29f"} Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.100648 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-66446fcd8f-lmflm" event={"ID":"af22a969-32c4-4628-8667-be2162f7d92d","Type":"ContainerDied","Data":"12f64b283f8a5bf8be5f912ea3c3eeb46e8266cbb7d4905e54743a6fc4b54364"} Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.100659 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66446fcd8f-lmflm" event={"ID":"af22a969-32c4-4628-8667-be2162f7d92d","Type":"ContainerDied","Data":"90fcc5f8ddc8521d0271d96fe5e07fead5353acba9275fd409c0d2bc4829103e"} Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.100689 5184 scope.go:117] "RemoveContainer" containerID="e97f3475f0fb80f59b27d2fd2949008caef58ad2a5e4fd52a06908adf8ffe29f" Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.100704 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-66446fcd8f-lmflm" Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.103691 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad645834-0761-42a0-8bf0-dd763b829aac-logs\") pod \"ad645834-0761-42a0-8bf0-dd763b829aac\" (UID: \"ad645834-0761-42a0-8bf0-dd763b829aac\") " Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.103889 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ad645834-0761-42a0-8bf0-dd763b829aac-horizon-secret-key\") pod \"ad645834-0761-42a0-8bf0-dd763b829aac\" (UID: \"ad645834-0761-42a0-8bf0-dd763b829aac\") " Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.103954 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad645834-0761-42a0-8bf0-dd763b829aac-config-data\") pod \"ad645834-0761-42a0-8bf0-dd763b829aac\" (UID: \"ad645834-0761-42a0-8bf0-dd763b829aac\") " Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.104004 5184 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-v8phq\" (UniqueName: \"kubernetes.io/projected/ad645834-0761-42a0-8bf0-dd763b829aac-kube-api-access-v8phq\") pod \"ad645834-0761-42a0-8bf0-dd763b829aac\" (UID: \"ad645834-0761-42a0-8bf0-dd763b829aac\") " Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.104053 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ad645834-0761-42a0-8bf0-dd763b829aac-scripts\") pod \"ad645834-0761-42a0-8bf0-dd763b829aac\" (UID: \"ad645834-0761-42a0-8bf0-dd763b829aac\") " Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.104430 5184 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af22a969-32c4-4628-8667-be2162f7d92d-logs\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.104445 5184 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/646cdc1b-863a-4b58-8869-fcbc386a96e2-logs\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.104454 5184 reconciler_common.go:299] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/af22a969-32c4-4628-8667-be2162f7d92d-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.104463 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cp67s\" (UniqueName: \"kubernetes.io/projected/646cdc1b-863a-4b58-8869-fcbc386a96e2-kube-api-access-cp67s\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.104471 5184 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af22a969-32c4-4628-8667-be2162f7d92d-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.104478 5184 
reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/af22a969-32c4-4628-8667-be2162f7d92d-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.104486 5184 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/646cdc1b-863a-4b58-8869-fcbc386a96e2-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.104493 5184 reconciler_common.go:299] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/646cdc1b-863a-4b58-8869-fcbc386a96e2-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.104501 5184 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/646cdc1b-863a-4b58-8869-fcbc386a96e2-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.104508 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6lz69\" (UniqueName: \"kubernetes.io/projected/af22a969-32c4-4628-8667-be2162f7d92d-kube-api-access-6lz69\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.104989 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"246d17d3-b07a-4fe4-8165-711bcd72517f","Type":"ContainerStarted","Data":"5fdf1d7ebac6c679e3f972505eb5eb9e67aa6da80703bae62de7ecf931c1400d"} Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.105268 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad645834-0761-42a0-8bf0-dd763b829aac-logs" (OuterVolumeSpecName: "logs") pod "ad645834-0761-42a0-8bf0-dd763b829aac" (UID: "ad645834-0761-42a0-8bf0-dd763b829aac"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.130878 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad645834-0761-42a0-8bf0-dd763b829aac-kube-api-access-v8phq" (OuterVolumeSpecName: "kube-api-access-v8phq") pod "ad645834-0761-42a0-8bf0-dd763b829aac" (UID: "ad645834-0761-42a0-8bf0-dd763b829aac"). InnerVolumeSpecName "kube-api-access-v8phq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.131137 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad645834-0761-42a0-8bf0-dd763b829aac-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "ad645834-0761-42a0-8bf0-dd763b829aac" (UID: "ad645834-0761-42a0-8bf0-dd763b829aac"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.133504 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-84f8b689b8-ptbnb" podStartSLOduration=11.32986325 podStartE2EDuration="18.133489339s" podCreationTimestamp="2026-03-12 17:09:02 +0000 UTC" firstStartedPulling="2026-03-12 17:09:11.811150381 +0000 UTC m=+1094.352461720" lastFinishedPulling="2026-03-12 17:09:18.61477647 +0000 UTC m=+1101.156087809" observedRunningTime="2026-03-12 17:09:20.113024466 +0000 UTC m=+1102.654335805" watchObservedRunningTime="2026-03-12 17:09:20.133489339 +0000 UTC m=+1102.674800678" Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.156712 5184 generic.go:358] "Generic (PLEG): container finished" podID="646cdc1b-863a-4b58-8869-fcbc386a96e2" containerID="030149c8b5637105bbee193dd218339dff5fa4b7dcdea135f43ed6f21bb3cb72" exitCode=137 Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.156755 5184 generic.go:358] "Generic (PLEG): container finished" 
podID="646cdc1b-863a-4b58-8869-fcbc386a96e2" containerID="7d3843f64ae7e3280518f6603612e635187d09ba68bd5996d6c0058c860a6cd9" exitCode=137 Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.156865 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85fc5cfbb9-trxq5" event={"ID":"646cdc1b-863a-4b58-8869-fcbc386a96e2","Type":"ContainerDied","Data":"030149c8b5637105bbee193dd218339dff5fa4b7dcdea135f43ed6f21bb3cb72"} Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.156896 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85fc5cfbb9-trxq5" event={"ID":"646cdc1b-863a-4b58-8869-fcbc386a96e2","Type":"ContainerDied","Data":"7d3843f64ae7e3280518f6603612e635187d09ba68bd5996d6c0058c860a6cd9"} Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.156974 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85fc5cfbb9-trxq5" event={"ID":"646cdc1b-863a-4b58-8869-fcbc386a96e2","Type":"ContainerDied","Data":"7d45979f4b81e06da62a9752ff1c6a0e7a4c65e5c6ff371738893ceafa67b541"} Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.157073 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-85fc5cfbb9-trxq5" Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.172021 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad645834-0761-42a0-8bf0-dd763b829aac-scripts" (OuterVolumeSpecName: "scripts") pod "ad645834-0761-42a0-8bf0-dd763b829aac" (UID: "ad645834-0761-42a0-8bf0-dd763b829aac"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.173243 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad645834-0761-42a0-8bf0-dd763b829aac-config-data" (OuterVolumeSpecName: "config-data") pod "ad645834-0761-42a0-8bf0-dd763b829aac" (UID: "ad645834-0761-42a0-8bf0-dd763b829aac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.190738 5184 generic.go:358] "Generic (PLEG): container finished" podID="ad645834-0761-42a0-8bf0-dd763b829aac" containerID="322636176df9af1ac8acdec415b4ffd8f4652698259988fc8b6152a07ec3e055" exitCode=137
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.190777 5184 generic.go:358] "Generic (PLEG): container finished" podID="ad645834-0761-42a0-8bf0-dd763b829aac" containerID="047d4ada29c754723425e0d2407189ea2173d2e73428211034634873730ed537" exitCode=137
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.190867 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-844b468785-jt9ph" event={"ID":"ad645834-0761-42a0-8bf0-dd763b829aac","Type":"ContainerDied","Data":"322636176df9af1ac8acdec415b4ffd8f4652698259988fc8b6152a07ec3e055"}
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.190895 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-844b468785-jt9ph" event={"ID":"ad645834-0761-42a0-8bf0-dd763b829aac","Type":"ContainerDied","Data":"047d4ada29c754723425e0d2407189ea2173d2e73428211034634873730ed537"}
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.190907 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-844b468785-jt9ph" event={"ID":"ad645834-0761-42a0-8bf0-dd763b829aac","Type":"ContainerDied","Data":"2d4f8d974716a680f2dd7afdc80e320c44b685e551dd94303c2bf236e0a9d37e"}
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.190988 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-844b468785-jt9ph"
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.201266 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-574685bb47-lxzsh" podStartSLOduration=11.469071321 podStartE2EDuration="18.201224596s" podCreationTimestamp="2026-03-12 17:09:02 +0000 UTC" firstStartedPulling="2026-03-12 17:09:11.850959471 +0000 UTC m=+1094.392270810" lastFinishedPulling="2026-03-12 17:09:18.583112746 +0000 UTC m=+1101.124424085" observedRunningTime="2026-03-12 17:09:20.172352669 +0000 UTC m=+1102.713664008" watchObservedRunningTime="2026-03-12 17:09:20.201224596 +0000 UTC m=+1102.742535935"
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.205821 5184 reconciler_common.go:299] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ad645834-0761-42a0-8bf0-dd763b829aac-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.205840 5184 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad645834-0761-42a0-8bf0-dd763b829aac-config-data\") on node \"crc\" DevicePath \"\""
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.205850 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v8phq\" (UniqueName: \"kubernetes.io/projected/ad645834-0761-42a0-8bf0-dd763b829aac-kube-api-access-v8phq\") on node \"crc\" DevicePath \"\""
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.205860 5184 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ad645834-0761-42a0-8bf0-dd763b829aac-scripts\") on node \"crc\" DevicePath \"\""
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.205868 5184 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad645834-0761-42a0-8bf0-dd763b829aac-logs\") on node \"crc\" DevicePath \"\""
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.207683 5184 generic.go:358] "Generic (PLEG): container finished" podID="da615b33-3bf7-4b28-b95f-5c45f00679cb" containerID="699870c0956c723f907930eb010819f3e84bcd9207534ce43cdd1449d839cdcf" exitCode=0
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.207857 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77cdfb9675-lcv79"
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.207899 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77cdfb9675-lcv79" event={"ID":"da615b33-3bf7-4b28-b95f-5c45f00679cb","Type":"ContainerDied","Data":"699870c0956c723f907930eb010819f3e84bcd9207534ce43cdd1449d839cdcf"}
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.207928 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77cdfb9675-lcv79" event={"ID":"da615b33-3bf7-4b28-b95f-5c45f00679cb","Type":"ContainerDied","Data":"2913d1ae42fa93d9db23f7e84112804fbb04dacff5d8083808a00a1c849ce13b"}
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.215749 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9fc87f-mb797" event={"ID":"7d33c92d-847e-48b5-bb1b-a8defe0756f7","Type":"ContainerStarted","Data":"6abf4815e6024da2d2e5511d3112e2b6c2bb747058927306f1928e25cfd11214"}
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.215788 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/dnsmasq-dns-9fc87f-mb797"
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.237058 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/horizon-66446fcd8f-lmflm"]
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.281341 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-66446fcd8f-lmflm"]
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.290874 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9fc87f-mb797" podStartSLOduration=7.29085203 podStartE2EDuration="7.29085203s" podCreationTimestamp="2026-03-12 17:09:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:09:20.269153899 +0000 UTC m=+1102.810465238" watchObservedRunningTime="2026-03-12 17:09:20.29085203 +0000 UTC m=+1102.832163369"
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.338700 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/horizon-844b468785-jt9ph"]
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.356359 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-844b468785-jt9ph"]
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.384060 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/horizon-85fc5cfbb9-trxq5"]
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.389796 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-85fc5cfbb9-trxq5"]
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.416098 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="646cdc1b-863a-4b58-8869-fcbc386a96e2" path="/var/lib/kubelet/pods/646cdc1b-863a-4b58-8869-fcbc386a96e2/volumes"
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.418198 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad645834-0761-42a0-8bf0-dd763b829aac" path="/var/lib/kubelet/pods/ad645834-0761-42a0-8bf0-dd763b829aac/volumes"
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.418823 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af22a969-32c4-4628-8667-be2162f7d92d" path="/var/lib/kubelet/pods/af22a969-32c4-4628-8667-be2162f7d92d/volumes"
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.419996 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77cdfb9675-lcv79"]
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.421429 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77cdfb9675-lcv79"]
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.429191 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7cd5c99b94-hgvbf"
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.574536 5184 scope.go:117] "RemoveContainer" containerID="12f64b283f8a5bf8be5f912ea3c3eeb46e8266cbb7d4905e54743a6fc4b54364"
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.609781 5184 scope.go:117] "RemoveContainer" containerID="e97f3475f0fb80f59b27d2fd2949008caef58ad2a5e4fd52a06908adf8ffe29f"
Mar 12 17:09:20 crc kubenswrapper[5184]: E0312 17:09:20.610206 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e97f3475f0fb80f59b27d2fd2949008caef58ad2a5e4fd52a06908adf8ffe29f\": container with ID starting with e97f3475f0fb80f59b27d2fd2949008caef58ad2a5e4fd52a06908adf8ffe29f not found: ID does not exist" containerID="e97f3475f0fb80f59b27d2fd2949008caef58ad2a5e4fd52a06908adf8ffe29f"
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.610250 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e97f3475f0fb80f59b27d2fd2949008caef58ad2a5e4fd52a06908adf8ffe29f"} err="failed to get container status \"e97f3475f0fb80f59b27d2fd2949008caef58ad2a5e4fd52a06908adf8ffe29f\": rpc error: code = NotFound desc = could not find container \"e97f3475f0fb80f59b27d2fd2949008caef58ad2a5e4fd52a06908adf8ffe29f\": container with ID starting with e97f3475f0fb80f59b27d2fd2949008caef58ad2a5e4fd52a06908adf8ffe29f not found: ID does not exist"
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.610294 5184 scope.go:117] "RemoveContainer" containerID="12f64b283f8a5bf8be5f912ea3c3eeb46e8266cbb7d4905e54743a6fc4b54364"
Mar 12 17:09:20 crc kubenswrapper[5184]: E0312 17:09:20.611038 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12f64b283f8a5bf8be5f912ea3c3eeb46e8266cbb7d4905e54743a6fc4b54364\": container with ID starting with 12f64b283f8a5bf8be5f912ea3c3eeb46e8266cbb7d4905e54743a6fc4b54364 not found: ID does not exist" containerID="12f64b283f8a5bf8be5f912ea3c3eeb46e8266cbb7d4905e54743a6fc4b54364"
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.611063 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12f64b283f8a5bf8be5f912ea3c3eeb46e8266cbb7d4905e54743a6fc4b54364"} err="failed to get container status \"12f64b283f8a5bf8be5f912ea3c3eeb46e8266cbb7d4905e54743a6fc4b54364\": rpc error: code = NotFound desc = could not find container \"12f64b283f8a5bf8be5f912ea3c3eeb46e8266cbb7d4905e54743a6fc4b54364\": container with ID starting with 12f64b283f8a5bf8be5f912ea3c3eeb46e8266cbb7d4905e54743a6fc4b54364 not found: ID does not exist"
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.611078 5184 scope.go:117] "RemoveContainer" containerID="e97f3475f0fb80f59b27d2fd2949008caef58ad2a5e4fd52a06908adf8ffe29f"
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.611323 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e97f3475f0fb80f59b27d2fd2949008caef58ad2a5e4fd52a06908adf8ffe29f"} err="failed to get container status \"e97f3475f0fb80f59b27d2fd2949008caef58ad2a5e4fd52a06908adf8ffe29f\": rpc error: code = NotFound desc = could not find container \"e97f3475f0fb80f59b27d2fd2949008caef58ad2a5e4fd52a06908adf8ffe29f\": container with ID starting with e97f3475f0fb80f59b27d2fd2949008caef58ad2a5e4fd52a06908adf8ffe29f not found: ID does not exist"
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.611356 5184 scope.go:117] "RemoveContainer" containerID="12f64b283f8a5bf8be5f912ea3c3eeb46e8266cbb7d4905e54743a6fc4b54364"
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.616540 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12f64b283f8a5bf8be5f912ea3c3eeb46e8266cbb7d4905e54743a6fc4b54364"} err="failed to get container status \"12f64b283f8a5bf8be5f912ea3c3eeb46e8266cbb7d4905e54743a6fc4b54364\": rpc error: code = NotFound desc = could not find container \"12f64b283f8a5bf8be5f912ea3c3eeb46e8266cbb7d4905e54743a6fc4b54364\": container with ID starting with 12f64b283f8a5bf8be5f912ea3c3eeb46e8266cbb7d4905e54743a6fc4b54364 not found: ID does not exist"
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.616569 5184 scope.go:117] "RemoveContainer" containerID="030149c8b5637105bbee193dd218339dff5fa4b7dcdea135f43ed6f21bb3cb72"
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.742241 5184 patch_prober.go:28] interesting pod/machine-config-daemon-cp7pt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.742298 5184 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.902274 5184 scope.go:117] "RemoveContainer" containerID="7d3843f64ae7e3280518f6603612e635187d09ba68bd5996d6c0058c860a6cd9"
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.934471 5184 scope.go:117] "RemoveContainer" containerID="030149c8b5637105bbee193dd218339dff5fa4b7dcdea135f43ed6f21bb3cb72"
Mar 12 17:09:20 crc kubenswrapper[5184]: E0312 17:09:20.934918 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"030149c8b5637105bbee193dd218339dff5fa4b7dcdea135f43ed6f21bb3cb72\": container with ID starting with 030149c8b5637105bbee193dd218339dff5fa4b7dcdea135f43ed6f21bb3cb72 not found: ID does not exist" containerID="030149c8b5637105bbee193dd218339dff5fa4b7dcdea135f43ed6f21bb3cb72"
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.934949 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"030149c8b5637105bbee193dd218339dff5fa4b7dcdea135f43ed6f21bb3cb72"} err="failed to get container status \"030149c8b5637105bbee193dd218339dff5fa4b7dcdea135f43ed6f21bb3cb72\": rpc error: code = NotFound desc = could not find container \"030149c8b5637105bbee193dd218339dff5fa4b7dcdea135f43ed6f21bb3cb72\": container with ID starting with 030149c8b5637105bbee193dd218339dff5fa4b7dcdea135f43ed6f21bb3cb72 not found: ID does not exist"
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.934969 5184 scope.go:117] "RemoveContainer" containerID="7d3843f64ae7e3280518f6603612e635187d09ba68bd5996d6c0058c860a6cd9"
Mar 12 17:09:20 crc kubenswrapper[5184]: E0312 17:09:20.935395 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d3843f64ae7e3280518f6603612e635187d09ba68bd5996d6c0058c860a6cd9\": container with ID starting with 7d3843f64ae7e3280518f6603612e635187d09ba68bd5996d6c0058c860a6cd9 not found: ID does not exist" containerID="7d3843f64ae7e3280518f6603612e635187d09ba68bd5996d6c0058c860a6cd9"
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.935446 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d3843f64ae7e3280518f6603612e635187d09ba68bd5996d6c0058c860a6cd9"} err="failed to get container status \"7d3843f64ae7e3280518f6603612e635187d09ba68bd5996d6c0058c860a6cd9\": rpc error: code = NotFound desc = could not find container \"7d3843f64ae7e3280518f6603612e635187d09ba68bd5996d6c0058c860a6cd9\": container with ID starting with 7d3843f64ae7e3280518f6603612e635187d09ba68bd5996d6c0058c860a6cd9 not found: ID does not exist"
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.935476 5184 scope.go:117] "RemoveContainer" containerID="030149c8b5637105bbee193dd218339dff5fa4b7dcdea135f43ed6f21bb3cb72"
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.937352 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"030149c8b5637105bbee193dd218339dff5fa4b7dcdea135f43ed6f21bb3cb72"} err="failed to get container status \"030149c8b5637105bbee193dd218339dff5fa4b7dcdea135f43ed6f21bb3cb72\": rpc error: code = NotFound desc = could not find container \"030149c8b5637105bbee193dd218339dff5fa4b7dcdea135f43ed6f21bb3cb72\": container with ID starting with 030149c8b5637105bbee193dd218339dff5fa4b7dcdea135f43ed6f21bb3cb72 not found: ID does not exist"
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.937393 5184 scope.go:117] "RemoveContainer" containerID="7d3843f64ae7e3280518f6603612e635187d09ba68bd5996d6c0058c860a6cd9"
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.937583 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d3843f64ae7e3280518f6603612e635187d09ba68bd5996d6c0058c860a6cd9"} err="failed to get container status \"7d3843f64ae7e3280518f6603612e635187d09ba68bd5996d6c0058c860a6cd9\": rpc error: code = NotFound desc = could not find container \"7d3843f64ae7e3280518f6603612e635187d09ba68bd5996d6c0058c860a6cd9\": container with ID starting with 7d3843f64ae7e3280518f6603612e635187d09ba68bd5996d6c0058c860a6cd9 not found: ID does not exist"
Mar 12 17:09:20 crc kubenswrapper[5184]: I0312 17:09:20.937606 5184 scope.go:117] "RemoveContainer" containerID="322636176df9af1ac8acdec415b4ffd8f4652698259988fc8b6152a07ec3e055"
Mar 12 17:09:21 crc kubenswrapper[5184]: I0312 17:09:21.239033 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"246d17d3-b07a-4fe4-8165-711bcd72517f","Type":"ContainerStarted","Data":"7ec0a142a102ad955ebc109ed409ce67b8b6a9fd3b5631ad22fc17d1ab354718"}
Mar 12 17:09:21 crc kubenswrapper[5184]: I0312 17:09:21.239118 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/cinder-api-0"
Mar 12 17:09:21 crc kubenswrapper[5184]: I0312 17:09:21.239108 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="246d17d3-b07a-4fe4-8165-711bcd72517f" containerName="cinder-api-log" containerID="cri-o://5fdf1d7ebac6c679e3f972505eb5eb9e67aa6da80703bae62de7ecf931c1400d" gracePeriod=30
Mar 12 17:09:21 crc kubenswrapper[5184]: I0312 17:09:21.239167 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="246d17d3-b07a-4fe4-8165-711bcd72517f" containerName="cinder-api" containerID="cri-o://7ec0a142a102ad955ebc109ed409ce67b8b6a9fd3b5631ad22fc17d1ab354718" gracePeriod=30
Mar 12 17:09:21 crc kubenswrapper[5184]: I0312 17:09:21.267889 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"001b7f9f-058c-4037-af26-b94505164a68","Type":"ContainerStarted","Data":"98670b700c08ff6f9a4c2a47b41060b4256ca035570ab99ec132b7640d7f18ce"}
Mar 12 17:09:21 crc kubenswrapper[5184]: I0312 17:09:21.317798 5184 scope.go:117] "RemoveContainer" containerID="047d4ada29c754723425e0d2407189ea2173d2e73428211034634873730ed537"
Mar 12 17:09:21 crc kubenswrapper[5184]: I0312 17:09:21.339595 5184 scope.go:117] "RemoveContainer" containerID="322636176df9af1ac8acdec415b4ffd8f4652698259988fc8b6152a07ec3e055"
Mar 12 17:09:21 crc kubenswrapper[5184]: E0312 17:09:21.339956 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"322636176df9af1ac8acdec415b4ffd8f4652698259988fc8b6152a07ec3e055\": container with ID starting with 322636176df9af1ac8acdec415b4ffd8f4652698259988fc8b6152a07ec3e055 not found: ID does not exist" containerID="322636176df9af1ac8acdec415b4ffd8f4652698259988fc8b6152a07ec3e055"
Mar 12 17:09:21 crc kubenswrapper[5184]: I0312 17:09:21.340006 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"322636176df9af1ac8acdec415b4ffd8f4652698259988fc8b6152a07ec3e055"} err="failed to get container status \"322636176df9af1ac8acdec415b4ffd8f4652698259988fc8b6152a07ec3e055\": rpc error: code = NotFound desc = could not find container \"322636176df9af1ac8acdec415b4ffd8f4652698259988fc8b6152a07ec3e055\": container with ID starting with 322636176df9af1ac8acdec415b4ffd8f4652698259988fc8b6152a07ec3e055 not found: ID does not exist"
Mar 12 17:09:21 crc kubenswrapper[5184]: I0312 17:09:21.340026 5184 scope.go:117] "RemoveContainer" containerID="047d4ada29c754723425e0d2407189ea2173d2e73428211034634873730ed537"
Mar 12 17:09:21 crc kubenswrapper[5184]: E0312 17:09:21.340435 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"047d4ada29c754723425e0d2407189ea2173d2e73428211034634873730ed537\": container with ID starting with 047d4ada29c754723425e0d2407189ea2173d2e73428211034634873730ed537 not found: ID does not exist" containerID="047d4ada29c754723425e0d2407189ea2173d2e73428211034634873730ed537"
Mar 12 17:09:21 crc kubenswrapper[5184]: I0312 17:09:21.340461 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"047d4ada29c754723425e0d2407189ea2173d2e73428211034634873730ed537"} err="failed to get container status \"047d4ada29c754723425e0d2407189ea2173d2e73428211034634873730ed537\": rpc error: code = NotFound desc = could not find container \"047d4ada29c754723425e0d2407189ea2173d2e73428211034634873730ed537\": container with ID starting with 047d4ada29c754723425e0d2407189ea2173d2e73428211034634873730ed537 not found: ID does not exist"
Mar 12 17:09:21 crc kubenswrapper[5184]: I0312 17:09:21.340474 5184 scope.go:117] "RemoveContainer" containerID="322636176df9af1ac8acdec415b4ffd8f4652698259988fc8b6152a07ec3e055"
Mar 12 17:09:21 crc kubenswrapper[5184]: I0312 17:09:21.340853 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"322636176df9af1ac8acdec415b4ffd8f4652698259988fc8b6152a07ec3e055"} err="failed to get container status \"322636176df9af1ac8acdec415b4ffd8f4652698259988fc8b6152a07ec3e055\": rpc error: code = NotFound desc = could not find container \"322636176df9af1ac8acdec415b4ffd8f4652698259988fc8b6152a07ec3e055\": container with ID starting with 322636176df9af1ac8acdec415b4ffd8f4652698259988fc8b6152a07ec3e055 not found: ID does not exist"
Mar 12 17:09:21 crc kubenswrapper[5184]: I0312 17:09:21.340870 5184 scope.go:117] "RemoveContainer" containerID="047d4ada29c754723425e0d2407189ea2173d2e73428211034634873730ed537"
Mar 12 17:09:21 crc kubenswrapper[5184]: I0312 17:09:21.342539 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"047d4ada29c754723425e0d2407189ea2173d2e73428211034634873730ed537"} err="failed to get container status \"047d4ada29c754723425e0d2407189ea2173d2e73428211034634873730ed537\": rpc error: code = NotFound desc = could not find container \"047d4ada29c754723425e0d2407189ea2173d2e73428211034634873730ed537\": container with ID starting with 047d4ada29c754723425e0d2407189ea2173d2e73428211034634873730ed537 not found: ID does not exist"
Mar 12 17:09:21 crc kubenswrapper[5184]: I0312 17:09:21.342559 5184 scope.go:117] "RemoveContainer" containerID="699870c0956c723f907930eb010819f3e84bcd9207534ce43cdd1449d839cdcf"
Mar 12 17:09:21 crc kubenswrapper[5184]: I0312 17:09:21.555201 5184 scope.go:117] "RemoveContainer" containerID="699870c0956c723f907930eb010819f3e84bcd9207534ce43cdd1449d839cdcf"
Mar 12 17:09:21 crc kubenswrapper[5184]: E0312 17:09:21.556758 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"699870c0956c723f907930eb010819f3e84bcd9207534ce43cdd1449d839cdcf\": container with ID starting with 699870c0956c723f907930eb010819f3e84bcd9207534ce43cdd1449d839cdcf not found: ID does not exist" containerID="699870c0956c723f907930eb010819f3e84bcd9207534ce43cdd1449d839cdcf"
Mar 12 17:09:21 crc kubenswrapper[5184]: I0312 17:09:21.556810 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"699870c0956c723f907930eb010819f3e84bcd9207534ce43cdd1449d839cdcf"} err="failed to get container status \"699870c0956c723f907930eb010819f3e84bcd9207534ce43cdd1449d839cdcf\": rpc error: code = NotFound desc = could not find container \"699870c0956c723f907930eb010819f3e84bcd9207534ce43cdd1449d839cdcf\": container with ID starting with 699870c0956c723f907930eb010819f3e84bcd9207534ce43cdd1449d839cdcf not found: ID does not exist"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.279616 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"001b7f9f-058c-4037-af26-b94505164a68","Type":"ContainerStarted","Data":"c5d635b48f08f2105dfb09e88281a82178fa74ec722a9a19b0a1c94daaff8563"}
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.282371 5184 generic.go:358] "Generic (PLEG): container finished" podID="246d17d3-b07a-4fe4-8165-711bcd72517f" containerID="5fdf1d7ebac6c679e3f972505eb5eb9e67aa6da80703bae62de7ecf931c1400d" exitCode=143
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.282482 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"246d17d3-b07a-4fe4-8165-711bcd72517f","Type":"ContainerDied","Data":"5fdf1d7ebac6c679e3f972505eb5eb9e67aa6da80703bae62de7ecf931c1400d"}
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.300259 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=9.300238419 podStartE2EDuration="9.300238419s" podCreationTimestamp="2026-03-12 17:09:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:09:21.259336138 +0000 UTC m=+1103.800647487" watchObservedRunningTime="2026-03-12 17:09:22.300238419 +0000 UTC m=+1104.841549758"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.308228 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=7.933304299 podStartE2EDuration="9.3082052s" podCreationTimestamp="2026-03-12 17:09:13 +0000 UTC" firstStartedPulling="2026-03-12 17:09:18.523604108 +0000 UTC m=+1101.064915447" lastFinishedPulling="2026-03-12 17:09:19.898505019 +0000 UTC m=+1102.439816348" observedRunningTime="2026-03-12 17:09:22.303536863 +0000 UTC m=+1104.844848202" watchObservedRunningTime="2026-03-12 17:09:22.3082052 +0000 UTC m=+1104.849516539"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.364888 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-869b7dc84d-67g2c"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.422245 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da615b33-3bf7-4b28-b95f-5c45f00679cb" path="/var/lib/kubelet/pods/da615b33-3bf7-4b28-b95f-5c45f00679cb/volumes"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.594787 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b69fb49f7-dmgsd"]
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.595549 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/neutron-b69fb49f7-dmgsd" podUID="0765e7b4-b879-4989-8a31-486408b9cdce" containerName="neutron-api" containerID="cri-o://3b99a04cd7775019fa369700f675593f34d3fef966fdf724f5a08caef1917514" gracePeriod=30
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.595632 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/neutron-b69fb49f7-dmgsd" podUID="0765e7b4-b879-4989-8a31-486408b9cdce" containerName="neutron-httpd" containerID="cri-o://57ce9289747a13102e2b8b4676fe53e2205ceb948f0161b7a051a28d12aeb302" gracePeriod=30
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.611087 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-b69fb49f7-dmgsd"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.616226 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/neutron-7949fc945c-bns65"]
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.617214 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="af22a969-32c4-4628-8667-be2162f7d92d" containerName="horizon-log"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.617233 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="af22a969-32c4-4628-8667-be2162f7d92d" containerName="horizon-log"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.617244 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad645834-0761-42a0-8bf0-dd763b829aac" containerName="horizon"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.617251 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad645834-0761-42a0-8bf0-dd763b829aac" containerName="horizon"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.617261 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da615b33-3bf7-4b28-b95f-5c45f00679cb" containerName="init"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.617267 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="da615b33-3bf7-4b28-b95f-5c45f00679cb" containerName="init"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.617280 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad645834-0761-42a0-8bf0-dd763b829aac" containerName="horizon-log"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.617286 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad645834-0761-42a0-8bf0-dd763b829aac" containerName="horizon-log"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.617331 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="af22a969-32c4-4628-8667-be2162f7d92d" containerName="horizon"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.617337 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="af22a969-32c4-4628-8667-be2162f7d92d" containerName="horizon"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.617348 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="646cdc1b-863a-4b58-8869-fcbc386a96e2" containerName="horizon"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.617353 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="646cdc1b-863a-4b58-8869-fcbc386a96e2" containerName="horizon"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.617365 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="646cdc1b-863a-4b58-8869-fcbc386a96e2" containerName="horizon-log"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.617370 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="646cdc1b-863a-4b58-8869-fcbc386a96e2" containerName="horizon-log"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.617543 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="af22a969-32c4-4628-8667-be2162f7d92d" containerName="horizon-log"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.617557 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="af22a969-32c4-4628-8667-be2162f7d92d" containerName="horizon"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.617564 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad645834-0761-42a0-8bf0-dd763b829aac" containerName="horizon"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.617576 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad645834-0761-42a0-8bf0-dd763b829aac" containerName="horizon-log"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.617591 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="646cdc1b-863a-4b58-8869-fcbc386a96e2" containerName="horizon"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.617600 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="646cdc1b-863a-4b58-8869-fcbc386a96e2" containerName="horizon-log"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.617607 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="da615b33-3bf7-4b28-b95f-5c45f00679cb" containerName="init"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.648689 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7949fc945c-bns65"]
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.648826 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7949fc945c-bns65"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.705566 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7cd5c99b94-hgvbf"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.766204 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dd31119b-8094-4cd5-ad64-6786cd8c7dbe-config\") pod \"neutron-7949fc945c-bns65\" (UID: \"dd31119b-8094-4cd5-ad64-6786cd8c7dbe\") " pod="openstack/neutron-7949fc945c-bns65"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.766273 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd31119b-8094-4cd5-ad64-6786cd8c7dbe-public-tls-certs\") pod \"neutron-7949fc945c-bns65\" (UID: \"dd31119b-8094-4cd5-ad64-6786cd8c7dbe\") " pod="openstack/neutron-7949fc945c-bns65"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.766330 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd31119b-8094-4cd5-ad64-6786cd8c7dbe-ovndb-tls-certs\") pod \"neutron-7949fc945c-bns65\" (UID: \"dd31119b-8094-4cd5-ad64-6786cd8c7dbe\") " pod="openstack/neutron-7949fc945c-bns65"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.766345 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd31119b-8094-4cd5-ad64-6786cd8c7dbe-combined-ca-bundle\") pod \"neutron-7949fc945c-bns65\" (UID: \"dd31119b-8094-4cd5-ad64-6786cd8c7dbe\") " pod="openstack/neutron-7949fc945c-bns65"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.766363 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpv64\" (UniqueName: \"kubernetes.io/projected/dd31119b-8094-4cd5-ad64-6786cd8c7dbe-kube-api-access-gpv64\") pod \"neutron-7949fc945c-bns65\" (UID: \"dd31119b-8094-4cd5-ad64-6786cd8c7dbe\") " pod="openstack/neutron-7949fc945c-bns65"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.766416 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd31119b-8094-4cd5-ad64-6786cd8c7dbe-internal-tls-certs\") pod \"neutron-7949fc945c-bns65\" (UID: \"dd31119b-8094-4cd5-ad64-6786cd8c7dbe\") " pod="openstack/neutron-7949fc945c-bns65"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.766490 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/dd31119b-8094-4cd5-ad64-6786cd8c7dbe-httpd-config\") pod \"neutron-7949fc945c-bns65\" (UID: \"dd31119b-8094-4cd5-ad64-6786cd8c7dbe\") " pod="openstack/neutron-7949fc945c-bns65"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.776645 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/horizon-859ddbd78-2m2xk"]
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.776902 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/horizon-859ddbd78-2m2xk" podUID="ccf562d2-6ce1-4eb6-b27e-679493ce3870" containerName="horizon-log" containerID="cri-o://881e288299526eeda5e8bb60032448ab57c594785adad11983b46f43c6e06ae3" gracePeriod=30
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.777794 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/horizon-859ddbd78-2m2xk" podUID="ccf562d2-6ce1-4eb6-b27e-679493ce3870" containerName="horizon" containerID="cri-o://1b15bf9717411285614dfab9d07e6784fa5f11a28ed47e3b1c3e31c203d181c9" gracePeriod=30
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.867896 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dd31119b-8094-4cd5-ad64-6786cd8c7dbe-config\") pod \"neutron-7949fc945c-bns65\" (UID: \"dd31119b-8094-4cd5-ad64-6786cd8c7dbe\") " pod="openstack/neutron-7949fc945c-bns65"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.867969 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd31119b-8094-4cd5-ad64-6786cd8c7dbe-public-tls-certs\") pod \"neutron-7949fc945c-bns65\" (UID: \"dd31119b-8094-4cd5-ad64-6786cd8c7dbe\") " pod="openstack/neutron-7949fc945c-bns65"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.868051 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd31119b-8094-4cd5-ad64-6786cd8c7dbe-ovndb-tls-certs\") pod \"neutron-7949fc945c-bns65\" (UID: \"dd31119b-8094-4cd5-ad64-6786cd8c7dbe\") " pod="openstack/neutron-7949fc945c-bns65"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.868077 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd31119b-8094-4cd5-ad64-6786cd8c7dbe-combined-ca-bundle\") pod \"neutron-7949fc945c-bns65\" (UID: \"dd31119b-8094-4cd5-ad64-6786cd8c7dbe\") " pod="openstack/neutron-7949fc945c-bns65"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.868102 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gpv64\" (UniqueName: \"kubernetes.io/projected/dd31119b-8094-4cd5-ad64-6786cd8c7dbe-kube-api-access-gpv64\") pod \"neutron-7949fc945c-bns65\" (UID: \"dd31119b-8094-4cd5-ad64-6786cd8c7dbe\") " pod="openstack/neutron-7949fc945c-bns65"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.868149 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd31119b-8094-4cd5-ad64-6786cd8c7dbe-internal-tls-certs\") pod \"neutron-7949fc945c-bns65\" (UID: \"dd31119b-8094-4cd5-ad64-6786cd8c7dbe\") " pod="openstack/neutron-7949fc945c-bns65"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.868278 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/dd31119b-8094-4cd5-ad64-6786cd8c7dbe-httpd-config\") pod \"neutron-7949fc945c-bns65\" (UID: \"dd31119b-8094-4cd5-ad64-6786cd8c7dbe\") " pod="openstack/neutron-7949fc945c-bns65"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.875281 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/dd31119b-8094-4cd5-ad64-6786cd8c7dbe-config\") pod \"neutron-7949fc945c-bns65\" (UID: \"dd31119b-8094-4cd5-ad64-6786cd8c7dbe\") " pod="openstack/neutron-7949fc945c-bns65"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.875915 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd31119b-8094-4cd5-ad64-6786cd8c7dbe-internal-tls-certs\") pod \"neutron-7949fc945c-bns65\" (UID: \"dd31119b-8094-4cd5-ad64-6786cd8c7dbe\") " pod="openstack/neutron-7949fc945c-bns65"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.877346 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd31119b-8094-4cd5-ad64-6786cd8c7dbe-ovndb-tls-certs\") pod \"neutron-7949fc945c-bns65\" (UID: \"dd31119b-8094-4cd5-ad64-6786cd8c7dbe\") " pod="openstack/neutron-7949fc945c-bns65"
Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.888657 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/dd31119b-8094-4cd5-ad64-6786cd8c7dbe-httpd-config\") pod \"neutron-7949fc945c-bns65\"
(UID: \"dd31119b-8094-4cd5-ad64-6786cd8c7dbe\") " pod="openstack/neutron-7949fc945c-bns65" Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.889250 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd31119b-8094-4cd5-ad64-6786cd8c7dbe-combined-ca-bundle\") pod \"neutron-7949fc945c-bns65\" (UID: \"dd31119b-8094-4cd5-ad64-6786cd8c7dbe\") " pod="openstack/neutron-7949fc945c-bns65" Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.889473 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpv64\" (UniqueName: \"kubernetes.io/projected/dd31119b-8094-4cd5-ad64-6786cd8c7dbe-kube-api-access-gpv64\") pod \"neutron-7949fc945c-bns65\" (UID: \"dd31119b-8094-4cd5-ad64-6786cd8c7dbe\") " pod="openstack/neutron-7949fc945c-bns65" Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.893300 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd31119b-8094-4cd5-ad64-6786cd8c7dbe-public-tls-certs\") pod \"neutron-7949fc945c-bns65\" (UID: \"dd31119b-8094-4cd5-ad64-6786cd8c7dbe\") " pod="openstack/neutron-7949fc945c-bns65" Mar 12 17:09:22 crc kubenswrapper[5184]: I0312 17:09:22.965540 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7949fc945c-bns65" Mar 12 17:09:23 crc kubenswrapper[5184]: I0312 17:09:23.301830 5184 generic.go:358] "Generic (PLEG): container finished" podID="0765e7b4-b879-4989-8a31-486408b9cdce" containerID="57ce9289747a13102e2b8b4676fe53e2205ceb948f0161b7a051a28d12aeb302" exitCode=0 Mar 12 17:09:23 crc kubenswrapper[5184]: I0312 17:09:23.302162 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b69fb49f7-dmgsd" event={"ID":"0765e7b4-b879-4989-8a31-486408b9cdce","Type":"ContainerDied","Data":"57ce9289747a13102e2b8b4676fe53e2205ceb948f0161b7a051a28d12aeb302"} Mar 12 17:09:23 crc kubenswrapper[5184]: I0312 17:09:23.304830 5184 generic.go:358] "Generic (PLEG): container finished" podID="ccf562d2-6ce1-4eb6-b27e-679493ce3870" containerID="1b15bf9717411285614dfab9d07e6784fa5f11a28ed47e3b1c3e31c203d181c9" exitCode=0 Mar 12 17:09:23 crc kubenswrapper[5184]: I0312 17:09:23.304915 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-859ddbd78-2m2xk" event={"ID":"ccf562d2-6ce1-4eb6-b27e-679493ce3870","Type":"ContainerDied","Data":"1b15bf9717411285614dfab9d07e6784fa5f11a28ed47e3b1c3e31c203d181c9"} Mar 12 17:09:23 crc kubenswrapper[5184]: I0312 17:09:23.572708 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 12 17:09:25 crc kubenswrapper[5184]: I0312 17:09:25.451941 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dplgw" Mar 12 17:09:25 crc kubenswrapper[5184]: I0312 17:09:25.511797 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dplgw" Mar 12 17:09:25 crc kubenswrapper[5184]: I0312 17:09:25.683478 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dplgw"] Mar 12 17:09:26 crc kubenswrapper[5184]: I0312 17:09:26.270018 5184 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9fc87f-mb797" Mar 12 17:09:26 crc kubenswrapper[5184]: I0312 17:09:26.391135 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57fc7b664c-5c5pt"] Mar 12 17:09:26 crc kubenswrapper[5184]: I0312 17:09:26.391392 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57fc7b664c-5c5pt" podUID="013bb197-eb0e-4632-90f8-547f4452101e" containerName="dnsmasq-dns" containerID="cri-o://73563c03de213797146af3ecfeee7554e26f5bda4f7b0baa55a1074fb56d8797" gracePeriod=10 Mar 12 17:09:26 crc kubenswrapper[5184]: I0312 17:09:26.568744 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-z62x6" Mar 12 17:09:26 crc kubenswrapper[5184]: I0312 17:09:26.624976 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-z62x6" Mar 12 17:09:26 crc kubenswrapper[5184]: I0312 17:09:26.843134 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-66c64686b6-kwvcj" Mar 12 17:09:26 crc kubenswrapper[5184]: I0312 17:09:26.955515 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5865f6c4b6-r8frb" Mar 12 17:09:27 crc kubenswrapper[5184]: I0312 17:09:27.089796 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5865f6c4b6-r8frb" Mar 12 17:09:27 crc kubenswrapper[5184]: I0312 17:09:27.239079 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-66c64686b6-kwvcj" Mar 12 17:09:27 crc kubenswrapper[5184]: I0312 17:09:27.308876 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5865f6c4b6-r8frb"] Mar 12 17:09:27 crc kubenswrapper[5184]: I0312 17:09:27.439188 5184 generic.go:358] 
"Generic (PLEG): container finished" podID="013bb197-eb0e-4632-90f8-547f4452101e" containerID="73563c03de213797146af3ecfeee7554e26f5bda4f7b0baa55a1074fb56d8797" exitCode=0 Mar 12 17:09:27 crc kubenswrapper[5184]: I0312 17:09:27.439439 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57fc7b664c-5c5pt" event={"ID":"013bb197-eb0e-4632-90f8-547f4452101e","Type":"ContainerDied","Data":"73563c03de213797146af3ecfeee7554e26f5bda4f7b0baa55a1074fb56d8797"} Mar 12 17:09:27 crc kubenswrapper[5184]: I0312 17:09:27.439812 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dplgw" podUID="d3f7d154-f90e-4731-bc04-00b13b3fbfd8" containerName="registry-server" containerID="cri-o://6ca1f0347c3e06245aeafb34315f536429f7af0ae6532d71a76abc294f5798c0" gracePeriod=2 Mar 12 17:09:28 crc kubenswrapper[5184]: I0312 17:09:28.093214 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z62x6"] Mar 12 17:09:28 crc kubenswrapper[5184]: I0312 17:09:28.527572 5184 generic.go:358] "Generic (PLEG): container finished" podID="d3f7d154-f90e-4731-bc04-00b13b3fbfd8" containerID="6ca1f0347c3e06245aeafb34315f536429f7af0ae6532d71a76abc294f5798c0" exitCode=0 Mar 12 17:09:28 crc kubenswrapper[5184]: I0312 17:09:28.527745 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dplgw" event={"ID":"d3f7d154-f90e-4731-bc04-00b13b3fbfd8","Type":"ContainerDied","Data":"6ca1f0347c3e06245aeafb34315f536429f7af0ae6532d71a76abc294f5798c0"} Mar 12 17:09:28 crc kubenswrapper[5184]: I0312 17:09:28.528089 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-z62x6" podUID="3521e399-e317-459a-badc-0b4695197ac0" containerName="registry-server" containerID="cri-o://af15e72be38b83f42e895efc6440a892f9566b54625ee6e6698da088011c7906" gracePeriod=2 Mar 12 17:09:28 crc 
kubenswrapper[5184]: I0312 17:09:28.528275 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/barbican-api-5865f6c4b6-r8frb" podUID="f0ce447b-3a61-4c39-9f58-12985dbdb754" containerName="barbican-api-log" containerID="cri-o://46acf120c50da21a921741f68d8f5af275ca388bec96793cfb62f30cf1f9541e" gracePeriod=30 Mar 12 17:09:28 crc kubenswrapper[5184]: I0312 17:09:28.528786 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/barbican-api-5865f6c4b6-r8frb" podUID="f0ce447b-3a61-4c39-9f58-12985dbdb754" containerName="barbican-api" containerID="cri-o://1536506db9dbcf87c5a05b0c49a09247ce101e7244e37a69f1b6dc35a4b9d026" gracePeriod=30 Mar 12 17:09:28 crc kubenswrapper[5184]: I0312 17:09:28.541619 5184 prober.go:120] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-5865f6c4b6-r8frb" podUID="f0ce447b-3a61-4c39-9f58-12985dbdb754" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": EOF" Mar 12 17:09:28 crc kubenswrapper[5184]: I0312 17:09:28.541639 5184 prober.go:120] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-5865f6c4b6-r8frb" podUID="f0ce447b-3a61-4c39-9f58-12985dbdb754" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": EOF" Mar 12 17:09:28 crc kubenswrapper[5184]: I0312 17:09:28.541619 5184 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5865f6c4b6-r8frb" podUID="f0ce447b-3a61-4c39-9f58-12985dbdb754" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": EOF" Mar 12 17:09:28 crc kubenswrapper[5184]: I0312 17:09:28.851340 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 12 17:09:28 crc kubenswrapper[5184]: I0312 17:09:28.907034 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] 
Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.323031 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57fc7b664c-5c5pt" Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.463952 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/013bb197-eb0e-4632-90f8-547f4452101e-dns-svc\") pod \"013bb197-eb0e-4632-90f8-547f4452101e\" (UID: \"013bb197-eb0e-4632-90f8-547f4452101e\") " Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.464263 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skt42\" (UniqueName: \"kubernetes.io/projected/013bb197-eb0e-4632-90f8-547f4452101e-kube-api-access-skt42\") pod \"013bb197-eb0e-4632-90f8-547f4452101e\" (UID: \"013bb197-eb0e-4632-90f8-547f4452101e\") " Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.464319 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/013bb197-eb0e-4632-90f8-547f4452101e-ovsdbserver-nb\") pod \"013bb197-eb0e-4632-90f8-547f4452101e\" (UID: \"013bb197-eb0e-4632-90f8-547f4452101e\") " Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.464422 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/013bb197-eb0e-4632-90f8-547f4452101e-config\") pod \"013bb197-eb0e-4632-90f8-547f4452101e\" (UID: \"013bb197-eb0e-4632-90f8-547f4452101e\") " Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.464495 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/013bb197-eb0e-4632-90f8-547f4452101e-dns-swift-storage-0\") pod \"013bb197-eb0e-4632-90f8-547f4452101e\" (UID: \"013bb197-eb0e-4632-90f8-547f4452101e\") " Mar 12 17:09:29 crc kubenswrapper[5184]: 
I0312 17:09:29.464529 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/013bb197-eb0e-4632-90f8-547f4452101e-ovsdbserver-sb\") pod \"013bb197-eb0e-4632-90f8-547f4452101e\" (UID: \"013bb197-eb0e-4632-90f8-547f4452101e\") " Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.480760 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/013bb197-eb0e-4632-90f8-547f4452101e-kube-api-access-skt42" (OuterVolumeSpecName: "kube-api-access-skt42") pod "013bb197-eb0e-4632-90f8-547f4452101e" (UID: "013bb197-eb0e-4632-90f8-547f4452101e"). InnerVolumeSpecName "kube-api-access-skt42". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.505916 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.543288 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/013bb197-eb0e-4632-90f8-547f4452101e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "013bb197-eb0e-4632-90f8-547f4452101e" (UID: "013bb197-eb0e-4632-90f8-547f4452101e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.570318 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/013bb197-eb0e-4632-90f8-547f4452101e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "013bb197-eb0e-4632-90f8-547f4452101e" (UID: "013bb197-eb0e-4632-90f8-547f4452101e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.574975 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/013bb197-eb0e-4632-90f8-547f4452101e-config" (OuterVolumeSpecName: "config") pod "013bb197-eb0e-4632-90f8-547f4452101e" (UID: "013bb197-eb0e-4632-90f8-547f4452101e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.577413 5184 generic.go:358] "Generic (PLEG): container finished" podID="3521e399-e317-459a-badc-0b4695197ac0" containerID="af15e72be38b83f42e895efc6440a892f9566b54625ee6e6698da088011c7906" exitCode=0 Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.577566 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z62x6" event={"ID":"3521e399-e317-459a-badc-0b4695197ac0","Type":"ContainerDied","Data":"af15e72be38b83f42e895efc6440a892f9566b54625ee6e6698da088011c7906"} Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.580865 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-skt42\" (UniqueName: \"kubernetes.io/projected/013bb197-eb0e-4632-90f8-547f4452101e-kube-api-access-skt42\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.580938 5184 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/013bb197-eb0e-4632-90f8-547f4452101e-config\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.580998 5184 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/013bb197-eb0e-4632-90f8-547f4452101e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.581060 5184 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/013bb197-eb0e-4632-90f8-547f4452101e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.586893 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/013bb197-eb0e-4632-90f8-547f4452101e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "013bb197-eb0e-4632-90f8-547f4452101e" (UID: "013bb197-eb0e-4632-90f8-547f4452101e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.604406 5184 generic.go:358] "Generic (PLEG): container finished" podID="f0ce447b-3a61-4c39-9f58-12985dbdb754" containerID="46acf120c50da21a921741f68d8f5af275ca388bec96793cfb62f30cf1f9541e" exitCode=143 Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.604825 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5865f6c4b6-r8frb" event={"ID":"f0ce447b-3a61-4c39-9f58-12985dbdb754","Type":"ContainerDied","Data":"46acf120c50da21a921741f68d8f5af275ca388bec96793cfb62f30cf1f9541e"} Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.617647 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="001b7f9f-058c-4037-af26-b94505164a68" containerName="cinder-scheduler" containerID="cri-o://98670b700c08ff6f9a4c2a47b41060b4256ca035570ab99ec132b7640d7f18ce" gracePeriod=30 Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.618057 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57fc7b664c-5c5pt" Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.618069 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57fc7b664c-5c5pt" event={"ID":"013bb197-eb0e-4632-90f8-547f4452101e","Type":"ContainerDied","Data":"43ec1527ab1c19dd2cd035ec33fb4862fedc21da31b0a83585feada4917d9b7e"} Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.619051 5184 scope.go:117] "RemoveContainer" containerID="73563c03de213797146af3ecfeee7554e26f5bda4f7b0baa55a1074fb56d8797" Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.618119 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="001b7f9f-058c-4037-af26-b94505164a68" containerName="probe" containerID="cri-o://c5d635b48f08f2105dfb09e88281a82178fa74ec722a9a19b0a1c94daaff8563" gracePeriod=30 Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.661765 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z62x6" Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.669906 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/013bb197-eb0e-4632-90f8-547f4452101e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "013bb197-eb0e-4632-90f8-547f4452101e" (UID: "013bb197-eb0e-4632-90f8-547f4452101e"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.676827 5184 scope.go:117] "RemoveContainer" containerID="fe1ac0d55ed0615152e6b57fc5ecdb053d80f29d233ac256a283434a78ee3f31" Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.682475 5184 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/013bb197-eb0e-4632-90f8-547f4452101e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.682768 5184 reconciler_common.go:299] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/013bb197-eb0e-4632-90f8-547f4452101e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.729141 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dplgw" Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.783824 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3521e399-e317-459a-badc-0b4695197ac0-utilities\") pod \"3521e399-e317-459a-badc-0b4695197ac0\" (UID: \"3521e399-e317-459a-badc-0b4695197ac0\") " Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.785092 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3521e399-e317-459a-badc-0b4695197ac0-utilities" (OuterVolumeSpecName: "utilities") pod "3521e399-e317-459a-badc-0b4695197ac0" (UID: "3521e399-e317-459a-badc-0b4695197ac0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.785314 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9xjv\" (UniqueName: \"kubernetes.io/projected/3521e399-e317-459a-badc-0b4695197ac0-kube-api-access-n9xjv\") pod \"3521e399-e317-459a-badc-0b4695197ac0\" (UID: \"3521e399-e317-459a-badc-0b4695197ac0\") " Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.785450 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3521e399-e317-459a-badc-0b4695197ac0-catalog-content\") pod \"3521e399-e317-459a-badc-0b4695197ac0\" (UID: \"3521e399-e317-459a-badc-0b4695197ac0\") " Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.785771 5184 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3521e399-e317-459a-badc-0b4695197ac0-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.791531 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3521e399-e317-459a-badc-0b4695197ac0-kube-api-access-n9xjv" (OuterVolumeSpecName: "kube-api-access-n9xjv") pod "3521e399-e317-459a-badc-0b4695197ac0" (UID: "3521e399-e317-459a-badc-0b4695197ac0"). InnerVolumeSpecName "kube-api-access-n9xjv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.818952 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3521e399-e317-459a-badc-0b4695197ac0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3521e399-e317-459a-badc-0b4695197ac0" (UID: "3521e399-e317-459a-badc-0b4695197ac0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.887990 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3f7d154-f90e-4731-bc04-00b13b3fbfd8-utilities\") pod \"d3f7d154-f90e-4731-bc04-00b13b3fbfd8\" (UID: \"d3f7d154-f90e-4731-bc04-00b13b3fbfd8\") " Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.888325 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3f7d154-f90e-4731-bc04-00b13b3fbfd8-catalog-content\") pod \"d3f7d154-f90e-4731-bc04-00b13b3fbfd8\" (UID: \"d3f7d154-f90e-4731-bc04-00b13b3fbfd8\") " Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.888396 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rz64q\" (UniqueName: \"kubernetes.io/projected/d3f7d154-f90e-4731-bc04-00b13b3fbfd8-kube-api-access-rz64q\") pod \"d3f7d154-f90e-4731-bc04-00b13b3fbfd8\" (UID: \"d3f7d154-f90e-4731-bc04-00b13b3fbfd8\") " Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.888804 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n9xjv\" (UniqueName: \"kubernetes.io/projected/3521e399-e317-459a-badc-0b4695197ac0-kube-api-access-n9xjv\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.888822 5184 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3521e399-e317-459a-badc-0b4695197ac0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.900558 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3f7d154-f90e-4731-bc04-00b13b3fbfd8-kube-api-access-rz64q" (OuterVolumeSpecName: "kube-api-access-rz64q") pod "d3f7d154-f90e-4731-bc04-00b13b3fbfd8" (UID: 
"d3f7d154-f90e-4731-bc04-00b13b3fbfd8"). InnerVolumeSpecName "kube-api-access-rz64q". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.903924 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3f7d154-f90e-4731-bc04-00b13b3fbfd8-utilities" (OuterVolumeSpecName: "utilities") pod "d3f7d154-f90e-4731-bc04-00b13b3fbfd8" (UID: "d3f7d154-f90e-4731-bc04-00b13b3fbfd8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.926343 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3f7d154-f90e-4731-bc04-00b13b3fbfd8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d3f7d154-f90e-4731-bc04-00b13b3fbfd8" (UID: "d3f7d154-f90e-4731-bc04-00b13b3fbfd8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.959822 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57fc7b664c-5c5pt"] Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.968524 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57fc7b664c-5c5pt"] Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.979267 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7949fc945c-bns65"] Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.991234 5184 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3f7d154-f90e-4731-bc04-00b13b3fbfd8-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.991269 5184 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3f7d154-f90e-4731-bc04-00b13b3fbfd8-catalog-content\") on 
node \"crc\" DevicePath \"\"" Mar 12 17:09:29 crc kubenswrapper[5184]: I0312 17:09:29.991285 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rz64q\" (UniqueName: \"kubernetes.io/projected/d3f7d154-f90e-4731-bc04-00b13b3fbfd8-kube-api-access-rz64q\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:30 crc kubenswrapper[5184]: I0312 17:09:30.410067 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="013bb197-eb0e-4632-90f8-547f4452101e" path="/var/lib/kubelet/pods/013bb197-eb0e-4632-90f8-547f4452101e/volumes" Mar 12 17:09:30 crc kubenswrapper[5184]: I0312 17:09:30.583944 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-f484d5cc6-qld48" Mar 12 17:09:30 crc kubenswrapper[5184]: I0312 17:09:30.627667 5184 generic.go:358] "Generic (PLEG): container finished" podID="001b7f9f-058c-4037-af26-b94505164a68" containerID="c5d635b48f08f2105dfb09e88281a82178fa74ec722a9a19b0a1c94daaff8563" exitCode=0 Mar 12 17:09:30 crc kubenswrapper[5184]: I0312 17:09:30.627728 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"001b7f9f-058c-4037-af26-b94505164a68","Type":"ContainerDied","Data":"c5d635b48f08f2105dfb09e88281a82178fa74ec722a9a19b0a1c94daaff8563"} Mar 12 17:09:30 crc kubenswrapper[5184]: I0312 17:09:30.631842 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z62x6" Mar 12 17:09:30 crc kubenswrapper[5184]: I0312 17:09:30.631833 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z62x6" event={"ID":"3521e399-e317-459a-badc-0b4695197ac0","Type":"ContainerDied","Data":"60354f4c6d032518cecb293d5051b3f5a0ef260c6cb21b2b8d5cd293cc48cd76"} Mar 12 17:09:30 crc kubenswrapper[5184]: I0312 17:09:30.631981 5184 scope.go:117] "RemoveContainer" containerID="af15e72be38b83f42e895efc6440a892f9566b54625ee6e6698da088011c7906" Mar 12 17:09:30 crc kubenswrapper[5184]: I0312 17:09:30.635646 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dplgw" event={"ID":"d3f7d154-f90e-4731-bc04-00b13b3fbfd8","Type":"ContainerDied","Data":"226f977bc3ccfc2db5f1e65ba18c28cd7276dacba624a7cc20ef0329a4d8f6dc"} Mar 12 17:09:30 crc kubenswrapper[5184]: I0312 17:09:30.635781 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dplgw" Mar 12 17:09:30 crc kubenswrapper[5184]: I0312 17:09:30.639106 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1","Type":"ContainerStarted","Data":"e1931efe8ab98f45a77d4e839d5f5c8986c94bac3ecf66644a727ca2ff79c21c"} Mar 12 17:09:30 crc kubenswrapper[5184]: I0312 17:09:30.639193 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/ceilometer-0" Mar 12 17:09:30 crc kubenswrapper[5184]: I0312 17:09:30.639424 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1" containerName="proxy-httpd" containerID="cri-o://e1931efe8ab98f45a77d4e839d5f5c8986c94bac3ecf66644a727ca2ff79c21c" gracePeriod=30 Mar 12 17:09:30 crc kubenswrapper[5184]: I0312 17:09:30.639430 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1" containerName="sg-core" containerID="cri-o://95f938870d1a66767fdd5c7147c287517381b4b5d229780159a9e039c953ef07" gracePeriod=30 Mar 12 17:09:30 crc kubenswrapper[5184]: I0312 17:09:30.639401 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1" containerName="ceilometer-central-agent" containerID="cri-o://229c265b683e912c620e9e0f730acb9df2d8475e613170a912a122643740e27e" gracePeriod=30 Mar 12 17:09:30 crc kubenswrapper[5184]: I0312 17:09:30.639603 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1" containerName="ceilometer-notification-agent" containerID="cri-o://943e06caa8e5d76068f507ba5cf5ebb9180ad92638b32bfce25c70ab2ff245f1" gracePeriod=30 Mar 12 17:09:30 crc kubenswrapper[5184]: I0312 
17:09:30.650883 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7949fc945c-bns65" event={"ID":"dd31119b-8094-4cd5-ad64-6786cd8c7dbe","Type":"ContainerStarted","Data":"3b65de6777518468d042533c772d4235a3ab94d624c27683aaab32cf25bf8924"} Mar 12 17:09:30 crc kubenswrapper[5184]: I0312 17:09:30.651173 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/neutron-7949fc945c-bns65" Mar 12 17:09:30 crc kubenswrapper[5184]: I0312 17:09:30.651281 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7949fc945c-bns65" event={"ID":"dd31119b-8094-4cd5-ad64-6786cd8c7dbe","Type":"ContainerStarted","Data":"69bbe2278de6d1e507695af832f0932f0719f4ff800de5db5eb303440e7daad7"} Mar 12 17:09:30 crc kubenswrapper[5184]: I0312 17:09:30.651398 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7949fc945c-bns65" event={"ID":"dd31119b-8094-4cd5-ad64-6786cd8c7dbe","Type":"ContainerStarted","Data":"1b332926323c6d5598e8dcd9c5e12b4f6a7028066215e9e2e5de3dbd04c1f677"} Mar 12 17:09:30 crc kubenswrapper[5184]: I0312 17:09:30.676330 5184 scope.go:117] "RemoveContainer" containerID="450b23bd19acd57886378b3be7292a505dff6a6c249fdfd87c698a12f81b41da" Mar 12 17:09:30 crc kubenswrapper[5184]: I0312 17:09:30.689430 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z62x6"] Mar 12 17:09:30 crc kubenswrapper[5184]: I0312 17:09:30.699206 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-z62x6"] Mar 12 17:09:30 crc kubenswrapper[5184]: I0312 17:09:30.710276 5184 scope.go:117] "RemoveContainer" containerID="0038de48bfb7823a498c503e67d491b8330d5ebe76c73c5b0a09920b91693006" Mar 12 17:09:30 crc kubenswrapper[5184]: I0312 17:09:30.713828 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.46492907 
podStartE2EDuration="1m16.713810722s" podCreationTimestamp="2026-03-12 17:08:14 +0000 UTC" firstStartedPulling="2026-03-12 17:08:17.09167063 +0000 UTC m=+1039.632981969" lastFinishedPulling="2026-03-12 17:09:29.340552292 +0000 UTC m=+1111.881863621" observedRunningTime="2026-03-12 17:09:30.706465061 +0000 UTC m=+1113.247776420" watchObservedRunningTime="2026-03-12 17:09:30.713810722 +0000 UTC m=+1113.255122061" Mar 12 17:09:30 crc kubenswrapper[5184]: I0312 17:09:30.732268 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dplgw"] Mar 12 17:09:30 crc kubenswrapper[5184]: I0312 17:09:30.744183 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dplgw"] Mar 12 17:09:30 crc kubenswrapper[5184]: I0312 17:09:30.765516 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7949fc945c-bns65" podStartSLOduration=8.765493205 podStartE2EDuration="8.765493205s" podCreationTimestamp="2026-03-12 17:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:09:30.744008621 +0000 UTC m=+1113.285319980" watchObservedRunningTime="2026-03-12 17:09:30.765493205 +0000 UTC m=+1113.306804564" Mar 12 17:09:30 crc kubenswrapper[5184]: I0312 17:09:30.773162 5184 scope.go:117] "RemoveContainer" containerID="6ca1f0347c3e06245aeafb34315f536429f7af0ae6532d71a76abc294f5798c0" Mar 12 17:09:30 crc kubenswrapper[5184]: I0312 17:09:30.807533 5184 scope.go:117] "RemoveContainer" containerID="9aa3cfb372ea4933cb3cc159b27754a0f79722067cdfef5e42fed14c31a0ded5" Mar 12 17:09:30 crc kubenswrapper[5184]: I0312 17:09:30.833596 5184 scope.go:117] "RemoveContainer" containerID="a6e0291d5c44eb2683daec10e819257ab5a5da20b54a7ad474c2eabd2ca7c154" Mar 12 17:09:31 crc kubenswrapper[5184]: I0312 17:09:31.669937 5184 generic.go:358] "Generic (PLEG): container finished" 
podID="e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1" containerID="e1931efe8ab98f45a77d4e839d5f5c8986c94bac3ecf66644a727ca2ff79c21c" exitCode=0 Mar 12 17:09:31 crc kubenswrapper[5184]: I0312 17:09:31.670203 5184 generic.go:358] "Generic (PLEG): container finished" podID="e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1" containerID="95f938870d1a66767fdd5c7147c287517381b4b5d229780159a9e039c953ef07" exitCode=2 Mar 12 17:09:31 crc kubenswrapper[5184]: I0312 17:09:31.670213 5184 generic.go:358] "Generic (PLEG): container finished" podID="e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1" containerID="229c265b683e912c620e9e0f730acb9df2d8475e613170a912a122643740e27e" exitCode=0 Mar 12 17:09:31 crc kubenswrapper[5184]: I0312 17:09:31.670229 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1","Type":"ContainerDied","Data":"e1931efe8ab98f45a77d4e839d5f5c8986c94bac3ecf66644a727ca2ff79c21c"} Mar 12 17:09:31 crc kubenswrapper[5184]: I0312 17:09:31.670274 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1","Type":"ContainerDied","Data":"95f938870d1a66767fdd5c7147c287517381b4b5d229780159a9e039c953ef07"} Mar 12 17:09:31 crc kubenswrapper[5184]: I0312 17:09:31.670284 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1","Type":"ContainerDied","Data":"229c265b683e912c620e9e0f730acb9df2d8475e613170a912a122643740e27e"} Mar 12 17:09:31 crc kubenswrapper[5184]: I0312 17:09:31.681551 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-f484d5cc6-qld48" Mar 12 17:09:31 crc kubenswrapper[5184]: I0312 17:09:31.970591 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/placement-5b65568c5b-pr7s4"] Mar 12 17:09:31 crc kubenswrapper[5184]: I0312 17:09:31.971794 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: 
removing container" podUID="d3f7d154-f90e-4731-bc04-00b13b3fbfd8" containerName="extract-utilities" Mar 12 17:09:31 crc kubenswrapper[5184]: I0312 17:09:31.971819 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3f7d154-f90e-4731-bc04-00b13b3fbfd8" containerName="extract-utilities" Mar 12 17:09:31 crc kubenswrapper[5184]: I0312 17:09:31.971862 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3521e399-e317-459a-badc-0b4695197ac0" containerName="registry-server" Mar 12 17:09:31 crc kubenswrapper[5184]: I0312 17:09:31.971870 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="3521e399-e317-459a-badc-0b4695197ac0" containerName="registry-server" Mar 12 17:09:31 crc kubenswrapper[5184]: I0312 17:09:31.971909 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d3f7d154-f90e-4731-bc04-00b13b3fbfd8" containerName="extract-content" Mar 12 17:09:31 crc kubenswrapper[5184]: I0312 17:09:31.971917 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3f7d154-f90e-4731-bc04-00b13b3fbfd8" containerName="extract-content" Mar 12 17:09:31 crc kubenswrapper[5184]: I0312 17:09:31.971940 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3521e399-e317-459a-badc-0b4695197ac0" containerName="extract-utilities" Mar 12 17:09:31 crc kubenswrapper[5184]: I0312 17:09:31.971948 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="3521e399-e317-459a-badc-0b4695197ac0" containerName="extract-utilities" Mar 12 17:09:31 crc kubenswrapper[5184]: I0312 17:09:31.971967 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d3f7d154-f90e-4731-bc04-00b13b3fbfd8" containerName="registry-server" Mar 12 17:09:31 crc kubenswrapper[5184]: I0312 17:09:31.971974 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3f7d154-f90e-4731-bc04-00b13b3fbfd8" containerName="registry-server" Mar 12 17:09:31 crc kubenswrapper[5184]: I0312 
17:09:31.971989 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="013bb197-eb0e-4632-90f8-547f4452101e" containerName="init" Mar 12 17:09:31 crc kubenswrapper[5184]: I0312 17:09:31.971997 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="013bb197-eb0e-4632-90f8-547f4452101e" containerName="init" Mar 12 17:09:31 crc kubenswrapper[5184]: I0312 17:09:31.972022 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3521e399-e317-459a-badc-0b4695197ac0" containerName="extract-content" Mar 12 17:09:31 crc kubenswrapper[5184]: I0312 17:09:31.972029 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="3521e399-e317-459a-badc-0b4695197ac0" containerName="extract-content" Mar 12 17:09:31 crc kubenswrapper[5184]: I0312 17:09:31.972042 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="013bb197-eb0e-4632-90f8-547f4452101e" containerName="dnsmasq-dns" Mar 12 17:09:31 crc kubenswrapper[5184]: I0312 17:09:31.972048 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="013bb197-eb0e-4632-90f8-547f4452101e" containerName="dnsmasq-dns" Mar 12 17:09:31 crc kubenswrapper[5184]: I0312 17:09:31.972239 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="013bb197-eb0e-4632-90f8-547f4452101e" containerName="dnsmasq-dns" Mar 12 17:09:31 crc kubenswrapper[5184]: I0312 17:09:31.972266 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="3521e399-e317-459a-badc-0b4695197ac0" containerName="registry-server" Mar 12 17:09:31 crc kubenswrapper[5184]: I0312 17:09:31.972286 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="d3f7d154-f90e-4731-bc04-00b13b3fbfd8" containerName="registry-server" Mar 12 17:09:32 crc kubenswrapper[5184]: I0312 17:09:32.079691 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5b65568c5b-pr7s4"] Mar 12 17:09:32 crc kubenswrapper[5184]: I0312 17:09:32.079861 5184 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5b65568c5b-pr7s4" Mar 12 17:09:32 crc kubenswrapper[5184]: I0312 17:09:32.128333 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d77574d5-b3c7-434b-8499-c38b3e2886e8-combined-ca-bundle\") pod \"placement-5b65568c5b-pr7s4\" (UID: \"d77574d5-b3c7-434b-8499-c38b3e2886e8\") " pod="openstack/placement-5b65568c5b-pr7s4" Mar 12 17:09:32 crc kubenswrapper[5184]: I0312 17:09:32.128476 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d77574d5-b3c7-434b-8499-c38b3e2886e8-public-tls-certs\") pod \"placement-5b65568c5b-pr7s4\" (UID: \"d77574d5-b3c7-434b-8499-c38b3e2886e8\") " pod="openstack/placement-5b65568c5b-pr7s4" Mar 12 17:09:32 crc kubenswrapper[5184]: I0312 17:09:32.128527 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d77574d5-b3c7-434b-8499-c38b3e2886e8-internal-tls-certs\") pod \"placement-5b65568c5b-pr7s4\" (UID: \"d77574d5-b3c7-434b-8499-c38b3e2886e8\") " pod="openstack/placement-5b65568c5b-pr7s4" Mar 12 17:09:32 crc kubenswrapper[5184]: I0312 17:09:32.128560 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d77574d5-b3c7-434b-8499-c38b3e2886e8-scripts\") pod \"placement-5b65568c5b-pr7s4\" (UID: \"d77574d5-b3c7-434b-8499-c38b3e2886e8\") " pod="openstack/placement-5b65568c5b-pr7s4" Mar 12 17:09:32 crc kubenswrapper[5184]: I0312 17:09:32.128630 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d77574d5-b3c7-434b-8499-c38b3e2886e8-logs\") pod 
\"placement-5b65568c5b-pr7s4\" (UID: \"d77574d5-b3c7-434b-8499-c38b3e2886e8\") " pod="openstack/placement-5b65568c5b-pr7s4" Mar 12 17:09:32 crc kubenswrapper[5184]: I0312 17:09:32.128660 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mmgs\" (UniqueName: \"kubernetes.io/projected/d77574d5-b3c7-434b-8499-c38b3e2886e8-kube-api-access-2mmgs\") pod \"placement-5b65568c5b-pr7s4\" (UID: \"d77574d5-b3c7-434b-8499-c38b3e2886e8\") " pod="openstack/placement-5b65568c5b-pr7s4" Mar 12 17:09:32 crc kubenswrapper[5184]: I0312 17:09:32.128764 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d77574d5-b3c7-434b-8499-c38b3e2886e8-config-data\") pod \"placement-5b65568c5b-pr7s4\" (UID: \"d77574d5-b3c7-434b-8499-c38b3e2886e8\") " pod="openstack/placement-5b65568c5b-pr7s4" Mar 12 17:09:32 crc kubenswrapper[5184]: I0312 17:09:32.230454 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d77574d5-b3c7-434b-8499-c38b3e2886e8-config-data\") pod \"placement-5b65568c5b-pr7s4\" (UID: \"d77574d5-b3c7-434b-8499-c38b3e2886e8\") " pod="openstack/placement-5b65568c5b-pr7s4" Mar 12 17:09:32 crc kubenswrapper[5184]: I0312 17:09:32.230536 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d77574d5-b3c7-434b-8499-c38b3e2886e8-combined-ca-bundle\") pod \"placement-5b65568c5b-pr7s4\" (UID: \"d77574d5-b3c7-434b-8499-c38b3e2886e8\") " pod="openstack/placement-5b65568c5b-pr7s4" Mar 12 17:09:32 crc kubenswrapper[5184]: I0312 17:09:32.230592 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d77574d5-b3c7-434b-8499-c38b3e2886e8-public-tls-certs\") pod \"placement-5b65568c5b-pr7s4\" 
(UID: \"d77574d5-b3c7-434b-8499-c38b3e2886e8\") " pod="openstack/placement-5b65568c5b-pr7s4" Mar 12 17:09:32 crc kubenswrapper[5184]: I0312 17:09:32.230617 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d77574d5-b3c7-434b-8499-c38b3e2886e8-internal-tls-certs\") pod \"placement-5b65568c5b-pr7s4\" (UID: \"d77574d5-b3c7-434b-8499-c38b3e2886e8\") " pod="openstack/placement-5b65568c5b-pr7s4" Mar 12 17:09:32 crc kubenswrapper[5184]: I0312 17:09:32.230635 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d77574d5-b3c7-434b-8499-c38b3e2886e8-scripts\") pod \"placement-5b65568c5b-pr7s4\" (UID: \"d77574d5-b3c7-434b-8499-c38b3e2886e8\") " pod="openstack/placement-5b65568c5b-pr7s4" Mar 12 17:09:32 crc kubenswrapper[5184]: I0312 17:09:32.230682 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d77574d5-b3c7-434b-8499-c38b3e2886e8-logs\") pod \"placement-5b65568c5b-pr7s4\" (UID: \"d77574d5-b3c7-434b-8499-c38b3e2886e8\") " pod="openstack/placement-5b65568c5b-pr7s4" Mar 12 17:09:32 crc kubenswrapper[5184]: I0312 17:09:32.230709 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mmgs\" (UniqueName: \"kubernetes.io/projected/d77574d5-b3c7-434b-8499-c38b3e2886e8-kube-api-access-2mmgs\") pod \"placement-5b65568c5b-pr7s4\" (UID: \"d77574d5-b3c7-434b-8499-c38b3e2886e8\") " pod="openstack/placement-5b65568c5b-pr7s4" Mar 12 17:09:32 crc kubenswrapper[5184]: I0312 17:09:32.231579 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d77574d5-b3c7-434b-8499-c38b3e2886e8-logs\") pod \"placement-5b65568c5b-pr7s4\" (UID: \"d77574d5-b3c7-434b-8499-c38b3e2886e8\") " pod="openstack/placement-5b65568c5b-pr7s4" Mar 12 17:09:32 crc 
kubenswrapper[5184]: I0312 17:09:32.240710 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d77574d5-b3c7-434b-8499-c38b3e2886e8-config-data\") pod \"placement-5b65568c5b-pr7s4\" (UID: \"d77574d5-b3c7-434b-8499-c38b3e2886e8\") " pod="openstack/placement-5b65568c5b-pr7s4" Mar 12 17:09:32 crc kubenswrapper[5184]: I0312 17:09:32.241963 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d77574d5-b3c7-434b-8499-c38b3e2886e8-combined-ca-bundle\") pod \"placement-5b65568c5b-pr7s4\" (UID: \"d77574d5-b3c7-434b-8499-c38b3e2886e8\") " pod="openstack/placement-5b65568c5b-pr7s4" Mar 12 17:09:32 crc kubenswrapper[5184]: I0312 17:09:32.243257 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d77574d5-b3c7-434b-8499-c38b3e2886e8-internal-tls-certs\") pod \"placement-5b65568c5b-pr7s4\" (UID: \"d77574d5-b3c7-434b-8499-c38b3e2886e8\") " pod="openstack/placement-5b65568c5b-pr7s4" Mar 12 17:09:32 crc kubenswrapper[5184]: I0312 17:09:32.248703 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d77574d5-b3c7-434b-8499-c38b3e2886e8-scripts\") pod \"placement-5b65568c5b-pr7s4\" (UID: \"d77574d5-b3c7-434b-8499-c38b3e2886e8\") " pod="openstack/placement-5b65568c5b-pr7s4" Mar 12 17:09:32 crc kubenswrapper[5184]: I0312 17:09:32.249976 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d77574d5-b3c7-434b-8499-c38b3e2886e8-public-tls-certs\") pod \"placement-5b65568c5b-pr7s4\" (UID: \"d77574d5-b3c7-434b-8499-c38b3e2886e8\") " pod="openstack/placement-5b65568c5b-pr7s4" Mar 12 17:09:32 crc kubenswrapper[5184]: I0312 17:09:32.254793 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-2mmgs\" (UniqueName: \"kubernetes.io/projected/d77574d5-b3c7-434b-8499-c38b3e2886e8-kube-api-access-2mmgs\") pod \"placement-5b65568c5b-pr7s4\" (UID: \"d77574d5-b3c7-434b-8499-c38b3e2886e8\") " pod="openstack/placement-5b65568c5b-pr7s4" Mar 12 17:09:32 crc kubenswrapper[5184]: I0312 17:09:32.399230 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5b65568c5b-pr7s4" Mar 12 17:09:32 crc kubenswrapper[5184]: I0312 17:09:32.409834 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3521e399-e317-459a-badc-0b4695197ac0" path="/var/lib/kubelet/pods/3521e399-e317-459a-badc-0b4695197ac0/volumes" Mar 12 17:09:32 crc kubenswrapper[5184]: I0312 17:09:32.410489 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3f7d154-f90e-4731-bc04-00b13b3fbfd8" path="/var/lib/kubelet/pods/d3f7d154-f90e-4731-bc04-00b13b3fbfd8/volumes" Mar 12 17:09:32 crc kubenswrapper[5184]: I0312 17:09:32.691813 5184 generic.go:358] "Generic (PLEG): container finished" podID="001b7f9f-058c-4037-af26-b94505164a68" containerID="98670b700c08ff6f9a4c2a47b41060b4256ca035570ab99ec132b7640d7f18ce" exitCode=0 Mar 12 17:09:32 crc kubenswrapper[5184]: I0312 17:09:32.692127 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"001b7f9f-058c-4037-af26-b94505164a68","Type":"ContainerDied","Data":"98670b700c08ff6f9a4c2a47b41060b4256ca035570ab99ec132b7640d7f18ce"} Mar 12 17:09:33 crc kubenswrapper[5184]: I0312 17:09:33.591679 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5b65568c5b-pr7s4"] Mar 12 17:09:33 crc kubenswrapper[5184]: W0312 17:09:33.599232 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd77574d5_b3c7_434b_8499_c38b3e2886e8.slice/crio-eb6b126d18c80179cd18c319b1a2a3a3bc145d7bba9905ab116cb03582845b74 WatchSource:0}: Error finding 
container eb6b126d18c80179cd18c319b1a2a3a3bc145d7bba9905ab116cb03582845b74: Status 404 returned error can't find the container with id eb6b126d18c80179cd18c319b1a2a3a3bc145d7bba9905ab116cb03582845b74 Mar 12 17:09:33 crc kubenswrapper[5184]: I0312 17:09:33.753545 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5b65568c5b-pr7s4" event={"ID":"d77574d5-b3c7-434b-8499-c38b3e2886e8","Type":"ContainerStarted","Data":"eb6b126d18c80179cd18c319b1a2a3a3bc145d7bba9905ab116cb03582845b74"} Mar 12 17:09:33 crc kubenswrapper[5184]: I0312 17:09:33.762714 5184 generic.go:358] "Generic (PLEG): container finished" podID="e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1" containerID="943e06caa8e5d76068f507ba5cf5ebb9180ad92638b32bfce25c70ab2ff245f1" exitCode=0 Mar 12 17:09:33 crc kubenswrapper[5184]: I0312 17:09:33.762905 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1","Type":"ContainerDied","Data":"943e06caa8e5d76068f507ba5cf5ebb9180ad92638b32bfce25c70ab2ff245f1"} Mar 12 17:09:34 crc kubenswrapper[5184]: I0312 17:09:34.566719 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 12 17:09:34 crc kubenswrapper[5184]: I0312 17:09:34.591548 5184 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5865f6c4b6-r8frb" podUID="f0ce447b-3a61-4c39-9f58-12985dbdb754" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": read tcp 10.217.0.2:50284->10.217.0.163:9311: read: connection reset by peer" Mar 12 17:09:34 crc kubenswrapper[5184]: I0312 17:09:34.591919 5184 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5865f6c4b6-r8frb" podUID="f0ce447b-3a61-4c39-9f58-12985dbdb754" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": read tcp 10.217.0.2:50270->10.217.0.163:9311: read: connection reset by peer" Mar 12 17:09:34 crc kubenswrapper[5184]: I0312 17:09:34.741356 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/001b7f9f-058c-4037-af26-b94505164a68-combined-ca-bundle\") pod \"001b7f9f-058c-4037-af26-b94505164a68\" (UID: \"001b7f9f-058c-4037-af26-b94505164a68\") " Mar 12 17:09:34 crc kubenswrapper[5184]: I0312 17:09:34.741649 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/001b7f9f-058c-4037-af26-b94505164a68-scripts\") pod \"001b7f9f-058c-4037-af26-b94505164a68\" (UID: \"001b7f9f-058c-4037-af26-b94505164a68\") " Mar 12 17:09:34 crc kubenswrapper[5184]: I0312 17:09:34.741675 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/001b7f9f-058c-4037-af26-b94505164a68-config-data\") pod \"001b7f9f-058c-4037-af26-b94505164a68\" (UID: \"001b7f9f-058c-4037-af26-b94505164a68\") " Mar 12 17:09:34 crc kubenswrapper[5184]: I0312 17:09:34.741862 5184 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-kkrp4\" (UniqueName: \"kubernetes.io/projected/001b7f9f-058c-4037-af26-b94505164a68-kube-api-access-kkrp4\") pod \"001b7f9f-058c-4037-af26-b94505164a68\" (UID: \"001b7f9f-058c-4037-af26-b94505164a68\") " Mar 12 17:09:34 crc kubenswrapper[5184]: I0312 17:09:34.742060 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/001b7f9f-058c-4037-af26-b94505164a68-etc-machine-id\") pod \"001b7f9f-058c-4037-af26-b94505164a68\" (UID: \"001b7f9f-058c-4037-af26-b94505164a68\") " Mar 12 17:09:34 crc kubenswrapper[5184]: I0312 17:09:34.742281 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/001b7f9f-058c-4037-af26-b94505164a68-config-data-custom\") pod \"001b7f9f-058c-4037-af26-b94505164a68\" (UID: \"001b7f9f-058c-4037-af26-b94505164a68\") " Mar 12 17:09:34 crc kubenswrapper[5184]: I0312 17:09:34.742291 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/001b7f9f-058c-4037-af26-b94505164a68-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "001b7f9f-058c-4037-af26-b94505164a68" (UID: "001b7f9f-058c-4037-af26-b94505164a68"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 12 17:09:34 crc kubenswrapper[5184]: I0312 17:09:34.743047 5184 reconciler_common.go:299] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/001b7f9f-058c-4037-af26-b94505164a68-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:34 crc kubenswrapper[5184]: I0312 17:09:34.747058 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/001b7f9f-058c-4037-af26-b94505164a68-scripts" (OuterVolumeSpecName: "scripts") pod "001b7f9f-058c-4037-af26-b94505164a68" (UID: "001b7f9f-058c-4037-af26-b94505164a68"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:34 crc kubenswrapper[5184]: I0312 17:09:34.747217 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/001b7f9f-058c-4037-af26-b94505164a68-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "001b7f9f-058c-4037-af26-b94505164a68" (UID: "001b7f9f-058c-4037-af26-b94505164a68"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:34 crc kubenswrapper[5184]: I0312 17:09:34.770277 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/001b7f9f-058c-4037-af26-b94505164a68-kube-api-access-kkrp4" (OuterVolumeSpecName: "kube-api-access-kkrp4") pod "001b7f9f-058c-4037-af26-b94505164a68" (UID: "001b7f9f-058c-4037-af26-b94505164a68"). InnerVolumeSpecName "kube-api-access-kkrp4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:09:34 crc kubenswrapper[5184]: I0312 17:09:34.780467 5184 generic.go:358] "Generic (PLEG): container finished" podID="0765e7b4-b879-4989-8a31-486408b9cdce" containerID="3b99a04cd7775019fa369700f675593f34d3fef966fdf724f5a08caef1917514" exitCode=0 Mar 12 17:09:34 crc kubenswrapper[5184]: I0312 17:09:34.780865 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b69fb49f7-dmgsd" event={"ID":"0765e7b4-b879-4989-8a31-486408b9cdce","Type":"ContainerDied","Data":"3b99a04cd7775019fa369700f675593f34d3fef966fdf724f5a08caef1917514"} Mar 12 17:09:34 crc kubenswrapper[5184]: I0312 17:09:34.787095 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"001b7f9f-058c-4037-af26-b94505164a68","Type":"ContainerDied","Data":"658ac3911f7f4392f677acb46f061cdbdbba7cb6b828648abc255de57688b3be"} Mar 12 17:09:34 crc kubenswrapper[5184]: I0312 17:09:34.787162 5184 scope.go:117] "RemoveContainer" containerID="c5d635b48f08f2105dfb09e88281a82178fa74ec722a9a19b0a1c94daaff8563" Mar 12 17:09:34 crc kubenswrapper[5184]: I0312 17:09:34.787299 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 12 17:09:34 crc kubenswrapper[5184]: E0312 17:09:34.801922 5184 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0ce447b_3a61_4c39_9f58_12985dbdb754.slice/crio-1536506db9dbcf87c5a05b0c49a09247ce101e7244e37a69f1b6dc35a4b9d026.scope\": RecentStats: unable to find data in memory cache]" Mar 12 17:09:34 crc kubenswrapper[5184]: I0312 17:09:34.804604 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/001b7f9f-058c-4037-af26-b94505164a68-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "001b7f9f-058c-4037-af26-b94505164a68" (UID: "001b7f9f-058c-4037-af26-b94505164a68"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:34 crc kubenswrapper[5184]: I0312 17:09:34.844783 5184 reconciler_common.go:299] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/001b7f9f-058c-4037-af26-b94505164a68-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:34 crc kubenswrapper[5184]: I0312 17:09:34.844812 5184 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/001b7f9f-058c-4037-af26-b94505164a68-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:34 crc kubenswrapper[5184]: I0312 17:09:34.844821 5184 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/001b7f9f-058c-4037-af26-b94505164a68-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:34 crc kubenswrapper[5184]: I0312 17:09:34.844830 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kkrp4\" (UniqueName: \"kubernetes.io/projected/001b7f9f-058c-4037-af26-b94505164a68-kube-api-access-kkrp4\") on node \"crc\" DevicePath 
\"\"" Mar 12 17:09:34 crc kubenswrapper[5184]: I0312 17:09:34.855125 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/001b7f9f-058c-4037-af26-b94505164a68-config-data" (OuterVolumeSpecName: "config-data") pod "001b7f9f-058c-4037-af26-b94505164a68" (UID: "001b7f9f-058c-4037-af26-b94505164a68"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:34 crc kubenswrapper[5184]: I0312 17:09:34.901125 5184 scope.go:117] "RemoveContainer" containerID="98670b700c08ff6f9a4c2a47b41060b4256ca035570ab99ec132b7640d7f18ce" Mar 12 17:09:34 crc kubenswrapper[5184]: I0312 17:09:34.984033 5184 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/001b7f9f-058c-4037-af26-b94505164a68-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:35 crc kubenswrapper[5184]: I0312 17:09:35.128160 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 12 17:09:35 crc kubenswrapper[5184]: I0312 17:09:35.138397 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 12 17:09:35 crc kubenswrapper[5184]: I0312 17:09:35.216789 5184 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5865f6c4b6-r8frb" podUID="f0ce447b-3a61-4c39-9f58-12985dbdb754" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": dial tcp 10.217.0.163:9311: connect: connection refused" Mar 12 17:09:36 crc kubenswrapper[5184]: I0312 17:09:36.485478 5184 prober.go:120] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-5zvch" podUID="af87b4e5-15c0-48dc-9bc3-df39fcc24a53" containerName="hostpath-provisioner" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 17:09:37 crc kubenswrapper[5184]: I0312 17:09:37.419365 5184 prober.go:120] "Probe failed" 
probeType="Readiness" pod="openstack/barbican-api-5865f6c4b6-r8frb" podUID="f0ce447b-3a61-4c39-9f58-12985dbdb754" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": dial tcp 10.217.0.163:9311: connect: connection refused" Mar 12 17:09:37 crc kubenswrapper[5184]: I0312 17:09:37.579240 5184 generic.go:358] "Generic (PLEG): container finished" podID="f0ce447b-3a61-4c39-9f58-12985dbdb754" containerID="1536506db9dbcf87c5a05b0c49a09247ce101e7244e37a69f1b6dc35a4b9d026" exitCode=0 Mar 12 17:09:37 crc kubenswrapper[5184]: I0312 17:09:37.992751 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5865f6c4b6-r8frb" Mar 12 17:09:38 crc kubenswrapper[5184]: I0312 17:09:38.032564 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0ce447b-3a61-4c39-9f58-12985dbdb754-logs\") pod \"f0ce447b-3a61-4c39-9f58-12985dbdb754\" (UID: \"f0ce447b-3a61-4c39-9f58-12985dbdb754\") " Mar 12 17:09:38 crc kubenswrapper[5184]: I0312 17:09:38.032736 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0ce447b-3a61-4c39-9f58-12985dbdb754-combined-ca-bundle\") pod \"f0ce447b-3a61-4c39-9f58-12985dbdb754\" (UID: \"f0ce447b-3a61-4c39-9f58-12985dbdb754\") " Mar 12 17:09:38 crc kubenswrapper[5184]: I0312 17:09:38.032800 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0ce447b-3a61-4c39-9f58-12985dbdb754-config-data\") pod \"f0ce447b-3a61-4c39-9f58-12985dbdb754\" (UID: \"f0ce447b-3a61-4c39-9f58-12985dbdb754\") " Mar 12 17:09:38 crc kubenswrapper[5184]: I0312 17:09:38.032828 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/f0ce447b-3a61-4c39-9f58-12985dbdb754-config-data-custom\") pod \"f0ce447b-3a61-4c39-9f58-12985dbdb754\" (UID: \"f0ce447b-3a61-4c39-9f58-12985dbdb754\") " Mar 12 17:09:38 crc kubenswrapper[5184]: I0312 17:09:38.032973 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgt9x\" (UniqueName: \"kubernetes.io/projected/f0ce447b-3a61-4c39-9f58-12985dbdb754-kube-api-access-bgt9x\") pod \"f0ce447b-3a61-4c39-9f58-12985dbdb754\" (UID: \"f0ce447b-3a61-4c39-9f58-12985dbdb754\") " Mar 12 17:09:38 crc kubenswrapper[5184]: I0312 17:09:38.033329 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0ce447b-3a61-4c39-9f58-12985dbdb754-logs" (OuterVolumeSpecName: "logs") pod "f0ce447b-3a61-4c39-9f58-12985dbdb754" (UID: "f0ce447b-3a61-4c39-9f58-12985dbdb754"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:09:38 crc kubenswrapper[5184]: I0312 17:09:38.033682 5184 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0ce447b-3a61-4c39-9f58-12985dbdb754-logs\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:38 crc kubenswrapper[5184]: I0312 17:09:38.047556 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0ce447b-3a61-4c39-9f58-12985dbdb754-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f0ce447b-3a61-4c39-9f58-12985dbdb754" (UID: "f0ce447b-3a61-4c39-9f58-12985dbdb754"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:38 crc kubenswrapper[5184]: I0312 17:09:38.051603 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0ce447b-3a61-4c39-9f58-12985dbdb754-kube-api-access-bgt9x" (OuterVolumeSpecName: "kube-api-access-bgt9x") pod "f0ce447b-3a61-4c39-9f58-12985dbdb754" (UID: "f0ce447b-3a61-4c39-9f58-12985dbdb754"). InnerVolumeSpecName "kube-api-access-bgt9x". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:09:38 crc kubenswrapper[5184]: I0312 17:09:38.075549 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0ce447b-3a61-4c39-9f58-12985dbdb754-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0ce447b-3a61-4c39-9f58-12985dbdb754" (UID: "f0ce447b-3a61-4c39-9f58-12985dbdb754"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:38 crc kubenswrapper[5184]: I0312 17:09:38.116364 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0ce447b-3a61-4c39-9f58-12985dbdb754-config-data" (OuterVolumeSpecName: "config-data") pod "f0ce447b-3a61-4c39-9f58-12985dbdb754" (UID: "f0ce447b-3a61-4c39-9f58-12985dbdb754"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:38 crc kubenswrapper[5184]: I0312 17:09:38.135795 5184 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0ce447b-3a61-4c39-9f58-12985dbdb754-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:38 crc kubenswrapper[5184]: I0312 17:09:38.135839 5184 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0ce447b-3a61-4c39-9f58-12985dbdb754-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:38 crc kubenswrapper[5184]: I0312 17:09:38.135851 5184 reconciler_common.go:299] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f0ce447b-3a61-4c39-9f58-12985dbdb754-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:38 crc kubenswrapper[5184]: I0312 17:09:38.135863 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bgt9x\" (UniqueName: \"kubernetes.io/projected/f0ce447b-3a61-4c39-9f58-12985dbdb754-kube-api-access-bgt9x\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:38 crc kubenswrapper[5184]: I0312 17:09:38.615674 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5865f6c4b6-r8frb" Mar 12 17:09:38 crc kubenswrapper[5184]: I0312 17:09:38.714757 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/barbican-api-5865f6c4b6-r8frb" Mar 12 17:09:38 crc kubenswrapper[5184]: I0312 17:09:38.714800 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 12 17:09:38 crc kubenswrapper[5184]: I0312 17:09:38.716601 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="001b7f9f-058c-4037-af26-b94505164a68" containerName="probe" Mar 12 17:09:38 crc kubenswrapper[5184]: I0312 17:09:38.716640 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="001b7f9f-058c-4037-af26-b94505164a68" containerName="probe" Mar 12 17:09:38 crc kubenswrapper[5184]: I0312 17:09:38.716685 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="001b7f9f-058c-4037-af26-b94505164a68" containerName="cinder-scheduler" Mar 12 17:09:38 crc kubenswrapper[5184]: I0312 17:09:38.716701 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="001b7f9f-058c-4037-af26-b94505164a68" containerName="cinder-scheduler" Mar 12 17:09:38 crc kubenswrapper[5184]: I0312 17:09:38.716763 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f0ce447b-3a61-4c39-9f58-12985dbdb754" containerName="barbican-api-log" Mar 12 17:09:38 crc kubenswrapper[5184]: I0312 17:09:38.716792 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0ce447b-3a61-4c39-9f58-12985dbdb754" containerName="barbican-api-log" Mar 12 17:09:38 crc kubenswrapper[5184]: I0312 17:09:38.716815 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f0ce447b-3a61-4c39-9f58-12985dbdb754" containerName="barbican-api" Mar 12 17:09:38 crc kubenswrapper[5184]: I0312 17:09:38.716826 5184 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f0ce447b-3a61-4c39-9f58-12985dbdb754" containerName="barbican-api" Mar 12 17:09:38 crc kubenswrapper[5184]: I0312 17:09:38.717266 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="f0ce447b-3a61-4c39-9f58-12985dbdb754" containerName="barbican-api" Mar 12 17:09:38 crc kubenswrapper[5184]: I0312 17:09:38.717315 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="001b7f9f-058c-4037-af26-b94505164a68" containerName="cinder-scheduler" Mar 12 17:09:38 crc kubenswrapper[5184]: I0312 17:09:38.717354 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="001b7f9f-058c-4037-af26-b94505164a68" containerName="probe" Mar 12 17:09:38 crc kubenswrapper[5184]: I0312 17:09:38.717401 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="f0ce447b-3a61-4c39-9f58-12985dbdb754" containerName="barbican-api-log" Mar 12 17:09:39 crc kubenswrapper[5184]: I0312 17:09:39.228325 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5865f6c4b6-r8frb" event={"ID":"f0ce447b-3a61-4c39-9f58-12985dbdb754","Type":"ContainerDied","Data":"1536506db9dbcf87c5a05b0c49a09247ce101e7244e37a69f1b6dc35a4b9d026"} Mar 12 17:09:39 crc kubenswrapper[5184]: I0312 17:09:39.228816 5184 scope.go:117] "RemoveContainer" containerID="1536506db9dbcf87c5a05b0c49a09247ce101e7244e37a69f1b6dc35a4b9d026" Mar 12 17:09:39 crc kubenswrapper[5184]: I0312 17:09:39.230559 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-698bbfd847-dsp75" Mar 12 17:09:39 crc kubenswrapper[5184]: I0312 17:09:39.230788 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 12 17:09:39 crc kubenswrapper[5184]: I0312 17:09:39.238453 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cinder-scheduler-config-data\"" Mar 12 17:09:39 crc kubenswrapper[5184]: I0312 17:09:39.243969 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="001b7f9f-058c-4037-af26-b94505164a68" path="/var/lib/kubelet/pods/001b7f9f-058c-4037-af26-b94505164a68/volumes" Mar 12 17:09:39 crc kubenswrapper[5184]: I0312 17:09:39.244987 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 12 17:09:39 crc kubenswrapper[5184]: I0312 17:09:39.245021 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5865f6c4b6-r8frb" event={"ID":"f0ce447b-3a61-4c39-9f58-12985dbdb754","Type":"ContainerDied","Data":"6d31915f799213be7395d1bcdcd590a621310c15db9b6ec5a08aad01b18b8a17"} Mar 12 17:09:39 crc kubenswrapper[5184]: I0312 17:09:39.245056 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5865f6c4b6-r8frb"] Mar 12 17:09:39 crc kubenswrapper[5184]: I0312 17:09:39.245088 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5865f6c4b6-r8frb"] Mar 12 17:09:39 crc kubenswrapper[5184]: I0312 17:09:39.271056 5184 scope.go:117] "RemoveContainer" containerID="46acf120c50da21a921741f68d8f5af275ca388bec96793cfb62f30cf1f9541e" Mar 12 17:09:39 crc kubenswrapper[5184]: I0312 17:09:39.332656 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f28e94c-3d7e-4cfd-b230-f8939eb1e78f-scripts\") pod \"cinder-scheduler-0\" (UID: \"3f28e94c-3d7e-4cfd-b230-f8939eb1e78f\") " pod="openstack/cinder-scheduler-0" Mar 12 17:09:39 crc kubenswrapper[5184]: I0312 17:09:39.332901 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f28e94c-3d7e-4cfd-b230-f8939eb1e78f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3f28e94c-3d7e-4cfd-b230-f8939eb1e78f\") " pod="openstack/cinder-scheduler-0" Mar 12 17:09:39 crc kubenswrapper[5184]: I0312 17:09:39.333532 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f28e94c-3d7e-4cfd-b230-f8939eb1e78f-config-data\") pod \"cinder-scheduler-0\" (UID: \"3f28e94c-3d7e-4cfd-b230-f8939eb1e78f\") " pod="openstack/cinder-scheduler-0" Mar 12 17:09:39 crc kubenswrapper[5184]: I0312 17:09:39.333964 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lpmz\" (UniqueName: \"kubernetes.io/projected/3f28e94c-3d7e-4cfd-b230-f8939eb1e78f-kube-api-access-4lpmz\") pod \"cinder-scheduler-0\" (UID: \"3f28e94c-3d7e-4cfd-b230-f8939eb1e78f\") " pod="openstack/cinder-scheduler-0" Mar 12 17:09:39 crc kubenswrapper[5184]: I0312 17:09:39.334042 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f28e94c-3d7e-4cfd-b230-f8939eb1e78f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3f28e94c-3d7e-4cfd-b230-f8939eb1e78f\") " pod="openstack/cinder-scheduler-0" Mar 12 17:09:39 crc kubenswrapper[5184]: I0312 17:09:39.334749 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3f28e94c-3d7e-4cfd-b230-f8939eb1e78f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3f28e94c-3d7e-4cfd-b230-f8939eb1e78f\") " pod="openstack/cinder-scheduler-0" Mar 12 17:09:39 crc kubenswrapper[5184]: I0312 17:09:39.438018 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4lpmz\" (UniqueName: 
\"kubernetes.io/projected/3f28e94c-3d7e-4cfd-b230-f8939eb1e78f-kube-api-access-4lpmz\") pod \"cinder-scheduler-0\" (UID: \"3f28e94c-3d7e-4cfd-b230-f8939eb1e78f\") " pod="openstack/cinder-scheduler-0" Mar 12 17:09:39 crc kubenswrapper[5184]: I0312 17:09:39.438442 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f28e94c-3d7e-4cfd-b230-f8939eb1e78f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3f28e94c-3d7e-4cfd-b230-f8939eb1e78f\") " pod="openstack/cinder-scheduler-0" Mar 12 17:09:39 crc kubenswrapper[5184]: I0312 17:09:39.438620 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3f28e94c-3d7e-4cfd-b230-f8939eb1e78f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3f28e94c-3d7e-4cfd-b230-f8939eb1e78f\") " pod="openstack/cinder-scheduler-0" Mar 12 17:09:39 crc kubenswrapper[5184]: I0312 17:09:39.438772 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f28e94c-3d7e-4cfd-b230-f8939eb1e78f-scripts\") pod \"cinder-scheduler-0\" (UID: \"3f28e94c-3d7e-4cfd-b230-f8939eb1e78f\") " pod="openstack/cinder-scheduler-0" Mar 12 17:09:39 crc kubenswrapper[5184]: I0312 17:09:39.438900 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f28e94c-3d7e-4cfd-b230-f8939eb1e78f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3f28e94c-3d7e-4cfd-b230-f8939eb1e78f\") " pod="openstack/cinder-scheduler-0" Mar 12 17:09:39 crc kubenswrapper[5184]: I0312 17:09:39.439088 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f28e94c-3d7e-4cfd-b230-f8939eb1e78f-config-data\") pod \"cinder-scheduler-0\" (UID: \"3f28e94c-3d7e-4cfd-b230-f8939eb1e78f\") " 
pod="openstack/cinder-scheduler-0" Mar 12 17:09:39 crc kubenswrapper[5184]: I0312 17:09:39.440680 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3f28e94c-3d7e-4cfd-b230-f8939eb1e78f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3f28e94c-3d7e-4cfd-b230-f8939eb1e78f\") " pod="openstack/cinder-scheduler-0" Mar 12 17:09:39 crc kubenswrapper[5184]: I0312 17:09:39.447863 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f28e94c-3d7e-4cfd-b230-f8939eb1e78f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3f28e94c-3d7e-4cfd-b230-f8939eb1e78f\") " pod="openstack/cinder-scheduler-0" Mar 12 17:09:39 crc kubenswrapper[5184]: I0312 17:09:39.448329 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f28e94c-3d7e-4cfd-b230-f8939eb1e78f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3f28e94c-3d7e-4cfd-b230-f8939eb1e78f\") " pod="openstack/cinder-scheduler-0" Mar 12 17:09:39 crc kubenswrapper[5184]: I0312 17:09:39.448544 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f28e94c-3d7e-4cfd-b230-f8939eb1e78f-scripts\") pod \"cinder-scheduler-0\" (UID: \"3f28e94c-3d7e-4cfd-b230-f8939eb1e78f\") " pod="openstack/cinder-scheduler-0" Mar 12 17:09:39 crc kubenswrapper[5184]: I0312 17:09:39.452983 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f28e94c-3d7e-4cfd-b230-f8939eb1e78f-config-data\") pod \"cinder-scheduler-0\" (UID: \"3f28e94c-3d7e-4cfd-b230-f8939eb1e78f\") " pod="openstack/cinder-scheduler-0" Mar 12 17:09:39 crc kubenswrapper[5184]: I0312 17:09:39.459269 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lpmz\" (UniqueName: 
\"kubernetes.io/projected/3f28e94c-3d7e-4cfd-b230-f8939eb1e78f-kube-api-access-4lpmz\") pod \"cinder-scheduler-0\" (UID: \"3f28e94c-3d7e-4cfd-b230-f8939eb1e78f\") " pod="openstack/cinder-scheduler-0" Mar 12 17:09:39 crc kubenswrapper[5184]: I0312 17:09:39.476671 5184 scope.go:117] "RemoveContainer" containerID="1536506db9dbcf87c5a05b0c49a09247ce101e7244e37a69f1b6dc35a4b9d026" Mar 12 17:09:39 crc kubenswrapper[5184]: E0312 17:09:39.477051 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1536506db9dbcf87c5a05b0c49a09247ce101e7244e37a69f1b6dc35a4b9d026\": container with ID starting with 1536506db9dbcf87c5a05b0c49a09247ce101e7244e37a69f1b6dc35a4b9d026 not found: ID does not exist" containerID="1536506db9dbcf87c5a05b0c49a09247ce101e7244e37a69f1b6dc35a4b9d026" Mar 12 17:09:39 crc kubenswrapper[5184]: I0312 17:09:39.477092 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1536506db9dbcf87c5a05b0c49a09247ce101e7244e37a69f1b6dc35a4b9d026"} err="failed to get container status \"1536506db9dbcf87c5a05b0c49a09247ce101e7244e37a69f1b6dc35a4b9d026\": rpc error: code = NotFound desc = could not find container \"1536506db9dbcf87c5a05b0c49a09247ce101e7244e37a69f1b6dc35a4b9d026\": container with ID starting with 1536506db9dbcf87c5a05b0c49a09247ce101e7244e37a69f1b6dc35a4b9d026 not found: ID does not exist" Mar 12 17:09:39 crc kubenswrapper[5184]: I0312 17:09:39.477119 5184 scope.go:117] "RemoveContainer" containerID="46acf120c50da21a921741f68d8f5af275ca388bec96793cfb62f30cf1f9541e" Mar 12 17:09:39 crc kubenswrapper[5184]: E0312 17:09:39.477442 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46acf120c50da21a921741f68d8f5af275ca388bec96793cfb62f30cf1f9541e\": container with ID starting with 46acf120c50da21a921741f68d8f5af275ca388bec96793cfb62f30cf1f9541e not found: ID does not 
exist" containerID="46acf120c50da21a921741f68d8f5af275ca388bec96793cfb62f30cf1f9541e" Mar 12 17:09:39 crc kubenswrapper[5184]: I0312 17:09:39.477472 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46acf120c50da21a921741f68d8f5af275ca388bec96793cfb62f30cf1f9541e"} err="failed to get container status \"46acf120c50da21a921741f68d8f5af275ca388bec96793cfb62f30cf1f9541e\": rpc error: code = NotFound desc = could not find container \"46acf120c50da21a921741f68d8f5af275ca388bec96793cfb62f30cf1f9541e\": container with ID starting with 46acf120c50da21a921741f68d8f5af275ca388bec96793cfb62f30cf1f9541e not found: ID does not exist" Mar 12 17:09:39 crc kubenswrapper[5184]: I0312 17:09:39.555328 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 12 17:09:39 crc kubenswrapper[5184]: I0312 17:09:39.640088 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5b65568c5b-pr7s4" event={"ID":"d77574d5-b3c7-434b-8499-c38b3e2886e8","Type":"ContainerStarted","Data":"45e0f4689110d197ef731dfedc02c3d938945928624e3428962c508e87ebc777"} Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.010145 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.060360 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqtqq\" (UniqueName: \"kubernetes.io/projected/e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1-kube-api-access-pqtqq\") pod \"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1\" (UID: \"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1\") " Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.060456 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1-config-data\") pod \"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1\" (UID: \"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1\") " Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.060537 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1-combined-ca-bundle\") pod \"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1\" (UID: \"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1\") " Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.060582 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1-scripts\") pod \"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1\" (UID: \"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1\") " Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.060683 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1-sg-core-conf-yaml\") pod \"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1\" (UID: \"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1\") " Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.060856 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1-run-httpd\") pod \"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1\" (UID: \"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1\") " Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.060881 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1-log-httpd\") pod \"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1\" (UID: \"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1\") " Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.062339 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1" (UID: "e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.065642 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1" (UID: "e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.072360 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1-scripts" (OuterVolumeSpecName: "scripts") pod "e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1" (UID: "e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.103592 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1-kube-api-access-pqtqq" (OuterVolumeSpecName: "kube-api-access-pqtqq") pod "e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1" (UID: "e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1"). InnerVolumeSpecName "kube-api-access-pqtqq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.161602 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b69fb49f7-dmgsd" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.167191 5184 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.167408 5184 reconciler_common.go:299] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.167493 5184 reconciler_common.go:299] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.167555 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pqtqq\" (UniqueName: \"kubernetes.io/projected/e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1-kube-api-access-pqtqq\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.209529 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1-sg-core-conf-yaml" (OuterVolumeSpecName: 
"sg-core-conf-yaml") pod "e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1" (UID: "e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.227528 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1-config-data" (OuterVolumeSpecName: "config-data") pod "e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1" (UID: "e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.244491 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1" (UID: "e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:40 crc kubenswrapper[5184]: W0312 17:09:40.258686 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f28e94c_3d7e_4cfd_b230_f8939eb1e78f.slice/crio-4f0c89c498b2a86cc4588950e70896d35f28c27654b59a8749e8452fce8fdb49 WatchSource:0}: Error finding container 4f0c89c498b2a86cc4588950e70896d35f28c27654b59a8749e8452fce8fdb49: Status 404 returned error can't find the container with id 4f0c89c498b2a86cc4588950e70896d35f28c27654b59a8749e8452fce8fdb49 Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.264930 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.268417 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0765e7b4-b879-4989-8a31-486408b9cdce-public-tls-certs\") pod \"0765e7b4-b879-4989-8a31-486408b9cdce\" (UID: \"0765e7b4-b879-4989-8a31-486408b9cdce\") " Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.268583 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0765e7b4-b879-4989-8a31-486408b9cdce-internal-tls-certs\") pod \"0765e7b4-b879-4989-8a31-486408b9cdce\" (UID: \"0765e7b4-b879-4989-8a31-486408b9cdce\") " Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.268740 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0765e7b4-b879-4989-8a31-486408b9cdce-ovndb-tls-certs\") pod \"0765e7b4-b879-4989-8a31-486408b9cdce\" (UID: \"0765e7b4-b879-4989-8a31-486408b9cdce\") " Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.268802 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7r9wq\" (UniqueName: 
\"kubernetes.io/projected/0765e7b4-b879-4989-8a31-486408b9cdce-kube-api-access-7r9wq\") pod \"0765e7b4-b879-4989-8a31-486408b9cdce\" (UID: \"0765e7b4-b879-4989-8a31-486408b9cdce\") " Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.268848 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0765e7b4-b879-4989-8a31-486408b9cdce-config\") pod \"0765e7b4-b879-4989-8a31-486408b9cdce\" (UID: \"0765e7b4-b879-4989-8a31-486408b9cdce\") " Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.268914 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0765e7b4-b879-4989-8a31-486408b9cdce-combined-ca-bundle\") pod \"0765e7b4-b879-4989-8a31-486408b9cdce\" (UID: \"0765e7b4-b879-4989-8a31-486408b9cdce\") " Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.268948 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0765e7b4-b879-4989-8a31-486408b9cdce-httpd-config\") pod \"0765e7b4-b879-4989-8a31-486408b9cdce\" (UID: \"0765e7b4-b879-4989-8a31-486408b9cdce\") " Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.269248 5184 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.269264 5184 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.269273 5184 reconciler_common.go:299] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1-sg-core-conf-yaml\") on 
node \"crc\" DevicePath \"\"" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.274501 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0765e7b4-b879-4989-8a31-486408b9cdce-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "0765e7b4-b879-4989-8a31-486408b9cdce" (UID: "0765e7b4-b879-4989-8a31-486408b9cdce"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.278536 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0765e7b4-b879-4989-8a31-486408b9cdce-kube-api-access-7r9wq" (OuterVolumeSpecName: "kube-api-access-7r9wq") pod "0765e7b4-b879-4989-8a31-486408b9cdce" (UID: "0765e7b4-b879-4989-8a31-486408b9cdce"). InnerVolumeSpecName "kube-api-access-7r9wq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.322705 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0765e7b4-b879-4989-8a31-486408b9cdce-config" (OuterVolumeSpecName: "config") pod "0765e7b4-b879-4989-8a31-486408b9cdce" (UID: "0765e7b4-b879-4989-8a31-486408b9cdce"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.323218 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0765e7b4-b879-4989-8a31-486408b9cdce-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0765e7b4-b879-4989-8a31-486408b9cdce" (UID: "0765e7b4-b879-4989-8a31-486408b9cdce"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.332869 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0765e7b4-b879-4989-8a31-486408b9cdce-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0765e7b4-b879-4989-8a31-486408b9cdce" (UID: "0765e7b4-b879-4989-8a31-486408b9cdce"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.349163 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0765e7b4-b879-4989-8a31-486408b9cdce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0765e7b4-b879-4989-8a31-486408b9cdce" (UID: "0765e7b4-b879-4989-8a31-486408b9cdce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.356179 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0765e7b4-b879-4989-8a31-486408b9cdce-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "0765e7b4-b879-4989-8a31-486408b9cdce" (UID: "0765e7b4-b879-4989-8a31-486408b9cdce"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.370969 5184 reconciler_common.go:299] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0765e7b4-b879-4989-8a31-486408b9cdce-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.371178 5184 reconciler_common.go:299] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0765e7b4-b879-4989-8a31-486408b9cdce-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.371244 5184 reconciler_common.go:299] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0765e7b4-b879-4989-8a31-486408b9cdce-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.371299 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7r9wq\" (UniqueName: \"kubernetes.io/projected/0765e7b4-b879-4989-8a31-486408b9cdce-kube-api-access-7r9wq\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.371371 5184 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0765e7b4-b879-4989-8a31-486408b9cdce-config\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.371462 5184 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0765e7b4-b879-4989-8a31-486408b9cdce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.371523 5184 reconciler_common.go:299] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0765e7b4-b879-4989-8a31-486408b9cdce-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.415839 5184 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0ce447b-3a61-4c39-9f58-12985dbdb754" path="/var/lib/kubelet/pods/f0ce447b-3a61-4c39-9f58-12985dbdb754/volumes" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.651905 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3f28e94c-3d7e-4cfd-b230-f8939eb1e78f","Type":"ContainerStarted","Data":"4f0c89c498b2a86cc4588950e70896d35f28c27654b59a8749e8452fce8fdb49"} Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.656017 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b69fb49f7-dmgsd" event={"ID":"0765e7b4-b879-4989-8a31-486408b9cdce","Type":"ContainerDied","Data":"02651e41906adbf9399ea4315baee55d5d8dc38b6405087e216eadb03fdfa330"} Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.656030 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b69fb49f7-dmgsd" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.656050 5184 scope.go:117] "RemoveContainer" containerID="57ce9289747a13102e2b8b4676fe53e2205ceb948f0161b7a051a28d12aeb302" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.661037 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5b65568c5b-pr7s4" event={"ID":"d77574d5-b3c7-434b-8499-c38b3e2886e8","Type":"ContainerStarted","Data":"51936cefc74000af3ed7c5a7aeba320150ae2839866b1f44f50b8b1cd70bfaba"} Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.661802 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/placement-5b65568c5b-pr7s4" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.661842 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/placement-5b65568c5b-pr7s4" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.678729 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.678727 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1","Type":"ContainerDied","Data":"feb72b544bc8f663ca8265f1fa1967854078871065d0ee851bc7a90413f9e18a"} Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.682073 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b69fb49f7-dmgsd"] Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.690831 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-b69fb49f7-dmgsd"] Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.707766 5184 scope.go:117] "RemoveContainer" containerID="3b99a04cd7775019fa369700f675593f34d3fef966fdf724f5a08caef1917514" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.720451 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5b65568c5b-pr7s4" podStartSLOduration=9.720430327 podStartE2EDuration="9.720430327s" podCreationTimestamp="2026-03-12 17:09:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:09:40.705070305 +0000 UTC m=+1123.246381644" watchObservedRunningTime="2026-03-12 17:09:40.720430327 +0000 UTC m=+1123.261741666" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.727740 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.751562 5184 scope.go:117] "RemoveContainer" containerID="e1931efe8ab98f45a77d4e839d5f5c8986c94bac3ecf66644a727ca2ff79c21c" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.757365 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.768729 5184 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["openstack/ceilometer-0"] Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.769749 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1" containerName="sg-core" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.769771 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1" containerName="sg-core" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.769798 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0765e7b4-b879-4989-8a31-486408b9cdce" containerName="neutron-httpd" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.769807 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="0765e7b4-b879-4989-8a31-486408b9cdce" containerName="neutron-httpd" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.769823 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0765e7b4-b879-4989-8a31-486408b9cdce" containerName="neutron-api" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.769830 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="0765e7b4-b879-4989-8a31-486408b9cdce" containerName="neutron-api" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.769851 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1" containerName="ceilometer-notification-agent" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.769858 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1" containerName="ceilometer-notification-agent" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.769887 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1" containerName="proxy-httpd" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.769893 5184 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1" containerName="proxy-httpd" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.769906 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1" containerName="ceilometer-central-agent" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.769912 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1" containerName="ceilometer-central-agent" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.770096 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="0765e7b4-b879-4989-8a31-486408b9cdce" containerName="neutron-api" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.770113 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="0765e7b4-b879-4989-8a31-486408b9cdce" containerName="neutron-httpd" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.770125 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1" containerName="proxy-httpd" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.770135 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1" containerName="ceilometer-notification-agent" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.770148 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1" containerName="ceilometer-central-agent" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.770156 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1" containerName="sg-core" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.781067 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.781406 5184 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.785494 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ceilometer-scripts\"" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.787790 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ceilometer-config-data\"" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.811737 5184 scope.go:117] "RemoveContainer" containerID="95f938870d1a66767fdd5c7147c287517381b4b5d229780159a9e039c953ef07" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.846209 5184 scope.go:117] "RemoveContainer" containerID="943e06caa8e5d76068f507ba5cf5ebb9180ad92638b32bfce25c70ab2ff245f1" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.882325 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecc85f3c-6fc0-4331-bdbc-e457308457f8-config-data\") pod \"ceilometer-0\" (UID: \"ecc85f3c-6fc0-4331-bdbc-e457308457f8\") " pod="openstack/ceilometer-0" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.882503 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecc85f3c-6fc0-4331-bdbc-e457308457f8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ecc85f3c-6fc0-4331-bdbc-e457308457f8\") " pod="openstack/ceilometer-0" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.882531 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tck5f\" (UniqueName: \"kubernetes.io/projected/ecc85f3c-6fc0-4331-bdbc-e457308457f8-kube-api-access-tck5f\") pod \"ceilometer-0\" (UID: \"ecc85f3c-6fc0-4331-bdbc-e457308457f8\") " pod="openstack/ceilometer-0" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.882551 5184 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ecc85f3c-6fc0-4331-bdbc-e457308457f8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ecc85f3c-6fc0-4331-bdbc-e457308457f8\") " pod="openstack/ceilometer-0" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.882580 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecc85f3c-6fc0-4331-bdbc-e457308457f8-scripts\") pod \"ceilometer-0\" (UID: \"ecc85f3c-6fc0-4331-bdbc-e457308457f8\") " pod="openstack/ceilometer-0" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.882651 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecc85f3c-6fc0-4331-bdbc-e457308457f8-log-httpd\") pod \"ceilometer-0\" (UID: \"ecc85f3c-6fc0-4331-bdbc-e457308457f8\") " pod="openstack/ceilometer-0" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.882677 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecc85f3c-6fc0-4331-bdbc-e457308457f8-run-httpd\") pod \"ceilometer-0\" (UID: \"ecc85f3c-6fc0-4331-bdbc-e457308457f8\") " pod="openstack/ceilometer-0" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.883917 5184 scope.go:117] "RemoveContainer" containerID="229c265b683e912c620e9e0f730acb9df2d8475e613170a912a122643740e27e" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.984505 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecc85f3c-6fc0-4331-bdbc-e457308457f8-scripts\") pod \"ceilometer-0\" (UID: \"ecc85f3c-6fc0-4331-bdbc-e457308457f8\") " pod="openstack/ceilometer-0" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.984599 5184 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecc85f3c-6fc0-4331-bdbc-e457308457f8-log-httpd\") pod \"ceilometer-0\" (UID: \"ecc85f3c-6fc0-4331-bdbc-e457308457f8\") " pod="openstack/ceilometer-0" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.984632 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecc85f3c-6fc0-4331-bdbc-e457308457f8-run-httpd\") pod \"ceilometer-0\" (UID: \"ecc85f3c-6fc0-4331-bdbc-e457308457f8\") " pod="openstack/ceilometer-0" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.984693 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecc85f3c-6fc0-4331-bdbc-e457308457f8-config-data\") pod \"ceilometer-0\" (UID: \"ecc85f3c-6fc0-4331-bdbc-e457308457f8\") " pod="openstack/ceilometer-0" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.984797 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecc85f3c-6fc0-4331-bdbc-e457308457f8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ecc85f3c-6fc0-4331-bdbc-e457308457f8\") " pod="openstack/ceilometer-0" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.984827 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tck5f\" (UniqueName: \"kubernetes.io/projected/ecc85f3c-6fc0-4331-bdbc-e457308457f8-kube-api-access-tck5f\") pod \"ceilometer-0\" (UID: \"ecc85f3c-6fc0-4331-bdbc-e457308457f8\") " pod="openstack/ceilometer-0" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.984849 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ecc85f3c-6fc0-4331-bdbc-e457308457f8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"ecc85f3c-6fc0-4331-bdbc-e457308457f8\") " pod="openstack/ceilometer-0" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.985156 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecc85f3c-6fc0-4331-bdbc-e457308457f8-log-httpd\") pod \"ceilometer-0\" (UID: \"ecc85f3c-6fc0-4331-bdbc-e457308457f8\") " pod="openstack/ceilometer-0" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.985417 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecc85f3c-6fc0-4331-bdbc-e457308457f8-run-httpd\") pod \"ceilometer-0\" (UID: \"ecc85f3c-6fc0-4331-bdbc-e457308457f8\") " pod="openstack/ceilometer-0" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.991338 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecc85f3c-6fc0-4331-bdbc-e457308457f8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ecc85f3c-6fc0-4331-bdbc-e457308457f8\") " pod="openstack/ceilometer-0" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.991516 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecc85f3c-6fc0-4331-bdbc-e457308457f8-scripts\") pod \"ceilometer-0\" (UID: \"ecc85f3c-6fc0-4331-bdbc-e457308457f8\") " pod="openstack/ceilometer-0" Mar 12 17:09:40 crc kubenswrapper[5184]: I0312 17:09:40.992976 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecc85f3c-6fc0-4331-bdbc-e457308457f8-config-data\") pod \"ceilometer-0\" (UID: \"ecc85f3c-6fc0-4331-bdbc-e457308457f8\") " pod="openstack/ceilometer-0" Mar 12 17:09:41 crc kubenswrapper[5184]: I0312 17:09:41.001162 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/ecc85f3c-6fc0-4331-bdbc-e457308457f8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ecc85f3c-6fc0-4331-bdbc-e457308457f8\") " pod="openstack/ceilometer-0" Mar 12 17:09:41 crc kubenswrapper[5184]: I0312 17:09:41.002834 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tck5f\" (UniqueName: \"kubernetes.io/projected/ecc85f3c-6fc0-4331-bdbc-e457308457f8-kube-api-access-tck5f\") pod \"ceilometer-0\" (UID: \"ecc85f3c-6fc0-4331-bdbc-e457308457f8\") " pod="openstack/ceilometer-0" Mar 12 17:09:41 crc kubenswrapper[5184]: I0312 17:09:41.109302 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 17:09:41 crc kubenswrapper[5184]: I0312 17:09:41.601226 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 17:09:41 crc kubenswrapper[5184]: W0312 17:09:41.608428 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podecc85f3c_6fc0_4331_bdbc_e457308457f8.slice/crio-7bdac34b2319c37699d076a94c4c93f0643336ca8aec3229409c86270a277b1a WatchSource:0}: Error finding container 7bdac34b2319c37699d076a94c4c93f0643336ca8aec3229409c86270a277b1a: Status 404 returned error can't find the container with id 7bdac34b2319c37699d076a94c4c93f0643336ca8aec3229409c86270a277b1a Mar 12 17:09:41 crc kubenswrapper[5184]: I0312 17:09:41.690186 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecc85f3c-6fc0-4331-bdbc-e457308457f8","Type":"ContainerStarted","Data":"7bdac34b2319c37699d076a94c4c93f0643336ca8aec3229409c86270a277b1a"} Mar 12 17:09:41 crc kubenswrapper[5184]: I0312 17:09:41.695863 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"3f28e94c-3d7e-4cfd-b230-f8939eb1e78f","Type":"ContainerStarted","Data":"6a07e49b19d8f4d0ddbdce7f15f78428412d6ad8d2da16a50c929e4c3286f086"} Mar 12 17:09:42 crc kubenswrapper[5184]: I0312 17:09:42.413614 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0765e7b4-b879-4989-8a31-486408b9cdce" path="/var/lib/kubelet/pods/0765e7b4-b879-4989-8a31-486408b9cdce/volumes" Mar 12 17:09:42 crc kubenswrapper[5184]: I0312 17:09:42.414934 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1" path="/var/lib/kubelet/pods/e5185f6d-1f8d-4f3c-a14a-33a1f21db0d1/volumes" Mar 12 17:09:42 crc kubenswrapper[5184]: I0312 17:09:42.708988 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3f28e94c-3d7e-4cfd-b230-f8939eb1e78f","Type":"ContainerStarted","Data":"98a7bee564adb70a2466aa8d8a5056f3beae6eb7a0c6437cc975317f24d5360e"} Mar 12 17:09:42 crc kubenswrapper[5184]: I0312 17:09:42.710566 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecc85f3c-6fc0-4331-bdbc-e457308457f8","Type":"ContainerStarted","Data":"3d025cfaf1e78876a5645d2d39ea1566a6aed546e79685bf9fe57906b864011b"} Mar 12 17:09:42 crc kubenswrapper[5184]: I0312 17:09:42.733317 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=7.733299765 podStartE2EDuration="7.733299765s" podCreationTimestamp="2026-03-12 17:09:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:09:42.728075181 +0000 UTC m=+1125.269386520" watchObservedRunningTime="2026-03-12 17:09:42.733299765 +0000 UTC m=+1125.274611104" Mar 12 17:09:43 crc kubenswrapper[5184]: I0312 17:09:43.099505 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 12 17:09:43 crc 
kubenswrapper[5184]: I0312 17:09:43.108526 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 12 17:09:43 crc kubenswrapper[5184]: I0312 17:09:43.113087 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 12 17:09:43 crc kubenswrapper[5184]: I0312 17:09:43.114540 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"openstack-config-secret\"" Mar 12 17:09:43 crc kubenswrapper[5184]: I0312 17:09:43.114815 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-config\"" Mar 12 17:09:43 crc kubenswrapper[5184]: I0312 17:09:43.114987 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"openstackclient-openstackclient-dockercfg-j4zhq\"" Mar 12 17:09:43 crc kubenswrapper[5184]: I0312 17:09:43.226995 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3c0d941e-36d1-4112-8488-a27d08ec0a8b-openstack-config-secret\") pod \"openstackclient\" (UID: \"3c0d941e-36d1-4112-8488-a27d08ec0a8b\") " pod="openstack/openstackclient" Mar 12 17:09:43 crc kubenswrapper[5184]: I0312 17:09:43.227190 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-629q9\" (UniqueName: \"kubernetes.io/projected/3c0d941e-36d1-4112-8488-a27d08ec0a8b-kube-api-access-629q9\") pod \"openstackclient\" (UID: \"3c0d941e-36d1-4112-8488-a27d08ec0a8b\") " pod="openstack/openstackclient" Mar 12 17:09:43 crc kubenswrapper[5184]: I0312 17:09:43.227457 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0d941e-36d1-4112-8488-a27d08ec0a8b-combined-ca-bundle\") pod \"openstackclient\" (UID: 
\"3c0d941e-36d1-4112-8488-a27d08ec0a8b\") " pod="openstack/openstackclient" Mar 12 17:09:43 crc kubenswrapper[5184]: I0312 17:09:43.227584 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3c0d941e-36d1-4112-8488-a27d08ec0a8b-openstack-config\") pod \"openstackclient\" (UID: \"3c0d941e-36d1-4112-8488-a27d08ec0a8b\") " pod="openstack/openstackclient" Mar 12 17:09:43 crc kubenswrapper[5184]: I0312 17:09:43.337211 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3c0d941e-36d1-4112-8488-a27d08ec0a8b-openstack-config-secret\") pod \"openstackclient\" (UID: \"3c0d941e-36d1-4112-8488-a27d08ec0a8b\") " pod="openstack/openstackclient" Mar 12 17:09:43 crc kubenswrapper[5184]: I0312 17:09:43.337292 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-629q9\" (UniqueName: \"kubernetes.io/projected/3c0d941e-36d1-4112-8488-a27d08ec0a8b-kube-api-access-629q9\") pod \"openstackclient\" (UID: \"3c0d941e-36d1-4112-8488-a27d08ec0a8b\") " pod="openstack/openstackclient" Mar 12 17:09:43 crc kubenswrapper[5184]: I0312 17:09:43.337363 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0d941e-36d1-4112-8488-a27d08ec0a8b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"3c0d941e-36d1-4112-8488-a27d08ec0a8b\") " pod="openstack/openstackclient" Mar 12 17:09:43 crc kubenswrapper[5184]: I0312 17:09:43.345484 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3c0d941e-36d1-4112-8488-a27d08ec0a8b-openstack-config\") pod \"openstackclient\" (UID: \"3c0d941e-36d1-4112-8488-a27d08ec0a8b\") " pod="openstack/openstackclient" Mar 12 17:09:43 crc kubenswrapper[5184]: I0312 
17:09:43.346590 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3c0d941e-36d1-4112-8488-a27d08ec0a8b-openstack-config\") pod \"openstackclient\" (UID: \"3c0d941e-36d1-4112-8488-a27d08ec0a8b\") " pod="openstack/openstackclient" Mar 12 17:09:43 crc kubenswrapper[5184]: I0312 17:09:43.360772 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3c0d941e-36d1-4112-8488-a27d08ec0a8b-openstack-config-secret\") pod \"openstackclient\" (UID: \"3c0d941e-36d1-4112-8488-a27d08ec0a8b\") " pod="openstack/openstackclient" Mar 12 17:09:43 crc kubenswrapper[5184]: I0312 17:09:43.361194 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-629q9\" (UniqueName: \"kubernetes.io/projected/3c0d941e-36d1-4112-8488-a27d08ec0a8b-kube-api-access-629q9\") pod \"openstackclient\" (UID: \"3c0d941e-36d1-4112-8488-a27d08ec0a8b\") " pod="openstack/openstackclient" Mar 12 17:09:43 crc kubenswrapper[5184]: I0312 17:09:43.369113 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0d941e-36d1-4112-8488-a27d08ec0a8b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"3c0d941e-36d1-4112-8488-a27d08ec0a8b\") " pod="openstack/openstackclient" Mar 12 17:09:43 crc kubenswrapper[5184]: I0312 17:09:43.429971 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 12 17:09:43 crc kubenswrapper[5184]: I0312 17:09:43.750145 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecc85f3c-6fc0-4331-bdbc-e457308457f8","Type":"ContainerStarted","Data":"14a1738d1a6cbc36e1384e9a0986984c6ae139030e46a34a115e241fff5a5871"} Mar 12 17:09:43 crc kubenswrapper[5184]: I0312 17:09:43.770090 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-756587dd69-bfms9"] Mar 12 17:09:43 crc kubenswrapper[5184]: I0312 17:09:43.813640 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-756587dd69-bfms9"] Mar 12 17:09:43 crc kubenswrapper[5184]: I0312 17:09:43.813797 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-756587dd69-bfms9" Mar 12 17:09:43 crc kubenswrapper[5184]: I0312 17:09:43.816637 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"swift-proxy-config-data\"" Mar 12 17:09:43 crc kubenswrapper[5184]: I0312 17:09:43.818658 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-swift-internal-svc\"" Mar 12 17:09:43 crc kubenswrapper[5184]: I0312 17:09:43.825442 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-swift-public-svc\"" Mar 12 17:09:43 crc kubenswrapper[5184]: I0312 17:09:43.939547 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 12 17:09:43 crc kubenswrapper[5184]: I0312 17:09:43.970060 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ac8b1de-edf1-4663-b45b-677f2cd049eb-config-data\") pod \"swift-proxy-756587dd69-bfms9\" (UID: \"4ac8b1de-edf1-4663-b45b-677f2cd049eb\") " pod="openstack/swift-proxy-756587dd69-bfms9" Mar 12 17:09:43 crc 
kubenswrapper[5184]: I0312 17:09:43.970142 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ac8b1de-edf1-4663-b45b-677f2cd049eb-public-tls-certs\") pod \"swift-proxy-756587dd69-bfms9\" (UID: \"4ac8b1de-edf1-4663-b45b-677f2cd049eb\") " pod="openstack/swift-proxy-756587dd69-bfms9" Mar 12 17:09:43 crc kubenswrapper[5184]: I0312 17:09:43.970197 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ac8b1de-edf1-4663-b45b-677f2cd049eb-run-httpd\") pod \"swift-proxy-756587dd69-bfms9\" (UID: \"4ac8b1de-edf1-4663-b45b-677f2cd049eb\") " pod="openstack/swift-proxy-756587dd69-bfms9" Mar 12 17:09:43 crc kubenswrapper[5184]: I0312 17:09:43.970224 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ac8b1de-edf1-4663-b45b-677f2cd049eb-log-httpd\") pod \"swift-proxy-756587dd69-bfms9\" (UID: \"4ac8b1de-edf1-4663-b45b-677f2cd049eb\") " pod="openstack/swift-proxy-756587dd69-bfms9" Mar 12 17:09:43 crc kubenswrapper[5184]: I0312 17:09:43.970273 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ac8b1de-edf1-4663-b45b-677f2cd049eb-combined-ca-bundle\") pod \"swift-proxy-756587dd69-bfms9\" (UID: \"4ac8b1de-edf1-4663-b45b-677f2cd049eb\") " pod="openstack/swift-proxy-756587dd69-bfms9" Mar 12 17:09:43 crc kubenswrapper[5184]: I0312 17:09:43.970295 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qllz\" (UniqueName: \"kubernetes.io/projected/4ac8b1de-edf1-4663-b45b-677f2cd049eb-kube-api-access-4qllz\") pod \"swift-proxy-756587dd69-bfms9\" (UID: \"4ac8b1de-edf1-4663-b45b-677f2cd049eb\") " 
pod="openstack/swift-proxy-756587dd69-bfms9" Mar 12 17:09:43 crc kubenswrapper[5184]: I0312 17:09:43.970310 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4ac8b1de-edf1-4663-b45b-677f2cd049eb-etc-swift\") pod \"swift-proxy-756587dd69-bfms9\" (UID: \"4ac8b1de-edf1-4663-b45b-677f2cd049eb\") " pod="openstack/swift-proxy-756587dd69-bfms9" Mar 12 17:09:43 crc kubenswrapper[5184]: I0312 17:09:43.970330 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ac8b1de-edf1-4663-b45b-677f2cd049eb-internal-tls-certs\") pod \"swift-proxy-756587dd69-bfms9\" (UID: \"4ac8b1de-edf1-4663-b45b-677f2cd049eb\") " pod="openstack/swift-proxy-756587dd69-bfms9" Mar 12 17:09:44 crc kubenswrapper[5184]: I0312 17:09:44.071681 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ac8b1de-edf1-4663-b45b-677f2cd049eb-combined-ca-bundle\") pod \"swift-proxy-756587dd69-bfms9\" (UID: \"4ac8b1de-edf1-4663-b45b-677f2cd049eb\") " pod="openstack/swift-proxy-756587dd69-bfms9" Mar 12 17:09:44 crc kubenswrapper[5184]: I0312 17:09:44.071726 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4qllz\" (UniqueName: \"kubernetes.io/projected/4ac8b1de-edf1-4663-b45b-677f2cd049eb-kube-api-access-4qllz\") pod \"swift-proxy-756587dd69-bfms9\" (UID: \"4ac8b1de-edf1-4663-b45b-677f2cd049eb\") " pod="openstack/swift-proxy-756587dd69-bfms9" Mar 12 17:09:44 crc kubenswrapper[5184]: I0312 17:09:44.071746 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4ac8b1de-edf1-4663-b45b-677f2cd049eb-etc-swift\") pod \"swift-proxy-756587dd69-bfms9\" (UID: \"4ac8b1de-edf1-4663-b45b-677f2cd049eb\") " 
pod="openstack/swift-proxy-756587dd69-bfms9" Mar 12 17:09:44 crc kubenswrapper[5184]: I0312 17:09:44.071779 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ac8b1de-edf1-4663-b45b-677f2cd049eb-internal-tls-certs\") pod \"swift-proxy-756587dd69-bfms9\" (UID: \"4ac8b1de-edf1-4663-b45b-677f2cd049eb\") " pod="openstack/swift-proxy-756587dd69-bfms9" Mar 12 17:09:44 crc kubenswrapper[5184]: I0312 17:09:44.071840 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ac8b1de-edf1-4663-b45b-677f2cd049eb-config-data\") pod \"swift-proxy-756587dd69-bfms9\" (UID: \"4ac8b1de-edf1-4663-b45b-677f2cd049eb\") " pod="openstack/swift-proxy-756587dd69-bfms9" Mar 12 17:09:44 crc kubenswrapper[5184]: I0312 17:09:44.071879 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ac8b1de-edf1-4663-b45b-677f2cd049eb-public-tls-certs\") pod \"swift-proxy-756587dd69-bfms9\" (UID: \"4ac8b1de-edf1-4663-b45b-677f2cd049eb\") " pod="openstack/swift-proxy-756587dd69-bfms9" Mar 12 17:09:44 crc kubenswrapper[5184]: I0312 17:09:44.071934 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ac8b1de-edf1-4663-b45b-677f2cd049eb-run-httpd\") pod \"swift-proxy-756587dd69-bfms9\" (UID: \"4ac8b1de-edf1-4663-b45b-677f2cd049eb\") " pod="openstack/swift-proxy-756587dd69-bfms9" Mar 12 17:09:44 crc kubenswrapper[5184]: I0312 17:09:44.071957 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ac8b1de-edf1-4663-b45b-677f2cd049eb-log-httpd\") pod \"swift-proxy-756587dd69-bfms9\" (UID: \"4ac8b1de-edf1-4663-b45b-677f2cd049eb\") " pod="openstack/swift-proxy-756587dd69-bfms9" Mar 12 17:09:44 crc 
kubenswrapper[5184]: I0312 17:09:44.072593 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ac8b1de-edf1-4663-b45b-677f2cd049eb-log-httpd\") pod \"swift-proxy-756587dd69-bfms9\" (UID: \"4ac8b1de-edf1-4663-b45b-677f2cd049eb\") " pod="openstack/swift-proxy-756587dd69-bfms9" Mar 12 17:09:44 crc kubenswrapper[5184]: I0312 17:09:44.075894 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ac8b1de-edf1-4663-b45b-677f2cd049eb-run-httpd\") pod \"swift-proxy-756587dd69-bfms9\" (UID: \"4ac8b1de-edf1-4663-b45b-677f2cd049eb\") " pod="openstack/swift-proxy-756587dd69-bfms9" Mar 12 17:09:44 crc kubenswrapper[5184]: I0312 17:09:44.080214 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ac8b1de-edf1-4663-b45b-677f2cd049eb-config-data\") pod \"swift-proxy-756587dd69-bfms9\" (UID: \"4ac8b1de-edf1-4663-b45b-677f2cd049eb\") " pod="openstack/swift-proxy-756587dd69-bfms9" Mar 12 17:09:44 crc kubenswrapper[5184]: I0312 17:09:44.080207 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ac8b1de-edf1-4663-b45b-677f2cd049eb-combined-ca-bundle\") pod \"swift-proxy-756587dd69-bfms9\" (UID: \"4ac8b1de-edf1-4663-b45b-677f2cd049eb\") " pod="openstack/swift-proxy-756587dd69-bfms9" Mar 12 17:09:44 crc kubenswrapper[5184]: I0312 17:09:44.080856 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ac8b1de-edf1-4663-b45b-677f2cd049eb-public-tls-certs\") pod \"swift-proxy-756587dd69-bfms9\" (UID: \"4ac8b1de-edf1-4663-b45b-677f2cd049eb\") " pod="openstack/swift-proxy-756587dd69-bfms9" Mar 12 17:09:44 crc kubenswrapper[5184]: I0312 17:09:44.081136 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/projected/4ac8b1de-edf1-4663-b45b-677f2cd049eb-etc-swift\") pod \"swift-proxy-756587dd69-bfms9\" (UID: \"4ac8b1de-edf1-4663-b45b-677f2cd049eb\") " pod="openstack/swift-proxy-756587dd69-bfms9" Mar 12 17:09:44 crc kubenswrapper[5184]: I0312 17:09:44.082187 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ac8b1de-edf1-4663-b45b-677f2cd049eb-internal-tls-certs\") pod \"swift-proxy-756587dd69-bfms9\" (UID: \"4ac8b1de-edf1-4663-b45b-677f2cd049eb\") " pod="openstack/swift-proxy-756587dd69-bfms9" Mar 12 17:09:44 crc kubenswrapper[5184]: I0312 17:09:44.097265 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qllz\" (UniqueName: \"kubernetes.io/projected/4ac8b1de-edf1-4663-b45b-677f2cd049eb-kube-api-access-4qllz\") pod \"swift-proxy-756587dd69-bfms9\" (UID: \"4ac8b1de-edf1-4663-b45b-677f2cd049eb\") " pod="openstack/swift-proxy-756587dd69-bfms9" Mar 12 17:09:44 crc kubenswrapper[5184]: I0312 17:09:44.149093 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-756587dd69-bfms9" Mar 12 17:09:44 crc kubenswrapper[5184]: I0312 17:09:44.557018 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 12 17:09:44 crc kubenswrapper[5184]: I0312 17:09:44.716096 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-756587dd69-bfms9"] Mar 12 17:09:44 crc kubenswrapper[5184]: I0312 17:09:44.770847 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"3c0d941e-36d1-4112-8488-a27d08ec0a8b","Type":"ContainerStarted","Data":"4f68f8ab6eca8bf1c7fb4d87239a1875453e4bc34d111edb8a29a56d1f9af463"} Mar 12 17:09:44 crc kubenswrapper[5184]: I0312 17:09:44.772663 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-756587dd69-bfms9" event={"ID":"4ac8b1de-edf1-4663-b45b-677f2cd049eb","Type":"ContainerStarted","Data":"74e128abcbf3abccf0c13f4b9ec016eb7283ed6127e421d3d95b9afcccb2930f"} Mar 12 17:09:44 crc kubenswrapper[5184]: I0312 17:09:44.776680 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecc85f3c-6fc0-4331-bdbc-e457308457f8","Type":"ContainerStarted","Data":"7e8ed75aaa4e450fa247ecad91f692cf667a4835a190d87ae475c3e0ba34889f"} Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.358610 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.540139 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-hmwnz"] Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.544740 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-hmwnz" Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.549461 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-hmwnz"] Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.643647 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-n4crd"] Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.650216 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-n4crd" Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.665730 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-n4crd"] Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.682460 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-api-26d7-account-create-update-ms2gh"] Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.690370 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-26d7-account-create-update-ms2gh" Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.692701 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-api-db-secret\"" Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.700007 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-26d7-account-create-update-ms2gh"] Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.713350 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4884f23-d147-46ea-a562-1a772dbd1c21-operator-scripts\") pod \"nova-api-db-create-hmwnz\" (UID: \"a4884f23-d147-46ea-a562-1a772dbd1c21\") " pod="openstack/nova-api-db-create-hmwnz" Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.713685 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clgwf\" (UniqueName: \"kubernetes.io/projected/a4884f23-d147-46ea-a562-1a772dbd1c21-kube-api-access-clgwf\") pod \"nova-api-db-create-hmwnz\" (UID: \"a4884f23-d147-46ea-a562-1a772dbd1c21\") " pod="openstack/nova-api-db-create-hmwnz" Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.801654 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-mlrgv"] Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.807740 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-756587dd69-bfms9" event={"ID":"4ac8b1de-edf1-4663-b45b-677f2cd049eb","Type":"ContainerStarted","Data":"3758f3e2898479dda4c0a65c513137d05c1c7d284e0454b868b737408a9424d2"} Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.808036 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-756587dd69-bfms9" 
event={"ID":"4ac8b1de-edf1-4663-b45b-677f2cd049eb","Type":"ContainerStarted","Data":"ac4817096e6b07d52d96a978b2f4e716ee20e02b77912edb66e5d78a93639288"} Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.808112 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/swift-proxy-756587dd69-bfms9" Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.808172 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/swift-proxy-756587dd69-bfms9" Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.808313 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-mlrgv" Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.815453 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clgwf\" (UniqueName: \"kubernetes.io/projected/a4884f23-d147-46ea-a562-1a772dbd1c21-kube-api-access-clgwf\") pod \"nova-api-db-create-hmwnz\" (UID: \"a4884f23-d147-46ea-a562-1a772dbd1c21\") " pod="openstack/nova-api-db-create-hmwnz" Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.835463 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-756587dd69-bfms9" podStartSLOduration=2.8354396729999998 podStartE2EDuration="2.835439673s" podCreationTimestamp="2026-03-12 17:09:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:09:45.818894443 +0000 UTC m=+1128.360205782" watchObservedRunningTime="2026-03-12 17:09:45.835439673 +0000 UTC m=+1128.376751012" Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.831148 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-mlrgv"] Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.835485 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/504f9845-df1d-48a7-badf-ea8ed99ff8a5-operator-scripts\") pod \"nova-api-26d7-account-create-update-ms2gh\" (UID: \"504f9845-df1d-48a7-badf-ea8ed99ff8a5\") " pod="openstack/nova-api-26d7-account-create-update-ms2gh" Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.838576 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4884f23-d147-46ea-a562-1a772dbd1c21-operator-scripts\") pod \"nova-api-db-create-hmwnz\" (UID: \"a4884f23-d147-46ea-a562-1a772dbd1c21\") " pod="openstack/nova-api-db-create-hmwnz" Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.838759 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06c6acd8-4187-4ca0-ba38-0035df2f3d0c-operator-scripts\") pod \"nova-cell0-db-create-n4crd\" (UID: \"06c6acd8-4187-4ca0-ba38-0035df2f3d0c\") " pod="openstack/nova-cell0-db-create-n4crd" Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.838781 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk2jw\" (UniqueName: \"kubernetes.io/projected/504f9845-df1d-48a7-badf-ea8ed99ff8a5-kube-api-access-vk2jw\") pod \"nova-api-26d7-account-create-update-ms2gh\" (UID: \"504f9845-df1d-48a7-badf-ea8ed99ff8a5\") " pod="openstack/nova-api-26d7-account-create-update-ms2gh" Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.838851 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z4v9\" (UniqueName: \"kubernetes.io/projected/06c6acd8-4187-4ca0-ba38-0035df2f3d0c-kube-api-access-9z4v9\") pod \"nova-cell0-db-create-n4crd\" (UID: \"06c6acd8-4187-4ca0-ba38-0035df2f3d0c\") " pod="openstack/nova-cell0-db-create-n4crd" Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 
17:09:45.839650 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4884f23-d147-46ea-a562-1a772dbd1c21-operator-scripts\") pod \"nova-api-db-create-hmwnz\" (UID: \"a4884f23-d147-46ea-a562-1a772dbd1c21\") " pod="openstack/nova-api-db-create-hmwnz" Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.864370 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-7e8d-account-create-update-bqsxc"] Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.865447 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-clgwf\" (UniqueName: \"kubernetes.io/projected/a4884f23-d147-46ea-a562-1a772dbd1c21-kube-api-access-clgwf\") pod \"nova-api-db-create-hmwnz\" (UID: \"a4884f23-d147-46ea-a562-1a772dbd1c21\") " pod="openstack/nova-api-db-create-hmwnz" Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.866971 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-hmwnz" Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.871675 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-7e8d-account-create-update-bqsxc" Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.874791 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell0-db-secret\"" Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.905589 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-7e8d-account-create-update-bqsxc"] Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.941279 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/504f9845-df1d-48a7-badf-ea8ed99ff8a5-operator-scripts\") pod \"nova-api-26d7-account-create-update-ms2gh\" (UID: \"504f9845-df1d-48a7-badf-ea8ed99ff8a5\") " pod="openstack/nova-api-26d7-account-create-update-ms2gh" Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.941612 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt67w\" (UniqueName: \"kubernetes.io/projected/1200cd1d-62f0-4aeb-b1b7-bb4db488e3ef-kube-api-access-jt67w\") pod \"nova-cell1-db-create-mlrgv\" (UID: \"1200cd1d-62f0-4aeb-b1b7-bb4db488e3ef\") " pod="openstack/nova-cell1-db-create-mlrgv" Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.941714 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1200cd1d-62f0-4aeb-b1b7-bb4db488e3ef-operator-scripts\") pod \"nova-cell1-db-create-mlrgv\" (UID: \"1200cd1d-62f0-4aeb-b1b7-bb4db488e3ef\") " pod="openstack/nova-cell1-db-create-mlrgv" Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.941862 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06c6acd8-4187-4ca0-ba38-0035df2f3d0c-operator-scripts\") pod \"nova-cell0-db-create-n4crd\" (UID: 
\"06c6acd8-4187-4ca0-ba38-0035df2f3d0c\") " pod="openstack/nova-cell0-db-create-n4crd" Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.941900 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vk2jw\" (UniqueName: \"kubernetes.io/projected/504f9845-df1d-48a7-badf-ea8ed99ff8a5-kube-api-access-vk2jw\") pod \"nova-api-26d7-account-create-update-ms2gh\" (UID: \"504f9845-df1d-48a7-badf-ea8ed99ff8a5\") " pod="openstack/nova-api-26d7-account-create-update-ms2gh" Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.942040 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9z4v9\" (UniqueName: \"kubernetes.io/projected/06c6acd8-4187-4ca0-ba38-0035df2f3d0c-kube-api-access-9z4v9\") pod \"nova-cell0-db-create-n4crd\" (UID: \"06c6acd8-4187-4ca0-ba38-0035df2f3d0c\") " pod="openstack/nova-cell0-db-create-n4crd" Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.944745 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/504f9845-df1d-48a7-badf-ea8ed99ff8a5-operator-scripts\") pod \"nova-api-26d7-account-create-update-ms2gh\" (UID: \"504f9845-df1d-48a7-badf-ea8ed99ff8a5\") " pod="openstack/nova-api-26d7-account-create-update-ms2gh" Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.945255 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06c6acd8-4187-4ca0-ba38-0035df2f3d0c-operator-scripts\") pod \"nova-cell0-db-create-n4crd\" (UID: \"06c6acd8-4187-4ca0-ba38-0035df2f3d0c\") " pod="openstack/nova-cell0-db-create-n4crd" Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.962301 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk2jw\" (UniqueName: \"kubernetes.io/projected/504f9845-df1d-48a7-badf-ea8ed99ff8a5-kube-api-access-vk2jw\") pod 
\"nova-api-26d7-account-create-update-ms2gh\" (UID: \"504f9845-df1d-48a7-badf-ea8ed99ff8a5\") " pod="openstack/nova-api-26d7-account-create-update-ms2gh" Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.964127 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z4v9\" (UniqueName: \"kubernetes.io/projected/06c6acd8-4187-4ca0-ba38-0035df2f3d0c-kube-api-access-9z4v9\") pod \"nova-cell0-db-create-n4crd\" (UID: \"06c6acd8-4187-4ca0-ba38-0035df2f3d0c\") " pod="openstack/nova-cell0-db-create-n4crd" Mar 12 17:09:45 crc kubenswrapper[5184]: I0312 17:09:45.973362 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-n4crd" Mar 12 17:09:46 crc kubenswrapper[5184]: I0312 17:09:46.038721 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-26d7-account-create-update-ms2gh" Mar 12 17:09:46 crc kubenswrapper[5184]: I0312 17:09:46.045581 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ds2l\" (UniqueName: \"kubernetes.io/projected/712022b6-f003-4d70-bb26-978c06c35480-kube-api-access-4ds2l\") pod \"nova-cell0-7e8d-account-create-update-bqsxc\" (UID: \"712022b6-f003-4d70-bb26-978c06c35480\") " pod="openstack/nova-cell0-7e8d-account-create-update-bqsxc" Mar 12 17:09:46 crc kubenswrapper[5184]: I0312 17:09:46.047151 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/712022b6-f003-4d70-bb26-978c06c35480-operator-scripts\") pod \"nova-cell0-7e8d-account-create-update-bqsxc\" (UID: \"712022b6-f003-4d70-bb26-978c06c35480\") " pod="openstack/nova-cell0-7e8d-account-create-update-bqsxc" Mar 12 17:09:46 crc kubenswrapper[5184]: I0312 17:09:46.047203 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jt67w\" 
(UniqueName: \"kubernetes.io/projected/1200cd1d-62f0-4aeb-b1b7-bb4db488e3ef-kube-api-access-jt67w\") pod \"nova-cell1-db-create-mlrgv\" (UID: \"1200cd1d-62f0-4aeb-b1b7-bb4db488e3ef\") " pod="openstack/nova-cell1-db-create-mlrgv" Mar 12 17:09:46 crc kubenswrapper[5184]: I0312 17:09:46.047485 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1200cd1d-62f0-4aeb-b1b7-bb4db488e3ef-operator-scripts\") pod \"nova-cell1-db-create-mlrgv\" (UID: \"1200cd1d-62f0-4aeb-b1b7-bb4db488e3ef\") " pod="openstack/nova-cell1-db-create-mlrgv" Mar 12 17:09:46 crc kubenswrapper[5184]: I0312 17:09:46.048293 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1200cd1d-62f0-4aeb-b1b7-bb4db488e3ef-operator-scripts\") pod \"nova-cell1-db-create-mlrgv\" (UID: \"1200cd1d-62f0-4aeb-b1b7-bb4db488e3ef\") " pod="openstack/nova-cell1-db-create-mlrgv" Mar 12 17:09:46 crc kubenswrapper[5184]: I0312 17:09:46.076563 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-152a-account-create-update-qpqrz"] Mar 12 17:09:46 crc kubenswrapper[5184]: I0312 17:09:46.085484 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt67w\" (UniqueName: \"kubernetes.io/projected/1200cd1d-62f0-4aeb-b1b7-bb4db488e3ef-kube-api-access-jt67w\") pod \"nova-cell1-db-create-mlrgv\" (UID: \"1200cd1d-62f0-4aeb-b1b7-bb4db488e3ef\") " pod="openstack/nova-cell1-db-create-mlrgv" Mar 12 17:09:46 crc kubenswrapper[5184]: I0312 17:09:46.092522 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-152a-account-create-update-qpqrz" Mar 12 17:09:46 crc kubenswrapper[5184]: I0312 17:09:46.095345 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell1-db-secret\"" Mar 12 17:09:46 crc kubenswrapper[5184]: I0312 17:09:46.102931 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-152a-account-create-update-qpqrz"] Mar 12 17:09:46 crc kubenswrapper[5184]: I0312 17:09:46.156218 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/712022b6-f003-4d70-bb26-978c06c35480-operator-scripts\") pod \"nova-cell0-7e8d-account-create-update-bqsxc\" (UID: \"712022b6-f003-4d70-bb26-978c06c35480\") " pod="openstack/nova-cell0-7e8d-account-create-update-bqsxc" Mar 12 17:09:46 crc kubenswrapper[5184]: I0312 17:09:46.156360 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4e00da5-efff-455c-b1a5-63ce04f03c55-operator-scripts\") pod \"nova-cell1-152a-account-create-update-qpqrz\" (UID: \"c4e00da5-efff-455c-b1a5-63ce04f03c55\") " pod="openstack/nova-cell1-152a-account-create-update-qpqrz" Mar 12 17:09:46 crc kubenswrapper[5184]: I0312 17:09:46.156504 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7qwg\" (UniqueName: \"kubernetes.io/projected/c4e00da5-efff-455c-b1a5-63ce04f03c55-kube-api-access-z7qwg\") pod \"nova-cell1-152a-account-create-update-qpqrz\" (UID: \"c4e00da5-efff-455c-b1a5-63ce04f03c55\") " pod="openstack/nova-cell1-152a-account-create-update-qpqrz" Mar 12 17:09:46 crc kubenswrapper[5184]: I0312 17:09:46.156685 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4ds2l\" (UniqueName: 
\"kubernetes.io/projected/712022b6-f003-4d70-bb26-978c06c35480-kube-api-access-4ds2l\") pod \"nova-cell0-7e8d-account-create-update-bqsxc\" (UID: \"712022b6-f003-4d70-bb26-978c06c35480\") " pod="openstack/nova-cell0-7e8d-account-create-update-bqsxc" Mar 12 17:09:46 crc kubenswrapper[5184]: I0312 17:09:46.157726 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/712022b6-f003-4d70-bb26-978c06c35480-operator-scripts\") pod \"nova-cell0-7e8d-account-create-update-bqsxc\" (UID: \"712022b6-f003-4d70-bb26-978c06c35480\") " pod="openstack/nova-cell0-7e8d-account-create-update-bqsxc" Mar 12 17:09:46 crc kubenswrapper[5184]: I0312 17:09:46.183414 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ds2l\" (UniqueName: \"kubernetes.io/projected/712022b6-f003-4d70-bb26-978c06c35480-kube-api-access-4ds2l\") pod \"nova-cell0-7e8d-account-create-update-bqsxc\" (UID: \"712022b6-f003-4d70-bb26-978c06c35480\") " pod="openstack/nova-cell0-7e8d-account-create-update-bqsxc" Mar 12 17:09:46 crc kubenswrapper[5184]: I0312 17:09:46.260819 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4e00da5-efff-455c-b1a5-63ce04f03c55-operator-scripts\") pod \"nova-cell1-152a-account-create-update-qpqrz\" (UID: \"c4e00da5-efff-455c-b1a5-63ce04f03c55\") " pod="openstack/nova-cell1-152a-account-create-update-qpqrz" Mar 12 17:09:46 crc kubenswrapper[5184]: I0312 17:09:46.260881 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z7qwg\" (UniqueName: \"kubernetes.io/projected/c4e00da5-efff-455c-b1a5-63ce04f03c55-kube-api-access-z7qwg\") pod \"nova-cell1-152a-account-create-update-qpqrz\" (UID: \"c4e00da5-efff-455c-b1a5-63ce04f03c55\") " pod="openstack/nova-cell1-152a-account-create-update-qpqrz" Mar 12 17:09:46 crc kubenswrapper[5184]: I0312 
17:09:46.262034 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4e00da5-efff-455c-b1a5-63ce04f03c55-operator-scripts\") pod \"nova-cell1-152a-account-create-update-qpqrz\" (UID: \"c4e00da5-efff-455c-b1a5-63ce04f03c55\") " pod="openstack/nova-cell1-152a-account-create-update-qpqrz" Mar 12 17:09:46 crc kubenswrapper[5184]: I0312 17:09:46.296137 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7qwg\" (UniqueName: \"kubernetes.io/projected/c4e00da5-efff-455c-b1a5-63ce04f03c55-kube-api-access-z7qwg\") pod \"nova-cell1-152a-account-create-update-qpqrz\" (UID: \"c4e00da5-efff-455c-b1a5-63ce04f03c55\") " pod="openstack/nova-cell1-152a-account-create-update-qpqrz" Mar 12 17:09:46 crc kubenswrapper[5184]: I0312 17:09:46.334697 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-mlrgv" Mar 12 17:09:46 crc kubenswrapper[5184]: I0312 17:09:46.370064 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7e8d-account-create-update-bqsxc" Mar 12 17:09:46 crc kubenswrapper[5184]: I0312 17:09:46.414700 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-152a-account-create-update-qpqrz" Mar 12 17:09:46 crc kubenswrapper[5184]: I0312 17:09:46.480361 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-hmwnz"] Mar 12 17:09:46 crc kubenswrapper[5184]: I0312 17:09:46.599038 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-n4crd"] Mar 12 17:09:46 crc kubenswrapper[5184]: I0312 17:09:46.705073 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-26d7-account-create-update-ms2gh"] Mar 12 17:09:46 crc kubenswrapper[5184]: I0312 17:09:46.900808 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hmwnz" event={"ID":"a4884f23-d147-46ea-a562-1a772dbd1c21","Type":"ContainerStarted","Data":"a9d34989e9aac211e813befc0020b796a763a7776fd6f9356a325dbea4fe8d30"} Mar 12 17:09:46 crc kubenswrapper[5184]: I0312 17:09:46.904521 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecc85f3c-6fc0-4331-bdbc-e457308457f8","Type":"ContainerStarted","Data":"56d22bbf3f40965e96cc675bf49be9c40d1da54c253da4173ad2f66807361ba3"} Mar 12 17:09:46 crc kubenswrapper[5184]: I0312 17:09:46.904686 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ecc85f3c-6fc0-4331-bdbc-e457308457f8" containerName="ceilometer-central-agent" containerID="cri-o://3d025cfaf1e78876a5645d2d39ea1566a6aed546e79685bf9fe57906b864011b" gracePeriod=30 Mar 12 17:09:46 crc kubenswrapper[5184]: I0312 17:09:46.904842 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/ceilometer-0" Mar 12 17:09:46 crc kubenswrapper[5184]: I0312 17:09:46.904936 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ecc85f3c-6fc0-4331-bdbc-e457308457f8" containerName="proxy-httpd" 
containerID="cri-o://56d22bbf3f40965e96cc675bf49be9c40d1da54c253da4173ad2f66807361ba3" gracePeriod=30 Mar 12 17:09:46 crc kubenswrapper[5184]: I0312 17:09:46.904981 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ecc85f3c-6fc0-4331-bdbc-e457308457f8" containerName="sg-core" containerID="cri-o://7e8ed75aaa4e450fa247ecad91f692cf667a4835a190d87ae475c3e0ba34889f" gracePeriod=30 Mar 12 17:09:46 crc kubenswrapper[5184]: I0312 17:09:46.905013 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ecc85f3c-6fc0-4331-bdbc-e457308457f8" containerName="ceilometer-notification-agent" containerID="cri-o://14a1738d1a6cbc36e1384e9a0986984c6ae139030e46a34a115e241fff5a5871" gracePeriod=30 Mar 12 17:09:46 crc kubenswrapper[5184]: I0312 17:09:46.910197 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-n4crd" event={"ID":"06c6acd8-4187-4ca0-ba38-0035df2f3d0c","Type":"ContainerStarted","Data":"d7d2a4bc1482dc5f71b5f3ddb99c0987874a12d6747b811b33154e920994a870"} Mar 12 17:09:46 crc kubenswrapper[5184]: I0312 17:09:46.918599 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-26d7-account-create-update-ms2gh" event={"ID":"504f9845-df1d-48a7-badf-ea8ed99ff8a5","Type":"ContainerStarted","Data":"adca65a45c0e8e85c40e5e0d658f5815de232c4fdf47463acfe7c19a97fce1d1"} Mar 12 17:09:46 crc kubenswrapper[5184]: I0312 17:09:46.933196 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-mlrgv"] Mar 12 17:09:46 crc kubenswrapper[5184]: I0312 17:09:46.937767 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.793726723 podStartE2EDuration="6.937747183s" podCreationTimestamp="2026-03-12 17:09:40 +0000 UTC" firstStartedPulling="2026-03-12 17:09:41.609971925 +0000 UTC m=+1124.151283264" 
lastFinishedPulling="2026-03-12 17:09:45.753992385 +0000 UTC m=+1128.295303724" observedRunningTime="2026-03-12 17:09:46.931919659 +0000 UTC m=+1129.473230998" watchObservedRunningTime="2026-03-12 17:09:46.937747183 +0000 UTC m=+1129.479058522" Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.159043 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-152a-account-create-update-qpqrz"] Mar 12 17:09:47 crc kubenswrapper[5184]: W0312 17:09:47.173041 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4e00da5_efff_455c_b1a5_63ce04f03c55.slice/crio-20488757fe24723bce322d782861efac8054faf32ae75b49ddeec5518f4340bd WatchSource:0}: Error finding container 20488757fe24723bce322d782861efac8054faf32ae75b49ddeec5518f4340bd: Status 404 returned error can't find the container with id 20488757fe24723bce322d782861efac8054faf32ae75b49ddeec5518f4340bd Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.294041 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-7e8d-account-create-update-bqsxc"] Mar 12 17:09:47 crc kubenswrapper[5184]: W0312 17:09:47.387582 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod712022b6_f003_4d70_bb26_978c06c35480.slice/crio-610479ae726d272fba41a2afe8d856ac3df44a4201ee7957f1d772b4fcda95e3 WatchSource:0}: Error finding container 610479ae726d272fba41a2afe8d856ac3df44a4201ee7957f1d772b4fcda95e3: Status 404 returned error can't find the container with id 610479ae726d272fba41a2afe8d856ac3df44a4201ee7957f1d772b4fcda95e3 Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.729081 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.796608 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecc85f3c-6fc0-4331-bdbc-e457308457f8-scripts\") pod \"ecc85f3c-6fc0-4331-bdbc-e457308457f8\" (UID: \"ecc85f3c-6fc0-4331-bdbc-e457308457f8\") " Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.796878 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecc85f3c-6fc0-4331-bdbc-e457308457f8-combined-ca-bundle\") pod \"ecc85f3c-6fc0-4331-bdbc-e457308457f8\" (UID: \"ecc85f3c-6fc0-4331-bdbc-e457308457f8\") " Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.796946 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecc85f3c-6fc0-4331-bdbc-e457308457f8-log-httpd\") pod \"ecc85f3c-6fc0-4331-bdbc-e457308457f8\" (UID: \"ecc85f3c-6fc0-4331-bdbc-e457308457f8\") " Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.796979 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tck5f\" (UniqueName: \"kubernetes.io/projected/ecc85f3c-6fc0-4331-bdbc-e457308457f8-kube-api-access-tck5f\") pod \"ecc85f3c-6fc0-4331-bdbc-e457308457f8\" (UID: \"ecc85f3c-6fc0-4331-bdbc-e457308457f8\") " Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.797005 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ecc85f3c-6fc0-4331-bdbc-e457308457f8-sg-core-conf-yaml\") pod \"ecc85f3c-6fc0-4331-bdbc-e457308457f8\" (UID: \"ecc85f3c-6fc0-4331-bdbc-e457308457f8\") " Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.797066 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ecc85f3c-6fc0-4331-bdbc-e457308457f8-config-data\") pod \"ecc85f3c-6fc0-4331-bdbc-e457308457f8\" (UID: \"ecc85f3c-6fc0-4331-bdbc-e457308457f8\") " Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.797120 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecc85f3c-6fc0-4331-bdbc-e457308457f8-run-httpd\") pod \"ecc85f3c-6fc0-4331-bdbc-e457308457f8\" (UID: \"ecc85f3c-6fc0-4331-bdbc-e457308457f8\") " Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.799948 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecc85f3c-6fc0-4331-bdbc-e457308457f8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ecc85f3c-6fc0-4331-bdbc-e457308457f8" (UID: "ecc85f3c-6fc0-4331-bdbc-e457308457f8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.800043 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecc85f3c-6fc0-4331-bdbc-e457308457f8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ecc85f3c-6fc0-4331-bdbc-e457308457f8" (UID: "ecc85f3c-6fc0-4331-bdbc-e457308457f8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.805816 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecc85f3c-6fc0-4331-bdbc-e457308457f8-scripts" (OuterVolumeSpecName: "scripts") pod "ecc85f3c-6fc0-4331-bdbc-e457308457f8" (UID: "ecc85f3c-6fc0-4331-bdbc-e457308457f8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.812548 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecc85f3c-6fc0-4331-bdbc-e457308457f8-kube-api-access-tck5f" (OuterVolumeSpecName: "kube-api-access-tck5f") pod "ecc85f3c-6fc0-4331-bdbc-e457308457f8" (UID: "ecc85f3c-6fc0-4331-bdbc-e457308457f8"). InnerVolumeSpecName "kube-api-access-tck5f". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.833171 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecc85f3c-6fc0-4331-bdbc-e457308457f8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ecc85f3c-6fc0-4331-bdbc-e457308457f8" (UID: "ecc85f3c-6fc0-4331-bdbc-e457308457f8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.883893 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecc85f3c-6fc0-4331-bdbc-e457308457f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ecc85f3c-6fc0-4331-bdbc-e457308457f8" (UID: "ecc85f3c-6fc0-4331-bdbc-e457308457f8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.901851 5184 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecc85f3c-6fc0-4331-bdbc-e457308457f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.901896 5184 reconciler_common.go:299] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecc85f3c-6fc0-4331-bdbc-e457308457f8-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.901912 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tck5f\" (UniqueName: \"kubernetes.io/projected/ecc85f3c-6fc0-4331-bdbc-e457308457f8-kube-api-access-tck5f\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.901924 5184 reconciler_common.go:299] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ecc85f3c-6fc0-4331-bdbc-e457308457f8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.901932 5184 reconciler_common.go:299] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecc85f3c-6fc0-4331-bdbc-e457308457f8-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.901944 5184 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecc85f3c-6fc0-4331-bdbc-e457308457f8-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.919900 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecc85f3c-6fc0-4331-bdbc-e457308457f8-config-data" (OuterVolumeSpecName: "config-data") pod "ecc85f3c-6fc0-4331-bdbc-e457308457f8" (UID: "ecc85f3c-6fc0-4331-bdbc-e457308457f8"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.952775 5184 generic.go:358] "Generic (PLEG): container finished" podID="ecc85f3c-6fc0-4331-bdbc-e457308457f8" containerID="56d22bbf3f40965e96cc675bf49be9c40d1da54c253da4173ad2f66807361ba3" exitCode=0 Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.952816 5184 generic.go:358] "Generic (PLEG): container finished" podID="ecc85f3c-6fc0-4331-bdbc-e457308457f8" containerID="7e8ed75aaa4e450fa247ecad91f692cf667a4835a190d87ae475c3e0ba34889f" exitCode=2 Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.952826 5184 generic.go:358] "Generic (PLEG): container finished" podID="ecc85f3c-6fc0-4331-bdbc-e457308457f8" containerID="14a1738d1a6cbc36e1384e9a0986984c6ae139030e46a34a115e241fff5a5871" exitCode=0 Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.952833 5184 generic.go:358] "Generic (PLEG): container finished" podID="ecc85f3c-6fc0-4331-bdbc-e457308457f8" containerID="3d025cfaf1e78876a5645d2d39ea1566a6aed546e79685bf9fe57906b864011b" exitCode=0 Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.953023 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecc85f3c-6fc0-4331-bdbc-e457308457f8","Type":"ContainerDied","Data":"56d22bbf3f40965e96cc675bf49be9c40d1da54c253da4173ad2f66807361ba3"} Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.953067 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecc85f3c-6fc0-4331-bdbc-e457308457f8","Type":"ContainerDied","Data":"7e8ed75aaa4e450fa247ecad91f692cf667a4835a190d87ae475c3e0ba34889f"} Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.953085 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecc85f3c-6fc0-4331-bdbc-e457308457f8","Type":"ContainerDied","Data":"14a1738d1a6cbc36e1384e9a0986984c6ae139030e46a34a115e241fff5a5871"} 
Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.953096 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecc85f3c-6fc0-4331-bdbc-e457308457f8","Type":"ContainerDied","Data":"3d025cfaf1e78876a5645d2d39ea1566a6aed546e79685bf9fe57906b864011b"} Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.953107 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecc85f3c-6fc0-4331-bdbc-e457308457f8","Type":"ContainerDied","Data":"7bdac34b2319c37699d076a94c4c93f0643336ca8aec3229409c86270a277b1a"} Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.953125 5184 scope.go:117] "RemoveContainer" containerID="56d22bbf3f40965e96cc675bf49be9c40d1da54c253da4173ad2f66807361ba3" Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.953335 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.968862 5184 generic.go:358] "Generic (PLEG): container finished" podID="c4e00da5-efff-455c-b1a5-63ce04f03c55" containerID="230d9d524a767785af5ad2d095b8dff4b409d7d0584498d2fadf7203b751c2d9" exitCode=0 Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.969113 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-152a-account-create-update-qpqrz" event={"ID":"c4e00da5-efff-455c-b1a5-63ce04f03c55","Type":"ContainerDied","Data":"230d9d524a767785af5ad2d095b8dff4b409d7d0584498d2fadf7203b751c2d9"} Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.969144 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-152a-account-create-update-qpqrz" event={"ID":"c4e00da5-efff-455c-b1a5-63ce04f03c55","Type":"ContainerStarted","Data":"20488757fe24723bce322d782861efac8054faf32ae75b49ddeec5518f4340bd"} Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.990297 5184 generic.go:358] "Generic (PLEG): container finished" 
podID="06c6acd8-4187-4ca0-ba38-0035df2f3d0c" containerID="1d6d4a170aec1af79b1e21bb010771337c81d445023e4689a46e58b41e30d707" exitCode=0 Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.990721 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-n4crd" event={"ID":"06c6acd8-4187-4ca0-ba38-0035df2f3d0c","Type":"ContainerDied","Data":"1d6d4a170aec1af79b1e21bb010771337c81d445023e4689a46e58b41e30d707"} Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.992858 5184 generic.go:358] "Generic (PLEG): container finished" podID="1200cd1d-62f0-4aeb-b1b7-bb4db488e3ef" containerID="e688ebc45cb33ef5815d84d4e5465b82de772dcd29933526acfdf86706fc333e" exitCode=0 Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.992938 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-mlrgv" event={"ID":"1200cd1d-62f0-4aeb-b1b7-bb4db488e3ef","Type":"ContainerDied","Data":"e688ebc45cb33ef5815d84d4e5465b82de772dcd29933526acfdf86706fc333e"} Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.992964 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-mlrgv" event={"ID":"1200cd1d-62f0-4aeb-b1b7-bb4db488e3ef","Type":"ContainerStarted","Data":"f015cab2057eb1f9fc90e343c596aad81f265ec5f633421029f31dd97557a6d8"} Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.994586 5184 generic.go:358] "Generic (PLEG): container finished" podID="504f9845-df1d-48a7-badf-ea8ed99ff8a5" containerID="c73147534990a5f5924559fb3d15c0624e0be82b7c3fa440913e8452f0cabac3" exitCode=0 Mar 12 17:09:47 crc kubenswrapper[5184]: I0312 17:09:47.994657 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-26d7-account-create-update-ms2gh" event={"ID":"504f9845-df1d-48a7-badf-ea8ed99ff8a5","Type":"ContainerDied","Data":"c73147534990a5f5924559fb3d15c0624e0be82b7c3fa440913e8452f0cabac3"} Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.001275 5184 generic.go:358] "Generic 
(PLEG): container finished" podID="a4884f23-d147-46ea-a562-1a772dbd1c21" containerID="60dd861274f121fd40ead82085684d0b9356e4bf2b7d30cf355b408cfb655b9b" exitCode=0 Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.002501 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hmwnz" event={"ID":"a4884f23-d147-46ea-a562-1a772dbd1c21","Type":"ContainerDied","Data":"60dd861274f121fd40ead82085684d0b9356e4bf2b7d30cf355b408cfb655b9b"} Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.004571 5184 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecc85f3c-6fc0-4331-bdbc-e457308457f8-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.006035 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7e8d-account-create-update-bqsxc" event={"ID":"712022b6-f003-4d70-bb26-978c06c35480","Type":"ContainerStarted","Data":"7a5d9d7da184cc41d907b47d6a4ff6d494e923c3052137edfdca564769ae4879"} Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.006071 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7e8d-account-create-update-bqsxc" event={"ID":"712022b6-f003-4d70-bb26-978c06c35480","Type":"ContainerStarted","Data":"610479ae726d272fba41a2afe8d856ac3df44a4201ee7957f1d772b4fcda95e3"} Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.125278 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.133521 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.140358 5184 scope.go:117] "RemoveContainer" containerID="7e8ed75aaa4e450fa247ecad91f692cf667a4835a190d87ae475c3e0ba34889f" Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.154730 5184 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openstack/ceilometer-0"] Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.155878 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ecc85f3c-6fc0-4331-bdbc-e457308457f8" containerName="ceilometer-notification-agent" Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.155901 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecc85f3c-6fc0-4331-bdbc-e457308457f8" containerName="ceilometer-notification-agent" Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.155921 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ecc85f3c-6fc0-4331-bdbc-e457308457f8" containerName="sg-core" Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.155929 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecc85f3c-6fc0-4331-bdbc-e457308457f8" containerName="sg-core" Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.155941 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ecc85f3c-6fc0-4331-bdbc-e457308457f8" containerName="ceilometer-central-agent" Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.155948 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecc85f3c-6fc0-4331-bdbc-e457308457f8" containerName="ceilometer-central-agent" Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.155966 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ecc85f3c-6fc0-4331-bdbc-e457308457f8" containerName="proxy-httpd" Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.155973 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecc85f3c-6fc0-4331-bdbc-e457308457f8" containerName="proxy-httpd" Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.156174 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="ecc85f3c-6fc0-4331-bdbc-e457308457f8" containerName="proxy-httpd" Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.156198 5184 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="ecc85f3c-6fc0-4331-bdbc-e457308457f8" containerName="ceilometer-notification-agent" Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.156216 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="ecc85f3c-6fc0-4331-bdbc-e457308457f8" containerName="ceilometer-central-agent" Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.156230 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="ecc85f3c-6fc0-4331-bdbc-e457308457f8" containerName="sg-core" Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.165850 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.172437 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ceilometer-scripts\"" Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.173305 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ceilometer-config-data\"" Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.179115 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.206393 5184 scope.go:117] "RemoveContainer" containerID="14a1738d1a6cbc36e1384e9a0986984c6ae139030e46a34a115e241fff5a5871" Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.209509 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/592f368e-3812-49cf-9dbf-60c7b5d95a48-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"592f368e-3812-49cf-9dbf-60c7b5d95a48\") " pod="openstack/ceilometer-0" Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.209579 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/592f368e-3812-49cf-9dbf-60c7b5d95a48-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"592f368e-3812-49cf-9dbf-60c7b5d95a48\") " pod="openstack/ceilometer-0" Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.209629 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/592f368e-3812-49cf-9dbf-60c7b5d95a48-log-httpd\") pod \"ceilometer-0\" (UID: \"592f368e-3812-49cf-9dbf-60c7b5d95a48\") " pod="openstack/ceilometer-0" Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.209669 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/592f368e-3812-49cf-9dbf-60c7b5d95a48-run-httpd\") pod \"ceilometer-0\" (UID: \"592f368e-3812-49cf-9dbf-60c7b5d95a48\") " pod="openstack/ceilometer-0" Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.209695 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/592f368e-3812-49cf-9dbf-60c7b5d95a48-config-data\") pod \"ceilometer-0\" (UID: \"592f368e-3812-49cf-9dbf-60c7b5d95a48\") " pod="openstack/ceilometer-0" Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.209829 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzfxq\" (UniqueName: \"kubernetes.io/projected/592f368e-3812-49cf-9dbf-60c7b5d95a48-kube-api-access-bzfxq\") pod \"ceilometer-0\" (UID: \"592f368e-3812-49cf-9dbf-60c7b5d95a48\") " pod="openstack/ceilometer-0" Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.209910 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/592f368e-3812-49cf-9dbf-60c7b5d95a48-scripts\") pod \"ceilometer-0\" (UID: \"592f368e-3812-49cf-9dbf-60c7b5d95a48\") 
" pod="openstack/ceilometer-0" Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.227153 5184 scope.go:117] "RemoveContainer" containerID="3d025cfaf1e78876a5645d2d39ea1566a6aed546e79685bf9fe57906b864011b" Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.252990 5184 scope.go:117] "RemoveContainer" containerID="56d22bbf3f40965e96cc675bf49be9c40d1da54c253da4173ad2f66807361ba3" Mar 12 17:09:48 crc kubenswrapper[5184]: E0312 17:09:48.253436 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56d22bbf3f40965e96cc675bf49be9c40d1da54c253da4173ad2f66807361ba3\": container with ID starting with 56d22bbf3f40965e96cc675bf49be9c40d1da54c253da4173ad2f66807361ba3 not found: ID does not exist" containerID="56d22bbf3f40965e96cc675bf49be9c40d1da54c253da4173ad2f66807361ba3" Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.253478 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56d22bbf3f40965e96cc675bf49be9c40d1da54c253da4173ad2f66807361ba3"} err="failed to get container status \"56d22bbf3f40965e96cc675bf49be9c40d1da54c253da4173ad2f66807361ba3\": rpc error: code = NotFound desc = could not find container \"56d22bbf3f40965e96cc675bf49be9c40d1da54c253da4173ad2f66807361ba3\": container with ID starting with 56d22bbf3f40965e96cc675bf49be9c40d1da54c253da4173ad2f66807361ba3 not found: ID does not exist" Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.253502 5184 scope.go:117] "RemoveContainer" containerID="7e8ed75aaa4e450fa247ecad91f692cf667a4835a190d87ae475c3e0ba34889f" Mar 12 17:09:48 crc kubenswrapper[5184]: E0312 17:09:48.254274 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e8ed75aaa4e450fa247ecad91f692cf667a4835a190d87ae475c3e0ba34889f\": container with ID starting with 7e8ed75aaa4e450fa247ecad91f692cf667a4835a190d87ae475c3e0ba34889f not found: ID 
does not exist" containerID="7e8ed75aaa4e450fa247ecad91f692cf667a4835a190d87ae475c3e0ba34889f" Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.254386 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e8ed75aaa4e450fa247ecad91f692cf667a4835a190d87ae475c3e0ba34889f"} err="failed to get container status \"7e8ed75aaa4e450fa247ecad91f692cf667a4835a190d87ae475c3e0ba34889f\": rpc error: code = NotFound desc = could not find container \"7e8ed75aaa4e450fa247ecad91f692cf667a4835a190d87ae475c3e0ba34889f\": container with ID starting with 7e8ed75aaa4e450fa247ecad91f692cf667a4835a190d87ae475c3e0ba34889f not found: ID does not exist" Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.254416 5184 scope.go:117] "RemoveContainer" containerID="14a1738d1a6cbc36e1384e9a0986984c6ae139030e46a34a115e241fff5a5871" Mar 12 17:09:48 crc kubenswrapper[5184]: E0312 17:09:48.254687 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14a1738d1a6cbc36e1384e9a0986984c6ae139030e46a34a115e241fff5a5871\": container with ID starting with 14a1738d1a6cbc36e1384e9a0986984c6ae139030e46a34a115e241fff5a5871 not found: ID does not exist" containerID="14a1738d1a6cbc36e1384e9a0986984c6ae139030e46a34a115e241fff5a5871" Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.254714 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14a1738d1a6cbc36e1384e9a0986984c6ae139030e46a34a115e241fff5a5871"} err="failed to get container status \"14a1738d1a6cbc36e1384e9a0986984c6ae139030e46a34a115e241fff5a5871\": rpc error: code = NotFound desc = could not find container \"14a1738d1a6cbc36e1384e9a0986984c6ae139030e46a34a115e241fff5a5871\": container with ID starting with 14a1738d1a6cbc36e1384e9a0986984c6ae139030e46a34a115e241fff5a5871 not found: ID does not exist" Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.254731 5184 
scope.go:117] "RemoveContainer" containerID="3d025cfaf1e78876a5645d2d39ea1566a6aed546e79685bf9fe57906b864011b" Mar 12 17:09:48 crc kubenswrapper[5184]: E0312 17:09:48.255234 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d025cfaf1e78876a5645d2d39ea1566a6aed546e79685bf9fe57906b864011b\": container with ID starting with 3d025cfaf1e78876a5645d2d39ea1566a6aed546e79685bf9fe57906b864011b not found: ID does not exist" containerID="3d025cfaf1e78876a5645d2d39ea1566a6aed546e79685bf9fe57906b864011b" Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.255265 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d025cfaf1e78876a5645d2d39ea1566a6aed546e79685bf9fe57906b864011b"} err="failed to get container status \"3d025cfaf1e78876a5645d2d39ea1566a6aed546e79685bf9fe57906b864011b\": rpc error: code = NotFound desc = could not find container \"3d025cfaf1e78876a5645d2d39ea1566a6aed546e79685bf9fe57906b864011b\": container with ID starting with 3d025cfaf1e78876a5645d2d39ea1566a6aed546e79685bf9fe57906b864011b not found: ID does not exist" Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.255286 5184 scope.go:117] "RemoveContainer" containerID="56d22bbf3f40965e96cc675bf49be9c40d1da54c253da4173ad2f66807361ba3" Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.255820 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56d22bbf3f40965e96cc675bf49be9c40d1da54c253da4173ad2f66807361ba3"} err="failed to get container status \"56d22bbf3f40965e96cc675bf49be9c40d1da54c253da4173ad2f66807361ba3\": rpc error: code = NotFound desc = could not find container \"56d22bbf3f40965e96cc675bf49be9c40d1da54c253da4173ad2f66807361ba3\": container with ID starting with 56d22bbf3f40965e96cc675bf49be9c40d1da54c253da4173ad2f66807361ba3 not found: ID does not exist" Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 
17:09:48.255843 5184 scope.go:117] "RemoveContainer" containerID="7e8ed75aaa4e450fa247ecad91f692cf667a4835a190d87ae475c3e0ba34889f"
Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.256090 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e8ed75aaa4e450fa247ecad91f692cf667a4835a190d87ae475c3e0ba34889f"} err="failed to get container status \"7e8ed75aaa4e450fa247ecad91f692cf667a4835a190d87ae475c3e0ba34889f\": rpc error: code = NotFound desc = could not find container \"7e8ed75aaa4e450fa247ecad91f692cf667a4835a190d87ae475c3e0ba34889f\": container with ID starting with 7e8ed75aaa4e450fa247ecad91f692cf667a4835a190d87ae475c3e0ba34889f not found: ID does not exist"
Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.256112 5184 scope.go:117] "RemoveContainer" containerID="14a1738d1a6cbc36e1384e9a0986984c6ae139030e46a34a115e241fff5a5871"
Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.256505 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14a1738d1a6cbc36e1384e9a0986984c6ae139030e46a34a115e241fff5a5871"} err="failed to get container status \"14a1738d1a6cbc36e1384e9a0986984c6ae139030e46a34a115e241fff5a5871\": rpc error: code = NotFound desc = could not find container \"14a1738d1a6cbc36e1384e9a0986984c6ae139030e46a34a115e241fff5a5871\": container with ID starting with 14a1738d1a6cbc36e1384e9a0986984c6ae139030e46a34a115e241fff5a5871 not found: ID does not exist"
Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.256529 5184 scope.go:117] "RemoveContainer" containerID="3d025cfaf1e78876a5645d2d39ea1566a6aed546e79685bf9fe57906b864011b"
Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.256825 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d025cfaf1e78876a5645d2d39ea1566a6aed546e79685bf9fe57906b864011b"} err="failed to get container status \"3d025cfaf1e78876a5645d2d39ea1566a6aed546e79685bf9fe57906b864011b\": rpc error: code = NotFound desc = could not find container \"3d025cfaf1e78876a5645d2d39ea1566a6aed546e79685bf9fe57906b864011b\": container with ID starting with 3d025cfaf1e78876a5645d2d39ea1566a6aed546e79685bf9fe57906b864011b not found: ID does not exist"
Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.256844 5184 scope.go:117] "RemoveContainer" containerID="56d22bbf3f40965e96cc675bf49be9c40d1da54c253da4173ad2f66807361ba3"
Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.257162 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56d22bbf3f40965e96cc675bf49be9c40d1da54c253da4173ad2f66807361ba3"} err="failed to get container status \"56d22bbf3f40965e96cc675bf49be9c40d1da54c253da4173ad2f66807361ba3\": rpc error: code = NotFound desc = could not find container \"56d22bbf3f40965e96cc675bf49be9c40d1da54c253da4173ad2f66807361ba3\": container with ID starting with 56d22bbf3f40965e96cc675bf49be9c40d1da54c253da4173ad2f66807361ba3 not found: ID does not exist"
Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.257188 5184 scope.go:117] "RemoveContainer" containerID="7e8ed75aaa4e450fa247ecad91f692cf667a4835a190d87ae475c3e0ba34889f"
Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.257820 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e8ed75aaa4e450fa247ecad91f692cf667a4835a190d87ae475c3e0ba34889f"} err="failed to get container status \"7e8ed75aaa4e450fa247ecad91f692cf667a4835a190d87ae475c3e0ba34889f\": rpc error: code = NotFound desc = could not find container \"7e8ed75aaa4e450fa247ecad91f692cf667a4835a190d87ae475c3e0ba34889f\": container with ID starting with 7e8ed75aaa4e450fa247ecad91f692cf667a4835a190d87ae475c3e0ba34889f not found: ID does not exist"
Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.257842 5184 scope.go:117] "RemoveContainer" containerID="14a1738d1a6cbc36e1384e9a0986984c6ae139030e46a34a115e241fff5a5871"
Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.258097 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14a1738d1a6cbc36e1384e9a0986984c6ae139030e46a34a115e241fff5a5871"} err="failed to get container status \"14a1738d1a6cbc36e1384e9a0986984c6ae139030e46a34a115e241fff5a5871\": rpc error: code = NotFound desc = could not find container \"14a1738d1a6cbc36e1384e9a0986984c6ae139030e46a34a115e241fff5a5871\": container with ID starting with 14a1738d1a6cbc36e1384e9a0986984c6ae139030e46a34a115e241fff5a5871 not found: ID does not exist"
Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.258120 5184 scope.go:117] "RemoveContainer" containerID="3d025cfaf1e78876a5645d2d39ea1566a6aed546e79685bf9fe57906b864011b"
Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.258529 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d025cfaf1e78876a5645d2d39ea1566a6aed546e79685bf9fe57906b864011b"} err="failed to get container status \"3d025cfaf1e78876a5645d2d39ea1566a6aed546e79685bf9fe57906b864011b\": rpc error: code = NotFound desc = could not find container \"3d025cfaf1e78876a5645d2d39ea1566a6aed546e79685bf9fe57906b864011b\": container with ID starting with 3d025cfaf1e78876a5645d2d39ea1566a6aed546e79685bf9fe57906b864011b not found: ID does not exist"
Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.258548 5184 scope.go:117] "RemoveContainer" containerID="56d22bbf3f40965e96cc675bf49be9c40d1da54c253da4173ad2f66807361ba3"
Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.259163 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56d22bbf3f40965e96cc675bf49be9c40d1da54c253da4173ad2f66807361ba3"} err="failed to get container status \"56d22bbf3f40965e96cc675bf49be9c40d1da54c253da4173ad2f66807361ba3\": rpc error: code = NotFound desc = could not find container \"56d22bbf3f40965e96cc675bf49be9c40d1da54c253da4173ad2f66807361ba3\": container with ID starting with 56d22bbf3f40965e96cc675bf49be9c40d1da54c253da4173ad2f66807361ba3 not found: ID does not exist"
Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.259201 5184 scope.go:117] "RemoveContainer" containerID="7e8ed75aaa4e450fa247ecad91f692cf667a4835a190d87ae475c3e0ba34889f"
Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.259563 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e8ed75aaa4e450fa247ecad91f692cf667a4835a190d87ae475c3e0ba34889f"} err="failed to get container status \"7e8ed75aaa4e450fa247ecad91f692cf667a4835a190d87ae475c3e0ba34889f\": rpc error: code = NotFound desc = could not find container \"7e8ed75aaa4e450fa247ecad91f692cf667a4835a190d87ae475c3e0ba34889f\": container with ID starting with 7e8ed75aaa4e450fa247ecad91f692cf667a4835a190d87ae475c3e0ba34889f not found: ID does not exist"
Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.259586 5184 scope.go:117] "RemoveContainer" containerID="14a1738d1a6cbc36e1384e9a0986984c6ae139030e46a34a115e241fff5a5871"
Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.259932 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14a1738d1a6cbc36e1384e9a0986984c6ae139030e46a34a115e241fff5a5871"} err="failed to get container status \"14a1738d1a6cbc36e1384e9a0986984c6ae139030e46a34a115e241fff5a5871\": rpc error: code = NotFound desc = could not find container \"14a1738d1a6cbc36e1384e9a0986984c6ae139030e46a34a115e241fff5a5871\": container with ID starting with 14a1738d1a6cbc36e1384e9a0986984c6ae139030e46a34a115e241fff5a5871 not found: ID does not exist"
Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.259956 5184 scope.go:117] "RemoveContainer" containerID="3d025cfaf1e78876a5645d2d39ea1566a6aed546e79685bf9fe57906b864011b"
Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.260540 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d025cfaf1e78876a5645d2d39ea1566a6aed546e79685bf9fe57906b864011b"} err="failed to get container status \"3d025cfaf1e78876a5645d2d39ea1566a6aed546e79685bf9fe57906b864011b\": rpc error: code = NotFound desc = could not find container \"3d025cfaf1e78876a5645d2d39ea1566a6aed546e79685bf9fe57906b864011b\": container with ID starting with 3d025cfaf1e78876a5645d2d39ea1566a6aed546e79685bf9fe57906b864011b not found: ID does not exist"
Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.311977 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bzfxq\" (UniqueName: \"kubernetes.io/projected/592f368e-3812-49cf-9dbf-60c7b5d95a48-kube-api-access-bzfxq\") pod \"ceilometer-0\" (UID: \"592f368e-3812-49cf-9dbf-60c7b5d95a48\") " pod="openstack/ceilometer-0"
Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.312090 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/592f368e-3812-49cf-9dbf-60c7b5d95a48-scripts\") pod \"ceilometer-0\" (UID: \"592f368e-3812-49cf-9dbf-60c7b5d95a48\") " pod="openstack/ceilometer-0"
Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.312169 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/592f368e-3812-49cf-9dbf-60c7b5d95a48-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"592f368e-3812-49cf-9dbf-60c7b5d95a48\") " pod="openstack/ceilometer-0"
Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.312209 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/592f368e-3812-49cf-9dbf-60c7b5d95a48-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"592f368e-3812-49cf-9dbf-60c7b5d95a48\") " pod="openstack/ceilometer-0"
Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.312286 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/592f368e-3812-49cf-9dbf-60c7b5d95a48-log-httpd\") pod \"ceilometer-0\" (UID: \"592f368e-3812-49cf-9dbf-60c7b5d95a48\") " pod="openstack/ceilometer-0"
Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.312329 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/592f368e-3812-49cf-9dbf-60c7b5d95a48-run-httpd\") pod \"ceilometer-0\" (UID: \"592f368e-3812-49cf-9dbf-60c7b5d95a48\") " pod="openstack/ceilometer-0"
Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.312351 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/592f368e-3812-49cf-9dbf-60c7b5d95a48-config-data\") pod \"ceilometer-0\" (UID: \"592f368e-3812-49cf-9dbf-60c7b5d95a48\") " pod="openstack/ceilometer-0"
Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.312955 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/592f368e-3812-49cf-9dbf-60c7b5d95a48-run-httpd\") pod \"ceilometer-0\" (UID: \"592f368e-3812-49cf-9dbf-60c7b5d95a48\") " pod="openstack/ceilometer-0"
Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.313027 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/592f368e-3812-49cf-9dbf-60c7b5d95a48-log-httpd\") pod \"ceilometer-0\" (UID: \"592f368e-3812-49cf-9dbf-60c7b5d95a48\") " pod="openstack/ceilometer-0"
Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.317498 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/592f368e-3812-49cf-9dbf-60c7b5d95a48-scripts\") pod \"ceilometer-0\" (UID: \"592f368e-3812-49cf-9dbf-60c7b5d95a48\") " pod="openstack/ceilometer-0"
Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.318017 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/592f368e-3812-49cf-9dbf-60c7b5d95a48-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"592f368e-3812-49cf-9dbf-60c7b5d95a48\") " pod="openstack/ceilometer-0"
Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.318261 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/592f368e-3812-49cf-9dbf-60c7b5d95a48-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"592f368e-3812-49cf-9dbf-60c7b5d95a48\") " pod="openstack/ceilometer-0"
Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.318815 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/592f368e-3812-49cf-9dbf-60c7b5d95a48-config-data\") pod \"ceilometer-0\" (UID: \"592f368e-3812-49cf-9dbf-60c7b5d95a48\") " pod="openstack/ceilometer-0"
Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.332133 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzfxq\" (UniqueName: \"kubernetes.io/projected/592f368e-3812-49cf-9dbf-60c7b5d95a48-kube-api-access-bzfxq\") pod \"ceilometer-0\" (UID: \"592f368e-3812-49cf-9dbf-60c7b5d95a48\") " pod="openstack/ceilometer-0"
Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.432175 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecc85f3c-6fc0-4331-bdbc-e457308457f8" path="/var/lib/kubelet/pods/ecc85f3c-6fc0-4331-bdbc-e457308457f8/volumes"
Mar 12 17:09:48 crc kubenswrapper[5184]: I0312 17:09:48.496449 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.010078 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.017409 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"592f368e-3812-49cf-9dbf-60c7b5d95a48","Type":"ContainerStarted","Data":"a033c65dff19d3698c62121f1c05062807f602f5dc9e2aee29fdd72bc3e0d2c2"}
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.020147 5184 generic.go:358] "Generic (PLEG): container finished" podID="712022b6-f003-4d70-bb26-978c06c35480" containerID="7a5d9d7da184cc41d907b47d6a4ff6d494e923c3052137edfdca564769ae4879" exitCode=0
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.020216 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7e8d-account-create-update-bqsxc" event={"ID":"712022b6-f003-4d70-bb26-978c06c35480","Type":"ContainerDied","Data":"7a5d9d7da184cc41d907b47d6a4ff6d494e923c3052137edfdca564769ae4879"}
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.430677 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-152a-account-create-update-qpqrz"
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.495052 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-mlrgv"
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.537687 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-hmwnz"
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.540184 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7qwg\" (UniqueName: \"kubernetes.io/projected/c4e00da5-efff-455c-b1a5-63ce04f03c55-kube-api-access-z7qwg\") pod \"c4e00da5-efff-455c-b1a5-63ce04f03c55\" (UID: \"c4e00da5-efff-455c-b1a5-63ce04f03c55\") "
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.540370 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1200cd1d-62f0-4aeb-b1b7-bb4db488e3ef-operator-scripts\") pod \"1200cd1d-62f0-4aeb-b1b7-bb4db488e3ef\" (UID: \"1200cd1d-62f0-4aeb-b1b7-bb4db488e3ef\") "
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.540428 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4e00da5-efff-455c-b1a5-63ce04f03c55-operator-scripts\") pod \"c4e00da5-efff-455c-b1a5-63ce04f03c55\" (UID: \"c4e00da5-efff-455c-b1a5-63ce04f03c55\") "
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.540497 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jt67w\" (UniqueName: \"kubernetes.io/projected/1200cd1d-62f0-4aeb-b1b7-bb4db488e3ef-kube-api-access-jt67w\") pod \"1200cd1d-62f0-4aeb-b1b7-bb4db488e3ef\" (UID: \"1200cd1d-62f0-4aeb-b1b7-bb4db488e3ef\") "
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.541074 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1200cd1d-62f0-4aeb-b1b7-bb4db488e3ef-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1200cd1d-62f0-4aeb-b1b7-bb4db488e3ef" (UID: "1200cd1d-62f0-4aeb-b1b7-bb4db488e3ef"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.541988 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4e00da5-efff-455c-b1a5-63ce04f03c55-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c4e00da5-efff-455c-b1a5-63ce04f03c55" (UID: "c4e00da5-efff-455c-b1a5-63ce04f03c55"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.549896 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4e00da5-efff-455c-b1a5-63ce04f03c55-kube-api-access-z7qwg" (OuterVolumeSpecName: "kube-api-access-z7qwg") pod "c4e00da5-efff-455c-b1a5-63ce04f03c55" (UID: "c4e00da5-efff-455c-b1a5-63ce04f03c55"). InnerVolumeSpecName "kube-api-access-z7qwg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.554943 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1200cd1d-62f0-4aeb-b1b7-bb4db488e3ef-kube-api-access-jt67w" (OuterVolumeSpecName: "kube-api-access-jt67w") pod "1200cd1d-62f0-4aeb-b1b7-bb4db488e3ef" (UID: "1200cd1d-62f0-4aeb-b1b7-bb4db488e3ef"). InnerVolumeSpecName "kube-api-access-jt67w". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.565725 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7e8d-account-create-update-bqsxc"
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.574791 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-n4crd"
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.610961 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-26d7-account-create-update-ms2gh"
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.668910 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ds2l\" (UniqueName: \"kubernetes.io/projected/712022b6-f003-4d70-bb26-978c06c35480-kube-api-access-4ds2l\") pod \"712022b6-f003-4d70-bb26-978c06c35480\" (UID: \"712022b6-f003-4d70-bb26-978c06c35480\") "
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.669335 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4884f23-d147-46ea-a562-1a772dbd1c21-operator-scripts\") pod \"a4884f23-d147-46ea-a562-1a772dbd1c21\" (UID: \"a4884f23-d147-46ea-a562-1a772dbd1c21\") "
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.669363 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clgwf\" (UniqueName: \"kubernetes.io/projected/a4884f23-d147-46ea-a562-1a772dbd1c21-kube-api-access-clgwf\") pod \"a4884f23-d147-46ea-a562-1a772dbd1c21\" (UID: \"a4884f23-d147-46ea-a562-1a772dbd1c21\") "
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.669415 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06c6acd8-4187-4ca0-ba38-0035df2f3d0c-operator-scripts\") pod \"06c6acd8-4187-4ca0-ba38-0035df2f3d0c\" (UID: \"06c6acd8-4187-4ca0-ba38-0035df2f3d0c\") "
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.669463 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/712022b6-f003-4d70-bb26-978c06c35480-operator-scripts\") pod \"712022b6-f003-4d70-bb26-978c06c35480\" (UID: \"712022b6-f003-4d70-bb26-978c06c35480\") "
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.669519 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9z4v9\" (UniqueName: \"kubernetes.io/projected/06c6acd8-4187-4ca0-ba38-0035df2f3d0c-kube-api-access-9z4v9\") pod \"06c6acd8-4187-4ca0-ba38-0035df2f3d0c\" (UID: \"06c6acd8-4187-4ca0-ba38-0035df2f3d0c\") "
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.669629 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vk2jw\" (UniqueName: \"kubernetes.io/projected/504f9845-df1d-48a7-badf-ea8ed99ff8a5-kube-api-access-vk2jw\") pod \"504f9845-df1d-48a7-badf-ea8ed99ff8a5\" (UID: \"504f9845-df1d-48a7-badf-ea8ed99ff8a5\") "
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.669670 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/504f9845-df1d-48a7-badf-ea8ed99ff8a5-operator-scripts\") pod \"504f9845-df1d-48a7-badf-ea8ed99ff8a5\" (UID: \"504f9845-df1d-48a7-badf-ea8ed99ff8a5\") "
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.670303 5184 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4e00da5-efff-455c-b1a5-63ce04f03c55-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.670328 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jt67w\" (UniqueName: \"kubernetes.io/projected/1200cd1d-62f0-4aeb-b1b7-bb4db488e3ef-kube-api-access-jt67w\") on node \"crc\" DevicePath \"\""
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.670339 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z7qwg\" (UniqueName: \"kubernetes.io/projected/c4e00da5-efff-455c-b1a5-63ce04f03c55-kube-api-access-z7qwg\") on node \"crc\" DevicePath \"\""
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.670352 5184 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1200cd1d-62f0-4aeb-b1b7-bb4db488e3ef-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.672299 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4884f23-d147-46ea-a562-1a772dbd1c21-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a4884f23-d147-46ea-a562-1a772dbd1c21" (UID: "a4884f23-d147-46ea-a562-1a772dbd1c21"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.673546 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/504f9845-df1d-48a7-badf-ea8ed99ff8a5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "504f9845-df1d-48a7-badf-ea8ed99ff8a5" (UID: "504f9845-df1d-48a7-badf-ea8ed99ff8a5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.674074 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06c6acd8-4187-4ca0-ba38-0035df2f3d0c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "06c6acd8-4187-4ca0-ba38-0035df2f3d0c" (UID: "06c6acd8-4187-4ca0-ba38-0035df2f3d0c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.677109 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/712022b6-f003-4d70-bb26-978c06c35480-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "712022b6-f003-4d70-bb26-978c06c35480" (UID: "712022b6-f003-4d70-bb26-978c06c35480"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.677575 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/712022b6-f003-4d70-bb26-978c06c35480-kube-api-access-4ds2l" (OuterVolumeSpecName: "kube-api-access-4ds2l") pod "712022b6-f003-4d70-bb26-978c06c35480" (UID: "712022b6-f003-4d70-bb26-978c06c35480"). InnerVolumeSpecName "kube-api-access-4ds2l". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.678506 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06c6acd8-4187-4ca0-ba38-0035df2f3d0c-kube-api-access-9z4v9" (OuterVolumeSpecName: "kube-api-access-9z4v9") pod "06c6acd8-4187-4ca0-ba38-0035df2f3d0c" (UID: "06c6acd8-4187-4ca0-ba38-0035df2f3d0c"). InnerVolumeSpecName "kube-api-access-9z4v9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.679541 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/504f9845-df1d-48a7-badf-ea8ed99ff8a5-kube-api-access-vk2jw" (OuterVolumeSpecName: "kube-api-access-vk2jw") pod "504f9845-df1d-48a7-badf-ea8ed99ff8a5" (UID: "504f9845-df1d-48a7-badf-ea8ed99ff8a5"). InnerVolumeSpecName "kube-api-access-vk2jw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.682426 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4884f23-d147-46ea-a562-1a772dbd1c21-kube-api-access-clgwf" (OuterVolumeSpecName: "kube-api-access-clgwf") pod "a4884f23-d147-46ea-a562-1a772dbd1c21" (UID: "a4884f23-d147-46ea-a562-1a772dbd1c21"). InnerVolumeSpecName "kube-api-access-clgwf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:09:49 crc kubenswrapper[5184]: E0312 17:09:49.758184 5184 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1870316 actualBytes=10240
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.772135 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4ds2l\" (UniqueName: \"kubernetes.io/projected/712022b6-f003-4d70-bb26-978c06c35480-kube-api-access-4ds2l\") on node \"crc\" DevicePath \"\""
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.772194 5184 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4884f23-d147-46ea-a562-1a772dbd1c21-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.772206 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-clgwf\" (UniqueName: \"kubernetes.io/projected/a4884f23-d147-46ea-a562-1a772dbd1c21-kube-api-access-clgwf\") on node \"crc\" DevicePath \"\""
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.772218 5184 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06c6acd8-4187-4ca0-ba38-0035df2f3d0c-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.772230 5184 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/712022b6-f003-4d70-bb26-978c06c35480-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.772241 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9z4v9\" (UniqueName: \"kubernetes.io/projected/06c6acd8-4187-4ca0-ba38-0035df2f3d0c-kube-api-access-9z4v9\") on node \"crc\" DevicePath \"\""
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.772253 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vk2jw\" (UniqueName: \"kubernetes.io/projected/504f9845-df1d-48a7-badf-ea8ed99ff8a5-kube-api-access-vk2jw\") on node \"crc\" DevicePath \"\""
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.772264 5184 reconciler_common.go:299] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/504f9845-df1d-48a7-badf-ea8ed99ff8a5-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 12 17:09:49 crc kubenswrapper[5184]: I0312 17:09:49.835632 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Mar 12 17:09:50 crc kubenswrapper[5184]: I0312 17:09:50.038981 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-26d7-account-create-update-ms2gh" event={"ID":"504f9845-df1d-48a7-badf-ea8ed99ff8a5","Type":"ContainerDied","Data":"adca65a45c0e8e85c40e5e0d658f5815de232c4fdf47463acfe7c19a97fce1d1"}
Mar 12 17:09:50 crc kubenswrapper[5184]: I0312 17:09:50.039026 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adca65a45c0e8e85c40e5e0d658f5815de232c4fdf47463acfe7c19a97fce1d1"
Mar 12 17:09:50 crc kubenswrapper[5184]: I0312 17:09:50.039125 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-26d7-account-create-update-ms2gh"
Mar 12 17:09:50 crc kubenswrapper[5184]: I0312 17:09:50.046319 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hmwnz" event={"ID":"a4884f23-d147-46ea-a562-1a772dbd1c21","Type":"ContainerDied","Data":"a9d34989e9aac211e813befc0020b796a763a7776fd6f9356a325dbea4fe8d30"}
Mar 12 17:09:50 crc kubenswrapper[5184]: I0312 17:09:50.046393 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9d34989e9aac211e813befc0020b796a763a7776fd6f9356a325dbea4fe8d30"
Mar 12 17:09:50 crc kubenswrapper[5184]: I0312 17:09:50.046481 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-hmwnz"
Mar 12 17:09:50 crc kubenswrapper[5184]: I0312 17:09:50.052591 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7e8d-account-create-update-bqsxc"
Mar 12 17:09:50 crc kubenswrapper[5184]: I0312 17:09:50.052606 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7e8d-account-create-update-bqsxc" event={"ID":"712022b6-f003-4d70-bb26-978c06c35480","Type":"ContainerDied","Data":"610479ae726d272fba41a2afe8d856ac3df44a4201ee7957f1d772b4fcda95e3"}
Mar 12 17:09:50 crc kubenswrapper[5184]: I0312 17:09:50.052635 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="610479ae726d272fba41a2afe8d856ac3df44a4201ee7957f1d772b4fcda95e3"
Mar 12 17:09:50 crc kubenswrapper[5184]: I0312 17:09:50.055210 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-152a-account-create-update-qpqrz" event={"ID":"c4e00da5-efff-455c-b1a5-63ce04f03c55","Type":"ContainerDied","Data":"20488757fe24723bce322d782861efac8054faf32ae75b49ddeec5518f4340bd"}
Mar 12 17:09:50 crc kubenswrapper[5184]: I0312 17:09:50.055257 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20488757fe24723bce322d782861efac8054faf32ae75b49ddeec5518f4340bd"
Mar 12 17:09:50 crc kubenswrapper[5184]: I0312 17:09:50.055310 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-152a-account-create-update-qpqrz"
Mar 12 17:09:50 crc kubenswrapper[5184]: I0312 17:09:50.063881 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-n4crd" event={"ID":"06c6acd8-4187-4ca0-ba38-0035df2f3d0c","Type":"ContainerDied","Data":"d7d2a4bc1482dc5f71b5f3ddb99c0987874a12d6747b811b33154e920994a870"}
Mar 12 17:09:50 crc kubenswrapper[5184]: I0312 17:09:50.063922 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7d2a4bc1482dc5f71b5f3ddb99c0987874a12d6747b811b33154e920994a870"
Mar 12 17:09:50 crc kubenswrapper[5184]: I0312 17:09:50.064001 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-n4crd"
Mar 12 17:09:50 crc kubenswrapper[5184]: I0312 17:09:50.068148 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-mlrgv" event={"ID":"1200cd1d-62f0-4aeb-b1b7-bb4db488e3ef","Type":"ContainerDied","Data":"f015cab2057eb1f9fc90e343c596aad81f265ec5f633421029f31dd97557a6d8"}
Mar 12 17:09:50 crc kubenswrapper[5184]: I0312 17:09:50.068192 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f015cab2057eb1f9fc90e343c596aad81f265ec5f633421029f31dd97557a6d8"
Mar 12 17:09:50 crc kubenswrapper[5184]: I0312 17:09:50.068287 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-mlrgv"
Mar 12 17:09:50 crc kubenswrapper[5184]: I0312 17:09:50.743016 5184 patch_prober.go:28] interesting pod/machine-config-daemon-cp7pt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 12 17:09:50 crc kubenswrapper[5184]: I0312 17:09:50.743403 5184 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 12 17:09:50 crc kubenswrapper[5184]: I0312 17:09:50.743472 5184 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt"
Mar 12 17:09:50 crc kubenswrapper[5184]: I0312 17:09:50.744601 5184 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e97f86449204164890c97bdd96ba2e452210b3be2c7fc1815ab56658e4653bed"} pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 12 17:09:50 crc kubenswrapper[5184]: I0312 17:09:50.749427 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerName="machine-config-daemon" containerID="cri-o://e97f86449204164890c97bdd96ba2e452210b3be2c7fc1815ab56658e4653bed" gracePeriod=600
Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.086972 5184 generic.go:358] "Generic (PLEG): container finished" podID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerID="e97f86449204164890c97bdd96ba2e452210b3be2c7fc1815ab56658e4653bed" exitCode=0
Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.087308 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" event={"ID":"7b45c859-3d05-4214-9bd3-2952546f5dea","Type":"ContainerDied","Data":"e97f86449204164890c97bdd96ba2e452210b3be2c7fc1815ab56658e4653bed"}
Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.087340 5184 scope.go:117] "RemoveContainer" containerID="3dd884d50ac06fbc873c6bc95140222a52ea0e09ed17b766f377daf94c2607fe"
Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.091704 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"592f368e-3812-49cf-9dbf-60c7b5d95a48","Type":"ContainerStarted","Data":"675268e2f75e910b61f528721aeb98093b0a8a3e2289f426f2296c782d7879ff"}
Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.091752 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"592f368e-3812-49cf-9dbf-60c7b5d95a48","Type":"ContainerStarted","Data":"b0fd85e2dfb2f4f7d4749f448e7a2917ce93ca64c802fae077c8264f15f9874a"}
Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.185911 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-n22b8"]
Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.187247 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c4e00da5-efff-455c-b1a5-63ce04f03c55" containerName="mariadb-account-create-update"
Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.187277 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4e00da5-efff-455c-b1a5-63ce04f03c55" containerName="mariadb-account-create-update"
Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.187300 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="504f9845-df1d-48a7-badf-ea8ed99ff8a5" containerName="mariadb-account-create-update"
Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.187309 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="504f9845-df1d-48a7-badf-ea8ed99ff8a5" containerName="mariadb-account-create-update"
Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.187333 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="712022b6-f003-4d70-bb26-978c06c35480" containerName="mariadb-account-create-update"
Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.187340 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="712022b6-f003-4d70-bb26-978c06c35480" containerName="mariadb-account-create-update"
Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.187353 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="06c6acd8-4187-4ca0-ba38-0035df2f3d0c" containerName="mariadb-database-create"
Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.187361 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="06c6acd8-4187-4ca0-ba38-0035df2f3d0c" containerName="mariadb-database-create"
Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.187407 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a4884f23-d147-46ea-a562-1a772dbd1c21" containerName="mariadb-database-create"
Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.187416 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4884f23-d147-46ea-a562-1a772dbd1c21" containerName="mariadb-database-create"
Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.187437 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1200cd1d-62f0-4aeb-b1b7-bb4db488e3ef" containerName="mariadb-database-create"
Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.187445 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="1200cd1d-62f0-4aeb-b1b7-bb4db488e3ef" containerName="mariadb-database-create"
Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.187657 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="06c6acd8-4187-4ca0-ba38-0035df2f3d0c" containerName="mariadb-database-create"
Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.187676 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="712022b6-f003-4d70-bb26-978c06c35480" containerName="mariadb-account-create-update"
Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.187691 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="c4e00da5-efff-455c-b1a5-63ce04f03c55" containerName="mariadb-account-create-update"
Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.187707 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="1200cd1d-62f0-4aeb-b1b7-bb4db488e3ef" containerName="mariadb-database-create"
Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.187724 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="a4884f23-d147-46ea-a562-1a772dbd1c21" containerName="mariadb-database-create"
Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.187737 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="504f9845-df1d-48a7-badf-ea8ed99ff8a5" containerName="mariadb-account-create-update"
Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.192088 5184 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-n22b8" Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.194884 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-n22b8"] Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.196932 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell0-conductor-scripts\"" Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.197147 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell0-conductor-config-data\"" Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.197353 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-nova-dockercfg-lwlqf\"" Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.306548 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a7a82d0-151a-40b3-86b4-79aff3a3b0be-scripts\") pod \"nova-cell0-conductor-db-sync-n22b8\" (UID: \"2a7a82d0-151a-40b3-86b4-79aff3a3b0be\") " pod="openstack/nova-cell0-conductor-db-sync-n22b8" Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.306608 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a7a82d0-151a-40b3-86b4-79aff3a3b0be-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-n22b8\" (UID: \"2a7a82d0-151a-40b3-86b4-79aff3a3b0be\") " pod="openstack/nova-cell0-conductor-db-sync-n22b8" Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.306658 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhdcg\" (UniqueName: \"kubernetes.io/projected/2a7a82d0-151a-40b3-86b4-79aff3a3b0be-kube-api-access-dhdcg\") pod \"nova-cell0-conductor-db-sync-n22b8\" (UID: 
\"2a7a82d0-151a-40b3-86b4-79aff3a3b0be\") " pod="openstack/nova-cell0-conductor-db-sync-n22b8" Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.306921 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a7a82d0-151a-40b3-86b4-79aff3a3b0be-config-data\") pod \"nova-cell0-conductor-db-sync-n22b8\" (UID: \"2a7a82d0-151a-40b3-86b4-79aff3a3b0be\") " pod="openstack/nova-cell0-conductor-db-sync-n22b8" Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.409256 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dhdcg\" (UniqueName: \"kubernetes.io/projected/2a7a82d0-151a-40b3-86b4-79aff3a3b0be-kube-api-access-dhdcg\") pod \"nova-cell0-conductor-db-sync-n22b8\" (UID: \"2a7a82d0-151a-40b3-86b4-79aff3a3b0be\") " pod="openstack/nova-cell0-conductor-db-sync-n22b8" Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.409365 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a7a82d0-151a-40b3-86b4-79aff3a3b0be-config-data\") pod \"nova-cell0-conductor-db-sync-n22b8\" (UID: \"2a7a82d0-151a-40b3-86b4-79aff3a3b0be\") " pod="openstack/nova-cell0-conductor-db-sync-n22b8" Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.409536 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a7a82d0-151a-40b3-86b4-79aff3a3b0be-scripts\") pod \"nova-cell0-conductor-db-sync-n22b8\" (UID: \"2a7a82d0-151a-40b3-86b4-79aff3a3b0be\") " pod="openstack/nova-cell0-conductor-db-sync-n22b8" Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.409589 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a7a82d0-151a-40b3-86b4-79aff3a3b0be-combined-ca-bundle\") pod 
\"nova-cell0-conductor-db-sync-n22b8\" (UID: \"2a7a82d0-151a-40b3-86b4-79aff3a3b0be\") " pod="openstack/nova-cell0-conductor-db-sync-n22b8" Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.416764 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a7a82d0-151a-40b3-86b4-79aff3a3b0be-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-n22b8\" (UID: \"2a7a82d0-151a-40b3-86b4-79aff3a3b0be\") " pod="openstack/nova-cell0-conductor-db-sync-n22b8" Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.416868 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a7a82d0-151a-40b3-86b4-79aff3a3b0be-config-data\") pod \"nova-cell0-conductor-db-sync-n22b8\" (UID: \"2a7a82d0-151a-40b3-86b4-79aff3a3b0be\") " pod="openstack/nova-cell0-conductor-db-sync-n22b8" Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.416907 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a7a82d0-151a-40b3-86b4-79aff3a3b0be-scripts\") pod \"nova-cell0-conductor-db-sync-n22b8\" (UID: \"2a7a82d0-151a-40b3-86b4-79aff3a3b0be\") " pod="openstack/nova-cell0-conductor-db-sync-n22b8" Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.430560 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhdcg\" (UniqueName: \"kubernetes.io/projected/2a7a82d0-151a-40b3-86b4-79aff3a3b0be-kube-api-access-dhdcg\") pod \"nova-cell0-conductor-db-sync-n22b8\" (UID: \"2a7a82d0-151a-40b3-86b4-79aff3a3b0be\") " pod="openstack/nova-cell0-conductor-db-sync-n22b8" Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.531254 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-n22b8" Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.927052 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-756587dd69-bfms9" Mar 12 17:09:51 crc kubenswrapper[5184]: I0312 17:09:51.934950 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-756587dd69-bfms9" Mar 12 17:09:52 crc kubenswrapper[5184]: I0312 17:09:52.116929 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" event={"ID":"7b45c859-3d05-4214-9bd3-2952546f5dea","Type":"ContainerStarted","Data":"42ed46ee5dbf0d27675a5969e00cdc1d30283a154524122596ea10898f42720f"} Mar 12 17:09:52 crc kubenswrapper[5184]: I0312 17:09:52.122740 5184 generic.go:358] "Generic (PLEG): container finished" podID="246d17d3-b07a-4fe4-8165-711bcd72517f" containerID="7ec0a142a102ad955ebc109ed409ce67b8b6a9fd3b5631ad22fc17d1ab354718" exitCode=137 Mar 12 17:09:52 crc kubenswrapper[5184]: I0312 17:09:52.122894 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"246d17d3-b07a-4fe4-8165-711bcd72517f","Type":"ContainerDied","Data":"7ec0a142a102ad955ebc109ed409ce67b8b6a9fd3b5631ad22fc17d1ab354718"} Mar 12 17:09:52 crc kubenswrapper[5184]: I0312 17:09:52.628714 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 12 17:09:52 crc kubenswrapper[5184]: I0312 17:09:52.628983 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4e21b41b-8457-4bae-b2f8-fd29ea43334a" containerName="glance-log" containerID="cri-o://e5d8d20981f57a472f11e6505aa691fbe2ff3ee1edb7636736bd11c49d37d8b1" gracePeriod=30 Mar 12 17:09:52 crc kubenswrapper[5184]: I0312 17:09:52.629083 5184 kuberuntime_container.go:858] "Killing container with a grace period" 
pod="openstack/glance-default-external-api-0" podUID="4e21b41b-8457-4bae-b2f8-fd29ea43334a" containerName="glance-httpd" containerID="cri-o://dbb55fb8a743e8dff6d965deea47c49db6ab448f11a8c0b6d42c46024a99beaa" gracePeriod=30 Mar 12 17:09:53 crc kubenswrapper[5184]: I0312 17:09:53.134046 5184 generic.go:358] "Generic (PLEG): container finished" podID="4e21b41b-8457-4bae-b2f8-fd29ea43334a" containerID="e5d8d20981f57a472f11e6505aa691fbe2ff3ee1edb7636736bd11c49d37d8b1" exitCode=143 Mar 12 17:09:53 crc kubenswrapper[5184]: I0312 17:09:53.134152 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4e21b41b-8457-4bae-b2f8-fd29ea43334a","Type":"ContainerDied","Data":"e5d8d20981f57a472f11e6505aa691fbe2ff3ee1edb7636736bd11c49d37d8b1"} Mar 12 17:09:53 crc kubenswrapper[5184]: I0312 17:09:53.137015 5184 generic.go:358] "Generic (PLEG): container finished" podID="ccf562d2-6ce1-4eb6-b27e-679493ce3870" containerID="881e288299526eeda5e8bb60032448ab57c594785adad11983b46f43c6e06ae3" exitCode=137 Mar 12 17:09:53 crc kubenswrapper[5184]: I0312 17:09:53.137092 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-859ddbd78-2m2xk" event={"ID":"ccf562d2-6ce1-4eb6-b27e-679493ce3870","Type":"ContainerDied","Data":"881e288299526eeda5e8bb60032448ab57c594785adad11983b46f43c6e06ae3"} Mar 12 17:09:54 crc kubenswrapper[5184]: I0312 17:09:54.761209 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 12 17:09:54 crc kubenswrapper[5184]: I0312 17:09:54.761974 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2803560e-0c2e-4d2f-9e3a-76fe1cc629c0" containerName="glance-log" containerID="cri-o://18266ffb33d0bae79da2c0ef8d203cdbb98ba1e2876ad73a5200b2db877237a4" gracePeriod=30 Mar 12 17:09:54 crc kubenswrapper[5184]: I0312 17:09:54.762083 5184 kuberuntime_container.go:858] "Killing container 
with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2803560e-0c2e-4d2f-9e3a-76fe1cc629c0" containerName="glance-httpd" containerID="cri-o://45116000968cef2b42520a056c24889472f5edb9976e94851ac0e2090c7465e3" gracePeriod=30 Mar 12 17:09:54 crc kubenswrapper[5184]: I0312 17:09:54.898242 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 17:09:55 crc kubenswrapper[5184]: I0312 17:09:55.161744 5184 generic.go:358] "Generic (PLEG): container finished" podID="2803560e-0c2e-4d2f-9e3a-76fe1cc629c0" containerID="18266ffb33d0bae79da2c0ef8d203cdbb98ba1e2876ad73a5200b2db877237a4" exitCode=143 Mar 12 17:09:55 crc kubenswrapper[5184]: I0312 17:09:55.161792 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0","Type":"ContainerDied","Data":"18266ffb33d0bae79da2c0ef8d203cdbb98ba1e2876ad73a5200b2db877237a4"} Mar 12 17:09:56 crc kubenswrapper[5184]: I0312 17:09:56.175496 5184 generic.go:358] "Generic (PLEG): container finished" podID="4e21b41b-8457-4bae-b2f8-fd29ea43334a" containerID="dbb55fb8a743e8dff6d965deea47c49db6ab448f11a8c0b6d42c46024a99beaa" exitCode=0 Mar 12 17:09:56 crc kubenswrapper[5184]: I0312 17:09:56.175716 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4e21b41b-8457-4bae-b2f8-fd29ea43334a","Type":"ContainerDied","Data":"dbb55fb8a743e8dff6d965deea47c49db6ab448f11a8c0b6d42c46024a99beaa"} Mar 12 17:09:57 crc kubenswrapper[5184]: I0312 17:09:57.833335 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 12 17:09:57 crc kubenswrapper[5184]: I0312 17:09:57.853080 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcnq2\" (UniqueName: \"kubernetes.io/projected/246d17d3-b07a-4fe4-8165-711bcd72517f-kube-api-access-rcnq2\") pod \"246d17d3-b07a-4fe4-8165-711bcd72517f\" (UID: \"246d17d3-b07a-4fe4-8165-711bcd72517f\") " Mar 12 17:09:57 crc kubenswrapper[5184]: I0312 17:09:57.853188 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/246d17d3-b07a-4fe4-8165-711bcd72517f-etc-machine-id\") pod \"246d17d3-b07a-4fe4-8165-711bcd72517f\" (UID: \"246d17d3-b07a-4fe4-8165-711bcd72517f\") " Mar 12 17:09:57 crc kubenswrapper[5184]: I0312 17:09:57.853274 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/246d17d3-b07a-4fe4-8165-711bcd72517f-config-data-custom\") pod \"246d17d3-b07a-4fe4-8165-711bcd72517f\" (UID: \"246d17d3-b07a-4fe4-8165-711bcd72517f\") " Mar 12 17:09:57 crc kubenswrapper[5184]: I0312 17:09:57.853331 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/246d17d3-b07a-4fe4-8165-711bcd72517f-scripts\") pod \"246d17d3-b07a-4fe4-8165-711bcd72517f\" (UID: \"246d17d3-b07a-4fe4-8165-711bcd72517f\") " Mar 12 17:09:57 crc kubenswrapper[5184]: I0312 17:09:57.853477 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/246d17d3-b07a-4fe4-8165-711bcd72517f-logs\") pod \"246d17d3-b07a-4fe4-8165-711bcd72517f\" (UID: \"246d17d3-b07a-4fe4-8165-711bcd72517f\") " Mar 12 17:09:57 crc kubenswrapper[5184]: I0312 17:09:57.853534 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/246d17d3-b07a-4fe4-8165-711bcd72517f-config-data\") pod \"246d17d3-b07a-4fe4-8165-711bcd72517f\" (UID: \"246d17d3-b07a-4fe4-8165-711bcd72517f\") " Mar 12 17:09:57 crc kubenswrapper[5184]: I0312 17:09:57.853581 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/246d17d3-b07a-4fe4-8165-711bcd72517f-combined-ca-bundle\") pod \"246d17d3-b07a-4fe4-8165-711bcd72517f\" (UID: \"246d17d3-b07a-4fe4-8165-711bcd72517f\") " Mar 12 17:09:57 crc kubenswrapper[5184]: I0312 17:09:57.854635 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/246d17d3-b07a-4fe4-8165-711bcd72517f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "246d17d3-b07a-4fe4-8165-711bcd72517f" (UID: "246d17d3-b07a-4fe4-8165-711bcd72517f"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 12 17:09:57 crc kubenswrapper[5184]: I0312 17:09:57.855254 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/246d17d3-b07a-4fe4-8165-711bcd72517f-logs" (OuterVolumeSpecName: "logs") pod "246d17d3-b07a-4fe4-8165-711bcd72517f" (UID: "246d17d3-b07a-4fe4-8165-711bcd72517f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:09:57 crc kubenswrapper[5184]: I0312 17:09:57.867650 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/246d17d3-b07a-4fe4-8165-711bcd72517f-scripts" (OuterVolumeSpecName: "scripts") pod "246d17d3-b07a-4fe4-8165-711bcd72517f" (UID: "246d17d3-b07a-4fe4-8165-711bcd72517f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:57 crc kubenswrapper[5184]: I0312 17:09:57.871727 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/246d17d3-b07a-4fe4-8165-711bcd72517f-kube-api-access-rcnq2" (OuterVolumeSpecName: "kube-api-access-rcnq2") pod "246d17d3-b07a-4fe4-8165-711bcd72517f" (UID: "246d17d3-b07a-4fe4-8165-711bcd72517f"). InnerVolumeSpecName "kube-api-access-rcnq2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:09:57 crc kubenswrapper[5184]: I0312 17:09:57.885118 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/246d17d3-b07a-4fe4-8165-711bcd72517f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "246d17d3-b07a-4fe4-8165-711bcd72517f" (UID: "246d17d3-b07a-4fe4-8165-711bcd72517f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:57 crc kubenswrapper[5184]: I0312 17:09:57.921406 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/246d17d3-b07a-4fe4-8165-711bcd72517f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "246d17d3-b07a-4fe4-8165-711bcd72517f" (UID: "246d17d3-b07a-4fe4-8165-711bcd72517f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:57 crc kubenswrapper[5184]: I0312 17:09:57.939653 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/246d17d3-b07a-4fe4-8165-711bcd72517f-config-data" (OuterVolumeSpecName: "config-data") pod "246d17d3-b07a-4fe4-8165-711bcd72517f" (UID: "246d17d3-b07a-4fe4-8165-711bcd72517f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:57 crc kubenswrapper[5184]: I0312 17:09:57.955742 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rcnq2\" (UniqueName: \"kubernetes.io/projected/246d17d3-b07a-4fe4-8165-711bcd72517f-kube-api-access-rcnq2\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:57 crc kubenswrapper[5184]: I0312 17:09:57.955789 5184 reconciler_common.go:299] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/246d17d3-b07a-4fe4-8165-711bcd72517f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:57 crc kubenswrapper[5184]: I0312 17:09:57.955803 5184 reconciler_common.go:299] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/246d17d3-b07a-4fe4-8165-711bcd72517f-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:57 crc kubenswrapper[5184]: I0312 17:09:57.955813 5184 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/246d17d3-b07a-4fe4-8165-711bcd72517f-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:57 crc kubenswrapper[5184]: I0312 17:09:57.955824 5184 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/246d17d3-b07a-4fe4-8165-711bcd72517f-logs\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:57 crc kubenswrapper[5184]: I0312 17:09:57.955835 5184 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/246d17d3-b07a-4fe4-8165-711bcd72517f-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:57 crc kubenswrapper[5184]: I0312 17:09:57.955847 5184 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/246d17d3-b07a-4fe4-8165-711bcd72517f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.130755 5184 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-859ddbd78-2m2xk" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.159621 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccf562d2-6ce1-4eb6-b27e-679493ce3870-combined-ca-bundle\") pod \"ccf562d2-6ce1-4eb6-b27e-679493ce3870\" (UID: \"ccf562d2-6ce1-4eb6-b27e-679493ce3870\") " Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.159673 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ccf562d2-6ce1-4eb6-b27e-679493ce3870-config-data\") pod \"ccf562d2-6ce1-4eb6-b27e-679493ce3870\" (UID: \"ccf562d2-6ce1-4eb6-b27e-679493ce3870\") " Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.160246 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ccf562d2-6ce1-4eb6-b27e-679493ce3870-horizon-secret-key\") pod \"ccf562d2-6ce1-4eb6-b27e-679493ce3870\" (UID: \"ccf562d2-6ce1-4eb6-b27e-679493ce3870\") " Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.160322 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7t49p\" (UniqueName: \"kubernetes.io/projected/ccf562d2-6ce1-4eb6-b27e-679493ce3870-kube-api-access-7t49p\") pod \"ccf562d2-6ce1-4eb6-b27e-679493ce3870\" (UID: \"ccf562d2-6ce1-4eb6-b27e-679493ce3870\") " Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.160409 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccf562d2-6ce1-4eb6-b27e-679493ce3870-horizon-tls-certs\") pod \"ccf562d2-6ce1-4eb6-b27e-679493ce3870\" (UID: \"ccf562d2-6ce1-4eb6-b27e-679493ce3870\") " Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.160899 5184 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccf562d2-6ce1-4eb6-b27e-679493ce3870-logs\") pod \"ccf562d2-6ce1-4eb6-b27e-679493ce3870\" (UID: \"ccf562d2-6ce1-4eb6-b27e-679493ce3870\") " Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.160986 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ccf562d2-6ce1-4eb6-b27e-679493ce3870-scripts\") pod \"ccf562d2-6ce1-4eb6-b27e-679493ce3870\" (UID: \"ccf562d2-6ce1-4eb6-b27e-679493ce3870\") " Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.163385 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccf562d2-6ce1-4eb6-b27e-679493ce3870-logs" (OuterVolumeSpecName: "logs") pod "ccf562d2-6ce1-4eb6-b27e-679493ce3870" (UID: "ccf562d2-6ce1-4eb6-b27e-679493ce3870"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.165582 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccf562d2-6ce1-4eb6-b27e-679493ce3870-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "ccf562d2-6ce1-4eb6-b27e-679493ce3870" (UID: "ccf562d2-6ce1-4eb6-b27e-679493ce3870"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.182642 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccf562d2-6ce1-4eb6-b27e-679493ce3870-kube-api-access-7t49p" (OuterVolumeSpecName: "kube-api-access-7t49p") pod "ccf562d2-6ce1-4eb6-b27e-679493ce3870" (UID: "ccf562d2-6ce1-4eb6-b27e-679493ce3870"). InnerVolumeSpecName "kube-api-access-7t49p". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.199366 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccf562d2-6ce1-4eb6-b27e-679493ce3870-scripts" (OuterVolumeSpecName: "scripts") pod "ccf562d2-6ce1-4eb6-b27e-679493ce3870" (UID: "ccf562d2-6ce1-4eb6-b27e-679493ce3870"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.212485 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"246d17d3-b07a-4fe4-8165-711bcd72517f","Type":"ContainerDied","Data":"c2bc4fa4369b96c3b7ec978f87793e86b1dce86caad031faa407b88f5f1f60f3"} Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.212550 5184 scope.go:117] "RemoveContainer" containerID="7ec0a142a102ad955ebc109ed409ce67b8b6a9fd3b5631ad22fc17d1ab354718" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.212716 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.212836 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccf562d2-6ce1-4eb6-b27e-679493ce3870-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ccf562d2-6ce1-4eb6-b27e-679493ce3870" (UID: "ccf562d2-6ce1-4eb6-b27e-679493ce3870"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.222023 5184 generic.go:358] "Generic (PLEG): container finished" podID="2803560e-0c2e-4d2f-9e3a-76fe1cc629c0" containerID="45116000968cef2b42520a056c24889472f5edb9976e94851ac0e2090c7465e3" exitCode=0 Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.222189 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0","Type":"ContainerDied","Data":"45116000968cef2b42520a056c24889472f5edb9976e94851ac0e2090c7465e3"} Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.224255 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccf562d2-6ce1-4eb6-b27e-679493ce3870-config-data" (OuterVolumeSpecName: "config-data") pod "ccf562d2-6ce1-4eb6-b27e-679493ce3870" (UID: "ccf562d2-6ce1-4eb6-b27e-679493ce3870"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.224697 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-859ddbd78-2m2xk" event={"ID":"ccf562d2-6ce1-4eb6-b27e-679493ce3870","Type":"ContainerDied","Data":"6ae4351238b6fdee8112497a216a6ee7335c267ada156a2017b6669139243873"} Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.224775 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-859ddbd78-2m2xk" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.235169 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.270825 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccf562d2-6ce1-4eb6-b27e-679493ce3870-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "ccf562d2-6ce1-4eb6-b27e-679493ce3870" (UID: "ccf562d2-6ce1-4eb6-b27e-679493ce3870"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.275605 5184 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccf562d2-6ce1-4eb6-b27e-679493ce3870-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.275654 5184 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ccf562d2-6ce1-4eb6-b27e-679493ce3870-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.275663 5184 reconciler_common.go:299] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ccf562d2-6ce1-4eb6-b27e-679493ce3870-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.275675 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7t49p\" (UniqueName: \"kubernetes.io/projected/ccf562d2-6ce1-4eb6-b27e-679493ce3870-kube-api-access-7t49p\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.275685 5184 reconciler_common.go:299] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccf562d2-6ce1-4eb6-b27e-679493ce3870-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.275693 5184 reconciler_common.go:299] "Volume detached for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccf562d2-6ce1-4eb6-b27e-679493ce3870-logs\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.275701 5184 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ccf562d2-6ce1-4eb6-b27e-679493ce3870-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.313273 5184 scope.go:117] "RemoveContainer" containerID="5fdf1d7ebac6c679e3f972505eb5eb9e67aa6da80703bae62de7ecf931c1400d" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.375864 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.376407 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhb6t\" (UniqueName: \"kubernetes.io/projected/4e21b41b-8457-4bae-b2f8-fd29ea43334a-kube-api-access-jhb6t\") pod \"4e21b41b-8457-4bae-b2f8-fd29ea43334a\" (UID: \"4e21b41b-8457-4bae-b2f8-fd29ea43334a\") " Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.376497 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e21b41b-8457-4bae-b2f8-fd29ea43334a-config-data\") pod \"4e21b41b-8457-4bae-b2f8-fd29ea43334a\" (UID: \"4e21b41b-8457-4bae-b2f8-fd29ea43334a\") " Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.376682 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e21b41b-8457-4bae-b2f8-fd29ea43334a-scripts\") pod \"4e21b41b-8457-4bae-b2f8-fd29ea43334a\" (UID: \"4e21b41b-8457-4bae-b2f8-fd29ea43334a\") " Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.376727 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/4e21b41b-8457-4bae-b2f8-fd29ea43334a-httpd-run\") pod \"4e21b41b-8457-4bae-b2f8-fd29ea43334a\" (UID: \"4e21b41b-8457-4bae-b2f8-fd29ea43334a\") " Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.376766 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e21b41b-8457-4bae-b2f8-fd29ea43334a-logs\") pod \"4e21b41b-8457-4bae-b2f8-fd29ea43334a\" (UID: \"4e21b41b-8457-4bae-b2f8-fd29ea43334a\") " Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.376834 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e21b41b-8457-4bae-b2f8-fd29ea43334a-combined-ca-bundle\") pod \"4e21b41b-8457-4bae-b2f8-fd29ea43334a\" (UID: \"4e21b41b-8457-4bae-b2f8-fd29ea43334a\") " Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.376873 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"4e21b41b-8457-4bae-b2f8-fd29ea43334a\" (UID: \"4e21b41b-8457-4bae-b2f8-fd29ea43334a\") " Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.376900 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e21b41b-8457-4bae-b2f8-fd29ea43334a-public-tls-certs\") pod \"4e21b41b-8457-4bae-b2f8-fd29ea43334a\" (UID: \"4e21b41b-8457-4bae-b2f8-fd29ea43334a\") " Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.377788 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e21b41b-8457-4bae-b2f8-fd29ea43334a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4e21b41b-8457-4bae-b2f8-fd29ea43334a" (UID: "4e21b41b-8457-4bae-b2f8-fd29ea43334a"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.381210 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e21b41b-8457-4bae-b2f8-fd29ea43334a-logs" (OuterVolumeSpecName: "logs") pod "4e21b41b-8457-4bae-b2f8-fd29ea43334a" (UID: "4e21b41b-8457-4bae-b2f8-fd29ea43334a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.393237 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e21b41b-8457-4bae-b2f8-fd29ea43334a-kube-api-access-jhb6t" (OuterVolumeSpecName: "kube-api-access-jhb6t") pod "4e21b41b-8457-4bae-b2f8-fd29ea43334a" (UID: "4e21b41b-8457-4bae-b2f8-fd29ea43334a"). InnerVolumeSpecName "kube-api-access-jhb6t". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.397773 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "4e21b41b-8457-4bae-b2f8-fd29ea43334a" (UID: "4e21b41b-8457-4bae-b2f8-fd29ea43334a"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGIDValue "" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.420388 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e21b41b-8457-4bae-b2f8-fd29ea43334a-scripts" (OuterVolumeSpecName: "scripts") pod "4e21b41b-8457-4bae-b2f8-fd29ea43334a" (UID: "4e21b41b-8457-4bae-b2f8-fd29ea43334a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.469275 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.480371 5184 reconciler_common.go:292] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.480498 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jhb6t\" (UniqueName: \"kubernetes.io/projected/4e21b41b-8457-4bae-b2f8-fd29ea43334a-kube-api-access-jhb6t\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.480509 5184 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e21b41b-8457-4bae-b2f8-fd29ea43334a-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.480517 5184 reconciler_common.go:299] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4e21b41b-8457-4bae-b2f8-fd29ea43334a-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.480527 5184 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e21b41b-8457-4bae-b2f8-fd29ea43334a-logs\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.501806 5184 scope.go:117] "RemoveContainer" containerID="1b15bf9717411285614dfab9d07e6784fa5f11a28ed47e3b1c3e31c203d181c9" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.519399 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.524264 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="246d17d3-b07a-4fe4-8165-711bcd72517f" containerName="cinder-api" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.524394 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="246d17d3-b07a-4fe4-8165-711bcd72517f" containerName="cinder-api" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.524413 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e21b41b-8457-4bae-b2f8-fd29ea43334a" containerName="glance-log" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.524418 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e21b41b-8457-4bae-b2f8-fd29ea43334a" containerName="glance-log" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.524547 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ccf562d2-6ce1-4eb6-b27e-679493ce3870" containerName="horizon" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.524554 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccf562d2-6ce1-4eb6-b27e-679493ce3870" containerName="horizon" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.524570 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="246d17d3-b07a-4fe4-8165-711bcd72517f" containerName="cinder-api-log" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.524576 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="246d17d3-b07a-4fe4-8165-711bcd72517f" containerName="cinder-api-log" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.524585 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ccf562d2-6ce1-4eb6-b27e-679493ce3870" containerName="horizon-log" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.524590 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccf562d2-6ce1-4eb6-b27e-679493ce3870" containerName="horizon-log" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.524705 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="4e21b41b-8457-4bae-b2f8-fd29ea43334a" containerName="glance-httpd" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.524711 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e21b41b-8457-4bae-b2f8-fd29ea43334a" containerName="glance-httpd" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.525498 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e21b41b-8457-4bae-b2f8-fd29ea43334a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4e21b41b-8457-4bae-b2f8-fd29ea43334a" (UID: "4e21b41b-8457-4bae-b2f8-fd29ea43334a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.526199 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="4e21b41b-8457-4bae-b2f8-fd29ea43334a" containerName="glance-httpd" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.526215 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="ccf562d2-6ce1-4eb6-b27e-679493ce3870" containerName="horizon-log" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.526223 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="246d17d3-b07a-4fe4-8165-711bcd72517f" containerName="cinder-api" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.526357 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="ccf562d2-6ce1-4eb6-b27e-679493ce3870" containerName="horizon" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.526368 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="4e21b41b-8457-4bae-b2f8-fd29ea43334a" containerName="glance-log" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.526405 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="246d17d3-b07a-4fe4-8165-711bcd72517f" containerName="cinder-api-log" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.604438 5184 
reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e21b41b-8457-4bae-b2f8-fd29ea43334a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.616900 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.617780 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.621912 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-cinder-internal-svc\"" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.624236 5184 operation_generator.go:895] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.625805 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-cinder-public-svc\"" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.629010 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cinder-api-config-data\"" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.714810 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb\") " pod="openstack/cinder-api-0" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.715061 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb-config-data\") pod \"cinder-api-0\" (UID: 
\"c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb\") " pod="openstack/cinder-api-0" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.715205 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb\") " pod="openstack/cinder-api-0" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.715371 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb\") " pod="openstack/cinder-api-0" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.715586 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66htm\" (UniqueName: \"kubernetes.io/projected/c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb-kube-api-access-66htm\") pod \"cinder-api-0\" (UID: \"c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb\") " pod="openstack/cinder-api-0" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.715740 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb-scripts\") pod \"cinder-api-0\" (UID: \"c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb\") " pod="openstack/cinder-api-0" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.716114 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb-config-data-custom\") pod \"cinder-api-0\" (UID: \"c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb\") " pod="openstack/cinder-api-0" Mar 12 17:09:58 crc 
kubenswrapper[5184]: I0312 17:09:58.716209 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb-logs\") pod \"cinder-api-0\" (UID: \"c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb\") " pod="openstack/cinder-api-0" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.716290 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb\") " pod="openstack/cinder-api-0" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.716470 5184 reconciler_common.go:299] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.730675 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e21b41b-8457-4bae-b2f8-fd29ea43334a-config-data" (OuterVolumeSpecName: "config-data") pod "4e21b41b-8457-4bae-b2f8-fd29ea43334a" (UID: "4e21b41b-8457-4bae-b2f8-fd29ea43334a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.730709 5184 scope.go:117] "RemoveContainer" containerID="881e288299526eeda5e8bb60032448ab57c594785adad11983b46f43c6e06ae3" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.730741 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e21b41b-8457-4bae-b2f8-fd29ea43334a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4e21b41b-8457-4bae-b2f8-fd29ea43334a" (UID: "4e21b41b-8457-4bae-b2f8-fd29ea43334a"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.746256 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/horizon-859ddbd78-2m2xk"] Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.753339 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-859ddbd78-2m2xk"] Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.818284 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb-scripts\") pod \"cinder-api-0\" (UID: \"c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb\") " pod="openstack/cinder-api-0" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.818561 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb-config-data-custom\") pod \"cinder-api-0\" (UID: \"c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb\") " pod="openstack/cinder-api-0" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.818675 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb-logs\") pod \"cinder-api-0\" (UID: \"c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb\") " pod="openstack/cinder-api-0" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.818757 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb\") " pod="openstack/cinder-api-0" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.818882 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb\") " pod="openstack/cinder-api-0" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.818969 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb-config-data\") pod \"cinder-api-0\" (UID: \"c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb\") " pod="openstack/cinder-api-0" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.819060 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb\") " pod="openstack/cinder-api-0" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.819162 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb\") " pod="openstack/cinder-api-0" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.819288 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-66htm\" (UniqueName: \"kubernetes.io/projected/c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb-kube-api-access-66htm\") pod \"cinder-api-0\" (UID: \"c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb\") " pod="openstack/cinder-api-0" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.819428 5184 reconciler_common.go:299] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e21b41b-8457-4bae-b2f8-fd29ea43334a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.819502 5184 reconciler_common.go:299] "Volume detached 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e21b41b-8457-4bae-b2f8-fd29ea43334a-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.824966 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb-config-data-custom\") pod \"cinder-api-0\" (UID: \"c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb\") " pod="openstack/cinder-api-0" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.825290 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb-logs\") pod \"cinder-api-0\" (UID: \"c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb\") " pod="openstack/cinder-api-0" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.841616 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb\") " pod="openstack/cinder-api-0" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.848408 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb\") " pod="openstack/cinder-api-0" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.848709 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb\") " pod="openstack/cinder-api-0" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.850294 5184 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb\") " pod="openstack/cinder-api-0" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.854423 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb-scripts\") pod \"cinder-api-0\" (UID: \"c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb\") " pod="openstack/cinder-api-0" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.856862 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb-config-data\") pod \"cinder-api-0\" (UID: \"c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb\") " pod="openstack/cinder-api-0" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.862117 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-66htm\" (UniqueName: \"kubernetes.io/projected/c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb-kube-api-access-66htm\") pod \"cinder-api-0\" (UID: \"c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb\") " pod="openstack/cinder-api-0" Mar 12 17:09:58 crc kubenswrapper[5184]: W0312 17:09:58.896815 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a7a82d0_151a_40b3_86b4_79aff3a3b0be.slice/crio-d426490afa44bf22b0150a67e9f648e0151f34760602c63db56389dd83df25b9 WatchSource:0}: Error finding container d426490afa44bf22b0150a67e9f648e0151f34760602c63db56389dd83df25b9: Status 404 returned error can't find the container with id d426490afa44bf22b0150a67e9f648e0151f34760602c63db56389dd83df25b9 Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.901146 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-n22b8"] Mar 12 
17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.938702 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 12 17:09:58 crc kubenswrapper[5184]: I0312 17:09:58.940737 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.021914 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0\" (UID: \"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0\") " Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.022256 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2803560e-0c2e-4d2f-9e3a-76fe1cc629c0-config-data\") pod \"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0\" (UID: \"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0\") " Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.022433 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2803560e-0c2e-4d2f-9e3a-76fe1cc629c0-httpd-run\") pod \"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0\" (UID: \"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0\") " Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.022508 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2803560e-0c2e-4d2f-9e3a-76fe1cc629c0-logs\") pod \"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0\" (UID: \"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0\") " Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.022609 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2803560e-0c2e-4d2f-9e3a-76fe1cc629c0-combined-ca-bundle\") pod 
\"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0\" (UID: \"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0\") " Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.022741 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2803560e-0c2e-4d2f-9e3a-76fe1cc629c0-internal-tls-certs\") pod \"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0\" (UID: \"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0\") " Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.022843 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh2fs\" (UniqueName: \"kubernetes.io/projected/2803560e-0c2e-4d2f-9e3a-76fe1cc629c0-kube-api-access-nh2fs\") pod \"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0\" (UID: \"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0\") " Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.022941 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2803560e-0c2e-4d2f-9e3a-76fe1cc629c0-scripts\") pod \"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0\" (UID: \"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0\") " Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.023212 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2803560e-0c2e-4d2f-9e3a-76fe1cc629c0-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2803560e-0c2e-4d2f-9e3a-76fe1cc629c0" (UID: "2803560e-0c2e-4d2f-9e3a-76fe1cc629c0"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.023596 5184 reconciler_common.go:299] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2803560e-0c2e-4d2f-9e3a-76fe1cc629c0-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.023595 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2803560e-0c2e-4d2f-9e3a-76fe1cc629c0-logs" (OuterVolumeSpecName: "logs") pod "2803560e-0c2e-4d2f-9e3a-76fe1cc629c0" (UID: "2803560e-0c2e-4d2f-9e3a-76fe1cc629c0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.026955 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2803560e-0c2e-4d2f-9e3a-76fe1cc629c0-kube-api-access-nh2fs" (OuterVolumeSpecName: "kube-api-access-nh2fs") pod "2803560e-0c2e-4d2f-9e3a-76fe1cc629c0" (UID: "2803560e-0c2e-4d2f-9e3a-76fe1cc629c0"). InnerVolumeSpecName "kube-api-access-nh2fs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.029649 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2803560e-0c2e-4d2f-9e3a-76fe1cc629c0-scripts" (OuterVolumeSpecName: "scripts") pod "2803560e-0c2e-4d2f-9e3a-76fe1cc629c0" (UID: "2803560e-0c2e-4d2f-9e3a-76fe1cc629c0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.029953 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "2803560e-0c2e-4d2f-9e3a-76fe1cc629c0" (UID: "2803560e-0c2e-4d2f-9e3a-76fe1cc629c0"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGIDValue "" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.087069 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2803560e-0c2e-4d2f-9e3a-76fe1cc629c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2803560e-0c2e-4d2f-9e3a-76fe1cc629c0" (UID: "2803560e-0c2e-4d2f-9e3a-76fe1cc629c0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.102581 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2803560e-0c2e-4d2f-9e3a-76fe1cc629c0-config-data" (OuterVolumeSpecName: "config-data") pod "2803560e-0c2e-4d2f-9e3a-76fe1cc629c0" (UID: "2803560e-0c2e-4d2f-9e3a-76fe1cc629c0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.102649 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2803560e-0c2e-4d2f-9e3a-76fe1cc629c0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2803560e-0c2e-4d2f-9e3a-76fe1cc629c0" (UID: "2803560e-0c2e-4d2f-9e3a-76fe1cc629c0"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.124947 5184 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2803560e-0c2e-4d2f-9e3a-76fe1cc629c0-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.125006 5184 reconciler_common.go:292] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.125020 5184 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2803560e-0c2e-4d2f-9e3a-76fe1cc629c0-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.125029 5184 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2803560e-0c2e-4d2f-9e3a-76fe1cc629c0-logs\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.125037 5184 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2803560e-0c2e-4d2f-9e3a-76fe1cc629c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.125047 5184 reconciler_common.go:299] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2803560e-0c2e-4d2f-9e3a-76fe1cc629c0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.125055 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nh2fs\" (UniqueName: \"kubernetes.io/projected/2803560e-0c2e-4d2f-9e3a-76fe1cc629c0-kube-api-access-nh2fs\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.149912 5184 operation_generator.go:895] 
UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.226191 5184 reconciler_common.go:299] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.233751 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"592f368e-3812-49cf-9dbf-60c7b5d95a48","Type":"ContainerStarted","Data":"deeba073fb066d813322cf3b6cf33314631f25b69c9378a10b547890618d0336"} Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.236716 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.236735 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2803560e-0c2e-4d2f-9e3a-76fe1cc629c0","Type":"ContainerDied","Data":"43c61f4aefd4d16bc7578ab5dbebf27ad6a77294d523115e63431684d035e390"} Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.236772 5184 scope.go:117] "RemoveContainer" containerID="45116000968cef2b42520a056c24889472f5edb9976e94851ac0e2090c7465e3" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.243555 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4e21b41b-8457-4bae-b2f8-fd29ea43334a","Type":"ContainerDied","Data":"02a52ccd6bdc6a772b9a1149b471146571553e9f7d66b1c58e6f9e7eb5a1d43c"} Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.243758 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.248416 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-n22b8" event={"ID":"2a7a82d0-151a-40b3-86b4-79aff3a3b0be","Type":"ContainerStarted","Data":"d426490afa44bf22b0150a67e9f648e0151f34760602c63db56389dd83df25b9"} Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.250935 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"3c0d941e-36d1-4112-8488-a27d08ec0a8b","Type":"ContainerStarted","Data":"394df7f53d2c7f560c7d619b9fcffee3e57faecd1c40b8aea9095adca1ee471c"} Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.260621 5184 scope.go:117] "RemoveContainer" containerID="18266ffb33d0bae79da2c0ef8d203cdbb98ba1e2876ad73a5200b2db877237a4" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.279621 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.055642112 podStartE2EDuration="16.2795947s" podCreationTimestamp="2026-03-12 17:09:43 +0000 UTC" firstStartedPulling="2026-03-12 17:09:43.946449536 +0000 UTC m=+1126.487760875" lastFinishedPulling="2026-03-12 17:09:58.170402124 +0000 UTC m=+1140.711713463" observedRunningTime="2026-03-12 17:09:59.268714528 +0000 UTC m=+1141.810025877" watchObservedRunningTime="2026-03-12 17:09:59.2795947 +0000 UTC m=+1141.820906039" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.296311 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.301952 5184 scope.go:117] "RemoveContainer" containerID="dbb55fb8a743e8dff6d965deea47c49db6ab448f11a8c0b6d42c46024a99beaa" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.304546 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 12 
17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.314570 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.323164 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.324825 5184 scope.go:117] "RemoveContainer" containerID="e5d8d20981f57a472f11e6505aa691fbe2ff3ee1edb7636736bd11c49d37d8b1" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.333116 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.334117 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2803560e-0c2e-4d2f-9e3a-76fe1cc629c0" containerName="glance-httpd" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.334136 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="2803560e-0c2e-4d2f-9e3a-76fe1cc629c0" containerName="glance-httpd" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.334187 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2803560e-0c2e-4d2f-9e3a-76fe1cc629c0" containerName="glance-log" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.334194 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="2803560e-0c2e-4d2f-9e3a-76fe1cc629c0" containerName="glance-log" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.334387 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="2803560e-0c2e-4d2f-9e3a-76fe1cc629c0" containerName="glance-log" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.334402 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="2803560e-0c2e-4d2f-9e3a-76fe1cc629c0" containerName="glance-httpd" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.344148 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.348615 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"glance-scripts\"" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.348654 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-glance-default-internal-svc\"" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.348742 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"glance-default-internal-config-data\"" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.349012 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"glance-glance-dockercfg-kvq4j\"" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.369445 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.378306 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.384978 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"glance-default-external-config-data\"" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.385434 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-glance-default-public-svc\"" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.409190 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.427083 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.428907 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/79cef421-d05c-4274-8c92-337f4b818bff-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"79cef421-d05c-4274-8c92-337f4b818bff\") " pod="openstack/glance-default-external-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.429027 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"79cef421-d05c-4274-8c92-337f4b818bff\") " pod="openstack/glance-default-external-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.429078 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f0e3724-623c-4453-8b13-623e9daf508d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6f0e3724-623c-4453-8b13-623e9daf508d\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:09:59 crc 
kubenswrapper[5184]: I0312 17:09:59.429144 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"6f0e3724-623c-4453-8b13-623e9daf508d\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.429164 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f0e3724-623c-4453-8b13-623e9daf508d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6f0e3724-623c-4453-8b13-623e9daf508d\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.429187 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f0e3724-623c-4453-8b13-623e9daf508d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6f0e3724-623c-4453-8b13-623e9daf508d\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.429264 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/79cef421-d05c-4274-8c92-337f4b818bff-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"79cef421-d05c-4274-8c92-337f4b818bff\") " pod="openstack/glance-default-external-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.429285 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f0e3724-623c-4453-8b13-623e9daf508d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6f0e3724-623c-4453-8b13-623e9daf508d\") " pod="openstack/glance-default-internal-api-0" 
Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.429326 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79cef421-d05c-4274-8c92-337f4b818bff-logs\") pod \"glance-default-external-api-0\" (UID: \"79cef421-d05c-4274-8c92-337f4b818bff\") " pod="openstack/glance-default-external-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.429390 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79cef421-d05c-4274-8c92-337f4b818bff-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"79cef421-d05c-4274-8c92-337f4b818bff\") " pod="openstack/glance-default-external-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.429466 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4rg7\" (UniqueName: \"kubernetes.io/projected/6f0e3724-623c-4453-8b13-623e9daf508d-kube-api-access-k4rg7\") pod \"glance-default-internal-api-0\" (UID: \"6f0e3724-623c-4453-8b13-623e9daf508d\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.429545 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79cef421-d05c-4274-8c92-337f4b818bff-scripts\") pod \"glance-default-external-api-0\" (UID: \"79cef421-d05c-4274-8c92-337f4b818bff\") " pod="openstack/glance-default-external-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.429577 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f0e3724-623c-4453-8b13-623e9daf508d-logs\") pod \"glance-default-internal-api-0\" (UID: \"6f0e3724-623c-4453-8b13-623e9daf508d\") " 
pod="openstack/glance-default-internal-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.429605 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79cef421-d05c-4274-8c92-337f4b818bff-config-data\") pod \"glance-default-external-api-0\" (UID: \"79cef421-d05c-4274-8c92-337f4b818bff\") " pod="openstack/glance-default-external-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.429633 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rznrw\" (UniqueName: \"kubernetes.io/projected/79cef421-d05c-4274-8c92-337f4b818bff-kube-api-access-rznrw\") pod \"glance-default-external-api-0\" (UID: \"79cef421-d05c-4274-8c92-337f4b818bff\") " pod="openstack/glance-default-external-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.429660 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6f0e3724-623c-4453-8b13-623e9daf508d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6f0e3724-623c-4453-8b13-623e9daf508d\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.458306 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.530914 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"6f0e3724-623c-4453-8b13-623e9daf508d\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.530957 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6f0e3724-623c-4453-8b13-623e9daf508d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6f0e3724-623c-4453-8b13-623e9daf508d\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.530979 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f0e3724-623c-4453-8b13-623e9daf508d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6f0e3724-623c-4453-8b13-623e9daf508d\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.531016 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/79cef421-d05c-4274-8c92-337f4b818bff-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"79cef421-d05c-4274-8c92-337f4b818bff\") " pod="openstack/glance-default-external-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.531035 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f0e3724-623c-4453-8b13-623e9daf508d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6f0e3724-623c-4453-8b13-623e9daf508d\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.531056 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79cef421-d05c-4274-8c92-337f4b818bff-logs\") pod \"glance-default-external-api-0\" (UID: \"79cef421-d05c-4274-8c92-337f4b818bff\") " pod="openstack/glance-default-external-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.531084 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/79cef421-d05c-4274-8c92-337f4b818bff-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"79cef421-d05c-4274-8c92-337f4b818bff\") " pod="openstack/glance-default-external-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.531118 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4rg7\" (UniqueName: \"kubernetes.io/projected/6f0e3724-623c-4453-8b13-623e9daf508d-kube-api-access-k4rg7\") pod \"glance-default-internal-api-0\" (UID: \"6f0e3724-623c-4453-8b13-623e9daf508d\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.531155 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79cef421-d05c-4274-8c92-337f4b818bff-scripts\") pod \"glance-default-external-api-0\" (UID: \"79cef421-d05c-4274-8c92-337f4b818bff\") " pod="openstack/glance-default-external-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.531178 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f0e3724-623c-4453-8b13-623e9daf508d-logs\") pod \"glance-default-internal-api-0\" (UID: \"6f0e3724-623c-4453-8b13-623e9daf508d\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.531198 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79cef421-d05c-4274-8c92-337f4b818bff-config-data\") pod \"glance-default-external-api-0\" (UID: \"79cef421-d05c-4274-8c92-337f4b818bff\") " pod="openstack/glance-default-external-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.531218 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rznrw\" (UniqueName: 
\"kubernetes.io/projected/79cef421-d05c-4274-8c92-337f4b818bff-kube-api-access-rznrw\") pod \"glance-default-external-api-0\" (UID: \"79cef421-d05c-4274-8c92-337f4b818bff\") " pod="openstack/glance-default-external-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.531240 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6f0e3724-623c-4453-8b13-623e9daf508d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6f0e3724-623c-4453-8b13-623e9daf508d\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.531284 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/79cef421-d05c-4274-8c92-337f4b818bff-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"79cef421-d05c-4274-8c92-337f4b818bff\") " pod="openstack/glance-default-external-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.531316 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"79cef421-d05c-4274-8c92-337f4b818bff\") " pod="openstack/glance-default-external-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.531345 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f0e3724-623c-4453-8b13-623e9daf508d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6f0e3724-623c-4453-8b13-623e9daf508d\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.534045 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79cef421-d05c-4274-8c92-337f4b818bff-logs\") pod 
\"glance-default-external-api-0\" (UID: \"79cef421-d05c-4274-8c92-337f4b818bff\") " pod="openstack/glance-default-external-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.535139 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/79cef421-d05c-4274-8c92-337f4b818bff-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"79cef421-d05c-4274-8c92-337f4b818bff\") " pod="openstack/glance-default-external-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.535476 5184 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"6f0e3724-623c-4453-8b13-623e9daf508d\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.536157 5184 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"79cef421-d05c-4274-8c92-337f4b818bff\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.536648 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f0e3724-623c-4453-8b13-623e9daf508d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6f0e3724-623c-4453-8b13-623e9daf508d\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.536666 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f0e3724-623c-4453-8b13-623e9daf508d-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"6f0e3724-623c-4453-8b13-623e9daf508d\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.537004 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f0e3724-623c-4453-8b13-623e9daf508d-logs\") pod \"glance-default-internal-api-0\" (UID: \"6f0e3724-623c-4453-8b13-623e9daf508d\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.537330 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/79cef421-d05c-4274-8c92-337f4b818bff-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"79cef421-d05c-4274-8c92-337f4b818bff\") " pod="openstack/glance-default-external-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.537603 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f0e3724-623c-4453-8b13-623e9daf508d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6f0e3724-623c-4453-8b13-623e9daf508d\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.538211 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79cef421-d05c-4274-8c92-337f4b818bff-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"79cef421-d05c-4274-8c92-337f4b818bff\") " pod="openstack/glance-default-external-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.546098 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79cef421-d05c-4274-8c92-337f4b818bff-scripts\") pod \"glance-default-external-api-0\" (UID: \"79cef421-d05c-4274-8c92-337f4b818bff\") " pod="openstack/glance-default-external-api-0" Mar 12 17:09:59 crc 
kubenswrapper[5184]: I0312 17:09:59.550908 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6f0e3724-623c-4453-8b13-623e9daf508d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6f0e3724-623c-4453-8b13-623e9daf508d\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.552072 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79cef421-d05c-4274-8c92-337f4b818bff-config-data\") pod \"glance-default-external-api-0\" (UID: \"79cef421-d05c-4274-8c92-337f4b818bff\") " pod="openstack/glance-default-external-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.552073 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f0e3724-623c-4453-8b13-623e9daf508d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6f0e3724-623c-4453-8b13-623e9daf508d\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.555901 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rznrw\" (UniqueName: \"kubernetes.io/projected/79cef421-d05c-4274-8c92-337f4b818bff-kube-api-access-rznrw\") pod \"glance-default-external-api-0\" (UID: \"79cef421-d05c-4274-8c92-337f4b818bff\") " pod="openstack/glance-default-external-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.576537 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4rg7\" (UniqueName: \"kubernetes.io/projected/6f0e3724-623c-4453-8b13-623e9daf508d-kube-api-access-k4rg7\") pod \"glance-default-internal-api-0\" (UID: \"6f0e3724-623c-4453-8b13-623e9daf508d\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.588174 5184 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"79cef421-d05c-4274-8c92-337f4b818bff\") " pod="openstack/glance-default-external-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.588684 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"6f0e3724-623c-4453-8b13-623e9daf508d\") " pod="openstack/glance-default-internal-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.688994 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 12 17:09:59 crc kubenswrapper[5184]: I0312 17:09:59.702765 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 12 17:10:00 crc kubenswrapper[5184]: I0312 17:10:00.147831 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555590-r6pk7"] Mar 12 17:10:00 crc kubenswrapper[5184]: I0312 17:10:00.170220 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555590-r6pk7"] Mar 12 17:10:00 crc kubenswrapper[5184]: I0312 17:10:00.170327 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555590-r6pk7" Mar 12 17:10:00 crc kubenswrapper[5184]: I0312 17:10:00.179547 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\"" Mar 12 17:10:00 crc kubenswrapper[5184]: I0312 17:10:00.179692 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-f4gpz\"" Mar 12 17:10:00 crc kubenswrapper[5184]: I0312 17:10:00.179914 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\"" Mar 12 17:10:00 crc kubenswrapper[5184]: I0312 17:10:00.267527 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq5sf\" (UniqueName: \"kubernetes.io/projected/65b7c2ee-47aa-47cb-9360-432c7da6513c-kube-api-access-bq5sf\") pod \"auto-csr-approver-29555590-r6pk7\" (UID: \"65b7c2ee-47aa-47cb-9360-432c7da6513c\") " pod="openshift-infra/auto-csr-approver-29555590-r6pk7" Mar 12 17:10:00 crc kubenswrapper[5184]: I0312 17:10:00.319499 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 12 17:10:00 crc kubenswrapper[5184]: I0312 17:10:00.328927 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb","Type":"ContainerStarted","Data":"3c0070e8d576cc18b4ad2b8550fe281262919a08ecd400c445db688a97717828"} Mar 12 17:10:00 crc kubenswrapper[5184]: W0312 17:10:00.332090 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79cef421_d05c_4274_8c92_337f4b818bff.slice/crio-76f7c67006741ef89572f1ad2ebd7983c7c8711935dd9cc4c1a3e7cdcf1872c9 WatchSource:0}: Error finding container 76f7c67006741ef89572f1ad2ebd7983c7c8711935dd9cc4c1a3e7cdcf1872c9: Status 404 returned 
error can't find the container with id 76f7c67006741ef89572f1ad2ebd7983c7c8711935dd9cc4c1a3e7cdcf1872c9 Mar 12 17:10:00 crc kubenswrapper[5184]: I0312 17:10:00.370518 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bq5sf\" (UniqueName: \"kubernetes.io/projected/65b7c2ee-47aa-47cb-9360-432c7da6513c-kube-api-access-bq5sf\") pod \"auto-csr-approver-29555590-r6pk7\" (UID: \"65b7c2ee-47aa-47cb-9360-432c7da6513c\") " pod="openshift-infra/auto-csr-approver-29555590-r6pk7" Mar 12 17:10:00 crc kubenswrapper[5184]: I0312 17:10:00.390436 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq5sf\" (UniqueName: \"kubernetes.io/projected/65b7c2ee-47aa-47cb-9360-432c7da6513c-kube-api-access-bq5sf\") pod \"auto-csr-approver-29555590-r6pk7\" (UID: \"65b7c2ee-47aa-47cb-9360-432c7da6513c\") " pod="openshift-infra/auto-csr-approver-29555590-r6pk7" Mar 12 17:10:00 crc kubenswrapper[5184]: I0312 17:10:00.422261 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="246d17d3-b07a-4fe4-8165-711bcd72517f" path="/var/lib/kubelet/pods/246d17d3-b07a-4fe4-8165-711bcd72517f/volumes" Mar 12 17:10:00 crc kubenswrapper[5184]: I0312 17:10:00.423295 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2803560e-0c2e-4d2f-9e3a-76fe1cc629c0" path="/var/lib/kubelet/pods/2803560e-0c2e-4d2f-9e3a-76fe1cc629c0/volumes" Mar 12 17:10:00 crc kubenswrapper[5184]: I0312 17:10:00.424068 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e21b41b-8457-4bae-b2f8-fd29ea43334a" path="/var/lib/kubelet/pods/4e21b41b-8457-4bae-b2f8-fd29ea43334a/volumes" Mar 12 17:10:00 crc kubenswrapper[5184]: I0312 17:10:00.425843 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccf562d2-6ce1-4eb6-b27e-679493ce3870" path="/var/lib/kubelet/pods/ccf562d2-6ce1-4eb6-b27e-679493ce3870/volumes" Mar 12 17:10:00 crc kubenswrapper[5184]: I0312 17:10:00.426676 5184 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 12 17:10:00 crc kubenswrapper[5184]: I0312 17:10:00.500723 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555590-r6pk7" Mar 12 17:10:01 crc kubenswrapper[5184]: I0312 17:10:01.031962 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555590-r6pk7"] Mar 12 17:10:01 crc kubenswrapper[5184]: I0312 17:10:01.240538 5184 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="246d17d3-b07a-4fe4-8165-711bcd72517f" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.167:8776/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 12 17:10:01 crc kubenswrapper[5184]: I0312 17:10:01.348337 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555590-r6pk7" event={"ID":"65b7c2ee-47aa-47cb-9360-432c7da6513c","Type":"ContainerStarted","Data":"ebac5a4370087f35b0ad3d9ddc765d407b088b9c749b2202a3e3441d612ca35a"} Mar 12 17:10:01 crc kubenswrapper[5184]: I0312 17:10:01.365798 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6f0e3724-623c-4453-8b13-623e9daf508d","Type":"ContainerStarted","Data":"67b93bd475f7f101a175af406544bda1bd340e886fda837569a678782e9e7b33"} Mar 12 17:10:01 crc kubenswrapper[5184]: I0312 17:10:01.374006 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"592f368e-3812-49cf-9dbf-60c7b5d95a48","Type":"ContainerStarted","Data":"4687c868dcc72a507bf2dca1e7168c07e98842bc213c55d7e35d62194545da54"} Mar 12 17:10:01 crc kubenswrapper[5184]: I0312 17:10:01.374156 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="592f368e-3812-49cf-9dbf-60c7b5d95a48" 
containerName="ceilometer-central-agent" containerID="cri-o://b0fd85e2dfb2f4f7d4749f448e7a2917ce93ca64c802fae077c8264f15f9874a" gracePeriod=30 Mar 12 17:10:01 crc kubenswrapper[5184]: I0312 17:10:01.374179 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/ceilometer-0" Mar 12 17:10:01 crc kubenswrapper[5184]: I0312 17:10:01.374304 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="592f368e-3812-49cf-9dbf-60c7b5d95a48" containerName="proxy-httpd" containerID="cri-o://4687c868dcc72a507bf2dca1e7168c07e98842bc213c55d7e35d62194545da54" gracePeriod=30 Mar 12 17:10:01 crc kubenswrapper[5184]: I0312 17:10:01.374343 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="592f368e-3812-49cf-9dbf-60c7b5d95a48" containerName="sg-core" containerID="cri-o://deeba073fb066d813322cf3b6cf33314631f25b69c9378a10b547890618d0336" gracePeriod=30 Mar 12 17:10:01 crc kubenswrapper[5184]: I0312 17:10:01.374406 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="592f368e-3812-49cf-9dbf-60c7b5d95a48" containerName="ceilometer-notification-agent" containerID="cri-o://675268e2f75e910b61f528721aeb98093b0a8a3e2289f426f2296c782d7879ff" gracePeriod=30 Mar 12 17:10:01 crc kubenswrapper[5184]: I0312 17:10:01.390279 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"79cef421-d05c-4274-8c92-337f4b818bff","Type":"ContainerStarted","Data":"76f7c67006741ef89572f1ad2ebd7983c7c8711935dd9cc4c1a3e7cdcf1872c9"} Mar 12 17:10:01 crc kubenswrapper[5184]: I0312 17:10:01.681560 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7949fc945c-bns65" Mar 12 17:10:01 crc kubenswrapper[5184]: I0312 17:10:01.714744 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ceilometer-0" podStartSLOduration=2.066959401 podStartE2EDuration="13.714705859s" podCreationTimestamp="2026-03-12 17:09:48 +0000 UTC" firstStartedPulling="2026-03-12 17:09:48.997685138 +0000 UTC m=+1131.538996497" lastFinishedPulling="2026-03-12 17:10:00.645431616 +0000 UTC m=+1143.186742955" observedRunningTime="2026-03-12 17:10:01.398952402 +0000 UTC m=+1143.940263741" watchObservedRunningTime="2026-03-12 17:10:01.714705859 +0000 UTC m=+1144.256017198" Mar 12 17:10:01 crc kubenswrapper[5184]: I0312 17:10:01.758397 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/neutron-869b7dc84d-67g2c"] Mar 12 17:10:01 crc kubenswrapper[5184]: I0312 17:10:01.760009 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/neutron-869b7dc84d-67g2c" podUID="5c4af3c8-3189-41d2-9709-336561190b17" containerName="neutron-api" containerID="cri-o://98cb3d5f89afecd999fa979da011c3f50f1c506fdb104ae79338eee3eb7a8bbb" gracePeriod=30 Mar 12 17:10:01 crc kubenswrapper[5184]: I0312 17:10:01.760477 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/neutron-869b7dc84d-67g2c" podUID="5c4af3c8-3189-41d2-9709-336561190b17" containerName="neutron-httpd" containerID="cri-o://2f85ea6d5b15c9b9f11ee07607c123482b5a31b6e87f37b2cf8ccf74c187539d" gracePeriod=30 Mar 12 17:10:02 crc kubenswrapper[5184]: I0312 17:10:02.430603 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6f0e3724-623c-4453-8b13-623e9daf508d","Type":"ContainerStarted","Data":"54a4305097073b5debfeadd4a9be06f73ca1001b45c7adbe21a7579aa315e623"} Mar 12 17:10:02 crc kubenswrapper[5184]: I0312 17:10:02.436690 5184 generic.go:358] "Generic (PLEG): container finished" podID="592f368e-3812-49cf-9dbf-60c7b5d95a48" containerID="4687c868dcc72a507bf2dca1e7168c07e98842bc213c55d7e35d62194545da54" exitCode=0 Mar 12 17:10:02 crc kubenswrapper[5184]: I0312 17:10:02.436718 5184 
generic.go:358] "Generic (PLEG): container finished" podID="592f368e-3812-49cf-9dbf-60c7b5d95a48" containerID="deeba073fb066d813322cf3b6cf33314631f25b69c9378a10b547890618d0336" exitCode=2 Mar 12 17:10:02 crc kubenswrapper[5184]: I0312 17:10:02.436725 5184 generic.go:358] "Generic (PLEG): container finished" podID="592f368e-3812-49cf-9dbf-60c7b5d95a48" containerID="675268e2f75e910b61f528721aeb98093b0a8a3e2289f426f2296c782d7879ff" exitCode=0 Mar 12 17:10:02 crc kubenswrapper[5184]: I0312 17:10:02.436732 5184 generic.go:358] "Generic (PLEG): container finished" podID="592f368e-3812-49cf-9dbf-60c7b5d95a48" containerID="b0fd85e2dfb2f4f7d4749f448e7a2917ce93ca64c802fae077c8264f15f9874a" exitCode=0 Mar 12 17:10:02 crc kubenswrapper[5184]: I0312 17:10:02.436902 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"592f368e-3812-49cf-9dbf-60c7b5d95a48","Type":"ContainerDied","Data":"4687c868dcc72a507bf2dca1e7168c07e98842bc213c55d7e35d62194545da54"} Mar 12 17:10:02 crc kubenswrapper[5184]: I0312 17:10:02.436924 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"592f368e-3812-49cf-9dbf-60c7b5d95a48","Type":"ContainerDied","Data":"deeba073fb066d813322cf3b6cf33314631f25b69c9378a10b547890618d0336"} Mar 12 17:10:02 crc kubenswrapper[5184]: I0312 17:10:02.436934 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"592f368e-3812-49cf-9dbf-60c7b5d95a48","Type":"ContainerDied","Data":"675268e2f75e910b61f528721aeb98093b0a8a3e2289f426f2296c782d7879ff"} Mar 12 17:10:02 crc kubenswrapper[5184]: I0312 17:10:02.436942 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"592f368e-3812-49cf-9dbf-60c7b5d95a48","Type":"ContainerDied","Data":"b0fd85e2dfb2f4f7d4749f448e7a2917ce93ca64c802fae077c8264f15f9874a"} Mar 12 17:10:02 crc kubenswrapper[5184]: I0312 17:10:02.444919 5184 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openstack/cinder-api-0" event={"ID":"c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb","Type":"ContainerStarted","Data":"da80a68f0e46f76a4e6ec7c3e6d564d779f8b26f5a1430c07045d56a4d6c9c5f"} Mar 12 17:10:02 crc kubenswrapper[5184]: I0312 17:10:02.448087 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 17:10:02 crc kubenswrapper[5184]: I0312 17:10:02.452361 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"79cef421-d05c-4274-8c92-337f4b818bff","Type":"ContainerStarted","Data":"97dc1694f31069f65ea8b4d3f77cb2565e6810a117a88933636cec51865c2a6a"} Mar 12 17:10:02 crc kubenswrapper[5184]: I0312 17:10:02.456729 5184 generic.go:358] "Generic (PLEG): container finished" podID="5c4af3c8-3189-41d2-9709-336561190b17" containerID="2f85ea6d5b15c9b9f11ee07607c123482b5a31b6e87f37b2cf8ccf74c187539d" exitCode=0 Mar 12 17:10:02 crc kubenswrapper[5184]: I0312 17:10:02.456770 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-869b7dc84d-67g2c" event={"ID":"5c4af3c8-3189-41d2-9709-336561190b17","Type":"ContainerDied","Data":"2f85ea6d5b15c9b9f11ee07607c123482b5a31b6e87f37b2cf8ccf74c187539d"} Mar 12 17:10:02 crc kubenswrapper[5184]: I0312 17:10:02.525978 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/592f368e-3812-49cf-9dbf-60c7b5d95a48-scripts\") pod \"592f368e-3812-49cf-9dbf-60c7b5d95a48\" (UID: \"592f368e-3812-49cf-9dbf-60c7b5d95a48\") " Mar 12 17:10:02 crc kubenswrapper[5184]: I0312 17:10:02.526076 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/592f368e-3812-49cf-9dbf-60c7b5d95a48-log-httpd\") pod \"592f368e-3812-49cf-9dbf-60c7b5d95a48\" (UID: \"592f368e-3812-49cf-9dbf-60c7b5d95a48\") " Mar 12 17:10:02 crc kubenswrapper[5184]: I0312 17:10:02.526106 5184 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/592f368e-3812-49cf-9dbf-60c7b5d95a48-run-httpd\") pod \"592f368e-3812-49cf-9dbf-60c7b5d95a48\" (UID: \"592f368e-3812-49cf-9dbf-60c7b5d95a48\") " Mar 12 17:10:02 crc kubenswrapper[5184]: I0312 17:10:02.526137 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/592f368e-3812-49cf-9dbf-60c7b5d95a48-config-data\") pod \"592f368e-3812-49cf-9dbf-60c7b5d95a48\" (UID: \"592f368e-3812-49cf-9dbf-60c7b5d95a48\") " Mar 12 17:10:02 crc kubenswrapper[5184]: I0312 17:10:02.526165 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzfxq\" (UniqueName: \"kubernetes.io/projected/592f368e-3812-49cf-9dbf-60c7b5d95a48-kube-api-access-bzfxq\") pod \"592f368e-3812-49cf-9dbf-60c7b5d95a48\" (UID: \"592f368e-3812-49cf-9dbf-60c7b5d95a48\") " Mar 12 17:10:02 crc kubenswrapper[5184]: I0312 17:10:02.526337 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/592f368e-3812-49cf-9dbf-60c7b5d95a48-combined-ca-bundle\") pod \"592f368e-3812-49cf-9dbf-60c7b5d95a48\" (UID: \"592f368e-3812-49cf-9dbf-60c7b5d95a48\") " Mar 12 17:10:02 crc kubenswrapper[5184]: I0312 17:10:02.526388 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/592f368e-3812-49cf-9dbf-60c7b5d95a48-sg-core-conf-yaml\") pod \"592f368e-3812-49cf-9dbf-60c7b5d95a48\" (UID: \"592f368e-3812-49cf-9dbf-60c7b5d95a48\") " Mar 12 17:10:02 crc kubenswrapper[5184]: I0312 17:10:02.533185 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/592f368e-3812-49cf-9dbf-60c7b5d95a48-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "592f368e-3812-49cf-9dbf-60c7b5d95a48" 
(UID: "592f368e-3812-49cf-9dbf-60c7b5d95a48"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:10:02 crc kubenswrapper[5184]: I0312 17:10:02.535903 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/592f368e-3812-49cf-9dbf-60c7b5d95a48-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "592f368e-3812-49cf-9dbf-60c7b5d95a48" (UID: "592f368e-3812-49cf-9dbf-60c7b5d95a48"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:10:02 crc kubenswrapper[5184]: I0312 17:10:02.542916 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/592f368e-3812-49cf-9dbf-60c7b5d95a48-scripts" (OuterVolumeSpecName: "scripts") pod "592f368e-3812-49cf-9dbf-60c7b5d95a48" (UID: "592f368e-3812-49cf-9dbf-60c7b5d95a48"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:10:02 crc kubenswrapper[5184]: I0312 17:10:02.547705 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/592f368e-3812-49cf-9dbf-60c7b5d95a48-kube-api-access-bzfxq" (OuterVolumeSpecName: "kube-api-access-bzfxq") pod "592f368e-3812-49cf-9dbf-60c7b5d95a48" (UID: "592f368e-3812-49cf-9dbf-60c7b5d95a48"). InnerVolumeSpecName "kube-api-access-bzfxq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:10:02 crc kubenswrapper[5184]: I0312 17:10:02.593869 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/592f368e-3812-49cf-9dbf-60c7b5d95a48-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "592f368e-3812-49cf-9dbf-60c7b5d95a48" (UID: "592f368e-3812-49cf-9dbf-60c7b5d95a48"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:10:02 crc kubenswrapper[5184]: I0312 17:10:02.629959 5184 reconciler_common.go:299] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/592f368e-3812-49cf-9dbf-60c7b5d95a48-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:02 crc kubenswrapper[5184]: I0312 17:10:02.629996 5184 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/592f368e-3812-49cf-9dbf-60c7b5d95a48-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:02 crc kubenswrapper[5184]: I0312 17:10:02.630015 5184 reconciler_common.go:299] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/592f368e-3812-49cf-9dbf-60c7b5d95a48-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:02 crc kubenswrapper[5184]: I0312 17:10:02.630024 5184 reconciler_common.go:299] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/592f368e-3812-49cf-9dbf-60c7b5d95a48-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:02 crc kubenswrapper[5184]: I0312 17:10:02.630033 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bzfxq\" (UniqueName: \"kubernetes.io/projected/592f368e-3812-49cf-9dbf-60c7b5d95a48-kube-api-access-bzfxq\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:02 crc kubenswrapper[5184]: I0312 17:10:02.660893 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/592f368e-3812-49cf-9dbf-60c7b5d95a48-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "592f368e-3812-49cf-9dbf-60c7b5d95a48" (UID: "592f368e-3812-49cf-9dbf-60c7b5d95a48"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:10:02 crc kubenswrapper[5184]: I0312 17:10:02.731542 5184 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/592f368e-3812-49cf-9dbf-60c7b5d95a48-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:02 crc kubenswrapper[5184]: I0312 17:10:02.765581 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/592f368e-3812-49cf-9dbf-60c7b5d95a48-config-data" (OuterVolumeSpecName: "config-data") pod "592f368e-3812-49cf-9dbf-60c7b5d95a48" (UID: "592f368e-3812-49cf-9dbf-60c7b5d95a48"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:10:02 crc kubenswrapper[5184]: I0312 17:10:02.834406 5184 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/592f368e-3812-49cf-9dbf-60c7b5d95a48-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.477195 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb","Type":"ContainerStarted","Data":"1b9205c3ac6cf8041cd4eee9c8b46ccbc3e60d4092cd513fe0246364f0400329"} Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.480260 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/cinder-api-0" Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.486327 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"79cef421-d05c-4274-8c92-337f4b818bff","Type":"ContainerStarted","Data":"a9528b4812ec609bae041aca5012c12ca66c4399dfeef6e6fe02f8cdcca43657"} Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.503850 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"6f0e3724-623c-4453-8b13-623e9daf508d","Type":"ContainerStarted","Data":"11827942d14667abc9cbfd2065bd7fa2ca04aabcf31f2a91a5d59ce1c4cb03c1"} Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.507931 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.507911237 podStartE2EDuration="5.507911237s" podCreationTimestamp="2026-03-12 17:09:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:10:03.496419837 +0000 UTC m=+1146.037731176" watchObservedRunningTime="2026-03-12 17:10:03.507911237 +0000 UTC m=+1146.049222576" Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.531474 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"592f368e-3812-49cf-9dbf-60c7b5d95a48","Type":"ContainerDied","Data":"a033c65dff19d3698c62121f1c05062807f602f5dc9e2aee29fdd72bc3e0d2c2"} Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.531532 5184 scope.go:117] "RemoveContainer" containerID="4687c868dcc72a507bf2dca1e7168c07e98842bc213c55d7e35d62194545da54" Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.531575 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.534013 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.533999917 podStartE2EDuration="4.533999917s" podCreationTimestamp="2026-03-12 17:09:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:10:03.526979617 +0000 UTC m=+1146.068290956" watchObservedRunningTime="2026-03-12 17:10:03.533999917 +0000 UTC m=+1146.075311256" Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.554169 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.55415198 podStartE2EDuration="4.55415198s" podCreationTimestamp="2026-03-12 17:09:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:10:03.553965764 +0000 UTC m=+1146.095277113" watchObservedRunningTime="2026-03-12 17:10:03.55415198 +0000 UTC m=+1146.095463319" Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.587916 5184 scope.go:117] "RemoveContainer" containerID="deeba073fb066d813322cf3b6cf33314631f25b69c9378a10b547890618d0336" Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.602464 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.633668 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.653787 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.657598 5184 scope.go:117] "RemoveContainer" containerID="675268e2f75e910b61f528721aeb98093b0a8a3e2289f426f2296c782d7879ff" Mar 12 
17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.683705 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="592f368e-3812-49cf-9dbf-60c7b5d95a48" containerName="ceilometer-central-agent" Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.683752 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="592f368e-3812-49cf-9dbf-60c7b5d95a48" containerName="ceilometer-central-agent" Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.683768 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="592f368e-3812-49cf-9dbf-60c7b5d95a48" containerName="ceilometer-notification-agent" Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.683779 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="592f368e-3812-49cf-9dbf-60c7b5d95a48" containerName="ceilometer-notification-agent" Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.683805 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="592f368e-3812-49cf-9dbf-60c7b5d95a48" containerName="proxy-httpd" Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.683813 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="592f368e-3812-49cf-9dbf-60c7b5d95a48" containerName="proxy-httpd" Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.683836 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="592f368e-3812-49cf-9dbf-60c7b5d95a48" containerName="sg-core" Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.683842 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="592f368e-3812-49cf-9dbf-60c7b5d95a48" containerName="sg-core" Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.684216 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="592f368e-3812-49cf-9dbf-60c7b5d95a48" containerName="ceilometer-central-agent" Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.684236 5184 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="592f368e-3812-49cf-9dbf-60c7b5d95a48" containerName="sg-core" Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.684248 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="592f368e-3812-49cf-9dbf-60c7b5d95a48" containerName="proxy-httpd" Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.684258 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="592f368e-3812-49cf-9dbf-60c7b5d95a48" containerName="ceilometer-notification-agent" Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.699926 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.700359 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.704040 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ceilometer-scripts\"" Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.704920 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ceilometer-config-data\"" Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.740935 5184 scope.go:117] "RemoveContainer" containerID="b0fd85e2dfb2f4f7d4749f448e7a2917ce93ca64c802fae077c8264f15f9874a" Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.859861 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa611206-15cc-42c0-9025-17c42d22ec36-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fa611206-15cc-42c0-9025-17c42d22ec36\") " pod="openstack/ceilometer-0" Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.859914 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa611206-15cc-42c0-9025-17c42d22ec36-config-data\") pod 
\"ceilometer-0\" (UID: \"fa611206-15cc-42c0-9025-17c42d22ec36\") " pod="openstack/ceilometer-0" Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.859942 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fa611206-15cc-42c0-9025-17c42d22ec36-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fa611206-15cc-42c0-9025-17c42d22ec36\") " pod="openstack/ceilometer-0" Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.859960 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa611206-15cc-42c0-9025-17c42d22ec36-run-httpd\") pod \"ceilometer-0\" (UID: \"fa611206-15cc-42c0-9025-17c42d22ec36\") " pod="openstack/ceilometer-0" Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.860123 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8m6w\" (UniqueName: \"kubernetes.io/projected/fa611206-15cc-42c0-9025-17c42d22ec36-kube-api-access-f8m6w\") pod \"ceilometer-0\" (UID: \"fa611206-15cc-42c0-9025-17c42d22ec36\") " pod="openstack/ceilometer-0" Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.860628 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa611206-15cc-42c0-9025-17c42d22ec36-log-httpd\") pod \"ceilometer-0\" (UID: \"fa611206-15cc-42c0-9025-17c42d22ec36\") " pod="openstack/ceilometer-0" Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.860726 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa611206-15cc-42c0-9025-17c42d22ec36-scripts\") pod \"ceilometer-0\" (UID: \"fa611206-15cc-42c0-9025-17c42d22ec36\") " pod="openstack/ceilometer-0" Mar 12 17:10:03 crc kubenswrapper[5184]: 
I0312 17:10:03.962073 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa611206-15cc-42c0-9025-17c42d22ec36-config-data\") pod \"ceilometer-0\" (UID: \"fa611206-15cc-42c0-9025-17c42d22ec36\") " pod="openstack/ceilometer-0" Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.962133 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fa611206-15cc-42c0-9025-17c42d22ec36-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fa611206-15cc-42c0-9025-17c42d22ec36\") " pod="openstack/ceilometer-0" Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.962154 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa611206-15cc-42c0-9025-17c42d22ec36-run-httpd\") pod \"ceilometer-0\" (UID: \"fa611206-15cc-42c0-9025-17c42d22ec36\") " pod="openstack/ceilometer-0" Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.962187 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f8m6w\" (UniqueName: \"kubernetes.io/projected/fa611206-15cc-42c0-9025-17c42d22ec36-kube-api-access-f8m6w\") pod \"ceilometer-0\" (UID: \"fa611206-15cc-42c0-9025-17c42d22ec36\") " pod="openstack/ceilometer-0" Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.962285 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa611206-15cc-42c0-9025-17c42d22ec36-log-httpd\") pod \"ceilometer-0\" (UID: \"fa611206-15cc-42c0-9025-17c42d22ec36\") " pod="openstack/ceilometer-0" Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.962312 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa611206-15cc-42c0-9025-17c42d22ec36-scripts\") pod \"ceilometer-0\" (UID: 
\"fa611206-15cc-42c0-9025-17c42d22ec36\") " pod="openstack/ceilometer-0" Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.962349 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa611206-15cc-42c0-9025-17c42d22ec36-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fa611206-15cc-42c0-9025-17c42d22ec36\") " pod="openstack/ceilometer-0" Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.963954 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa611206-15cc-42c0-9025-17c42d22ec36-log-httpd\") pod \"ceilometer-0\" (UID: \"fa611206-15cc-42c0-9025-17c42d22ec36\") " pod="openstack/ceilometer-0" Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.964364 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa611206-15cc-42c0-9025-17c42d22ec36-run-httpd\") pod \"ceilometer-0\" (UID: \"fa611206-15cc-42c0-9025-17c42d22ec36\") " pod="openstack/ceilometer-0" Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.969440 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa611206-15cc-42c0-9025-17c42d22ec36-config-data\") pod \"ceilometer-0\" (UID: \"fa611206-15cc-42c0-9025-17c42d22ec36\") " pod="openstack/ceilometer-0" Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.971054 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa611206-15cc-42c0-9025-17c42d22ec36-scripts\") pod \"ceilometer-0\" (UID: \"fa611206-15cc-42c0-9025-17c42d22ec36\") " pod="openstack/ceilometer-0" Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.971856 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/fa611206-15cc-42c0-9025-17c42d22ec36-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fa611206-15cc-42c0-9025-17c42d22ec36\") " pod="openstack/ceilometer-0" Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.985046 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa611206-15cc-42c0-9025-17c42d22ec36-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fa611206-15cc-42c0-9025-17c42d22ec36\") " pod="openstack/ceilometer-0" Mar 12 17:10:03 crc kubenswrapper[5184]: I0312 17:10:03.985726 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8m6w\" (UniqueName: \"kubernetes.io/projected/fa611206-15cc-42c0-9025-17c42d22ec36-kube-api-access-f8m6w\") pod \"ceilometer-0\" (UID: \"fa611206-15cc-42c0-9025-17c42d22ec36\") " pod="openstack/ceilometer-0" Mar 12 17:10:04 crc kubenswrapper[5184]: I0312 17:10:04.023268 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 12 17:10:04 crc kubenswrapper[5184]: I0312 17:10:04.413216 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="592f368e-3812-49cf-9dbf-60c7b5d95a48" path="/var/lib/kubelet/pods/592f368e-3812-49cf-9dbf-60c7b5d95a48/volumes" Mar 12 17:10:04 crc kubenswrapper[5184]: I0312 17:10:04.532575 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 17:10:04 crc kubenswrapper[5184]: W0312 17:10:04.537046 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa611206_15cc_42c0_9025_17c42d22ec36.slice/crio-85eb5e851809de2ac6b44f9c13fcb8b99158d0f0f3128aa21bd1c51545b921cf WatchSource:0}: Error finding container 85eb5e851809de2ac6b44f9c13fcb8b99158d0f0f3128aa21bd1c51545b921cf: Status 404 returned error can't find the container with id 85eb5e851809de2ac6b44f9c13fcb8b99158d0f0f3128aa21bd1c51545b921cf Mar 12 17:10:04 crc kubenswrapper[5184]: I0312 17:10:04.543981 5184 generic.go:358] "Generic (PLEG): container finished" podID="65b7c2ee-47aa-47cb-9360-432c7da6513c" containerID="e99c672f28b88118c683cf8842e8dc7a203447c475f4d5aa07ac60bf0d11aa55" exitCode=0 Mar 12 17:10:04 crc kubenswrapper[5184]: I0312 17:10:04.544285 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555590-r6pk7" event={"ID":"65b7c2ee-47aa-47cb-9360-432c7da6513c","Type":"ContainerDied","Data":"e99c672f28b88118c683cf8842e8dc7a203447c475f4d5aa07ac60bf0d11aa55"} Mar 12 17:10:05 crc kubenswrapper[5184]: I0312 17:10:05.602794 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa611206-15cc-42c0-9025-17c42d22ec36","Type":"ContainerStarted","Data":"49facc8e8c7372d9984aac5853546c161cbf7f1f7f6dc93f3f61d45cce3fb8c1"} Mar 12 17:10:05 crc kubenswrapper[5184]: I0312 17:10:05.603263 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"fa611206-15cc-42c0-9025-17c42d22ec36","Type":"ContainerStarted","Data":"85eb5e851809de2ac6b44f9c13fcb8b99158d0f0f3128aa21bd1c51545b921cf"} Mar 12 17:10:05 crc kubenswrapper[5184]: I0312 17:10:05.835985 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 17:10:09 crc kubenswrapper[5184]: I0312 17:10:09.689275 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 12 17:10:09 crc kubenswrapper[5184]: I0312 17:10:09.689876 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 12 17:10:09 crc kubenswrapper[5184]: I0312 17:10:09.703729 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 12 17:10:09 crc kubenswrapper[5184]: I0312 17:10:09.703770 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 12 17:10:09 crc kubenswrapper[5184]: I0312 17:10:09.739891 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 12 17:10:09 crc kubenswrapper[5184]: I0312 17:10:09.751145 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 12 17:10:09 crc kubenswrapper[5184]: I0312 17:10:09.751250 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 12 17:10:09 crc kubenswrapper[5184]: I0312 17:10:09.764852 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 12 17:10:10 crc kubenswrapper[5184]: I0312 17:10:10.561355 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555590-r6pk7" Mar 12 17:10:10 crc kubenswrapper[5184]: I0312 17:10:10.615352 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bq5sf\" (UniqueName: \"kubernetes.io/projected/65b7c2ee-47aa-47cb-9360-432c7da6513c-kube-api-access-bq5sf\") pod \"65b7c2ee-47aa-47cb-9360-432c7da6513c\" (UID: \"65b7c2ee-47aa-47cb-9360-432c7da6513c\") " Mar 12 17:10:10 crc kubenswrapper[5184]: I0312 17:10:10.634343 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65b7c2ee-47aa-47cb-9360-432c7da6513c-kube-api-access-bq5sf" (OuterVolumeSpecName: "kube-api-access-bq5sf") pod "65b7c2ee-47aa-47cb-9360-432c7da6513c" (UID: "65b7c2ee-47aa-47cb-9360-432c7da6513c"). InnerVolumeSpecName "kube-api-access-bq5sf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:10:10 crc kubenswrapper[5184]: I0312 17:10:10.667759 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555590-r6pk7" event={"ID":"65b7c2ee-47aa-47cb-9360-432c7da6513c","Type":"ContainerDied","Data":"ebac5a4370087f35b0ad3d9ddc765d407b088b9c749b2202a3e3441d612ca35a"} Mar 12 17:10:10 crc kubenswrapper[5184]: I0312 17:10:10.667866 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebac5a4370087f35b0ad3d9ddc765d407b088b9c749b2202a3e3441d612ca35a" Mar 12 17:10:10 crc kubenswrapper[5184]: I0312 17:10:10.667875 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555590-r6pk7" Mar 12 17:10:10 crc kubenswrapper[5184]: I0312 17:10:10.668582 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/glance-default-external-api-0" Mar 12 17:10:10 crc kubenswrapper[5184]: I0312 17:10:10.668635 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/glance-default-internal-api-0" Mar 12 17:10:10 crc kubenswrapper[5184]: I0312 17:10:10.668652 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/glance-default-internal-api-0" Mar 12 17:10:10 crc kubenswrapper[5184]: I0312 17:10:10.668663 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/glance-default-external-api-0" Mar 12 17:10:10 crc kubenswrapper[5184]: I0312 17:10:10.717839 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bq5sf\" (UniqueName: \"kubernetes.io/projected/65b7c2ee-47aa-47cb-9360-432c7da6513c-kube-api-access-bq5sf\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:11 crc kubenswrapper[5184]: I0312 17:10:11.647875 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555584-82kk8"] Mar 12 17:10:11 crc kubenswrapper[5184]: I0312 17:10:11.654802 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555584-82kk8"] Mar 12 17:10:11 crc kubenswrapper[5184]: I0312 17:10:11.711091 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5b65568c5b-pr7s4" Mar 12 17:10:12 crc kubenswrapper[5184]: I0312 17:10:12.408935 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75d0157e-15d0-42fc-a50e-1b5688578404" path="/var/lib/kubelet/pods/75d0157e-15d0-42fc-a50e-1b5688578404/volumes" Mar 12 17:10:12 crc kubenswrapper[5184]: I0312 17:10:12.695788 5184 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"fa611206-15cc-42c0-9025-17c42d22ec36","Type":"ContainerStarted","Data":"e5d3970b58bb3a27638255dc947f626628750d38127691e7b30950b4b08bbce0"} Mar 12 17:10:12 crc kubenswrapper[5184]: I0312 17:10:12.697837 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-n22b8" event={"ID":"2a7a82d0-151a-40b3-86b4-79aff3a3b0be","Type":"ContainerStarted","Data":"9b12b25da6650591dec1111324062f5144b3e13fa578769a3ae5102967141a09"} Mar 12 17:10:12 crc kubenswrapper[5184]: I0312 17:10:12.718884 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-n22b8" podStartSLOduration=9.133110675 podStartE2EDuration="21.718864513s" podCreationTimestamp="2026-03-12 17:09:51 +0000 UTC" firstStartedPulling="2026-03-12 17:09:58.899257835 +0000 UTC m=+1141.440569174" lastFinishedPulling="2026-03-12 17:10:11.485011673 +0000 UTC m=+1154.026323012" observedRunningTime="2026-03-12 17:10:12.714807457 +0000 UTC m=+1155.256118796" watchObservedRunningTime="2026-03-12 17:10:12.718864513 +0000 UTC m=+1155.260175852" Mar 12 17:10:12 crc kubenswrapper[5184]: I0312 17:10:12.906911 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5b65568c5b-pr7s4" Mar 12 17:10:12 crc kubenswrapper[5184]: I0312 17:10:12.992778 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/placement-f484d5cc6-qld48"] Mar 12 17:10:12 crc kubenswrapper[5184]: I0312 17:10:12.993449 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/placement-f484d5cc6-qld48" podUID="9118cf9a-dccb-4e2b-8438-de0d717382a1" containerName="placement-log" containerID="cri-o://9eb64916596a65399b125bdddf16d978075ffdcc78ce9370906ae117e986ae01" gracePeriod=30 Mar 12 17:10:12 crc kubenswrapper[5184]: I0312 17:10:12.993929 5184 kuberuntime_container.go:858] "Killing container with a grace period" 
pod="openstack/placement-f484d5cc6-qld48" podUID="9118cf9a-dccb-4e2b-8438-de0d717382a1" containerName="placement-api" containerID="cri-o://f6620ab7ebe09dc3a4be223e1ac6ce9064ba8097dc3bd65a22531a714f802fab" gracePeriod=30 Mar 12 17:10:13 crc kubenswrapper[5184]: I0312 17:10:13.105894 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 12 17:10:13 crc kubenswrapper[5184]: I0312 17:10:13.106668 5184 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 17:10:13 crc kubenswrapper[5184]: I0312 17:10:13.152316 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 12 17:10:13 crc kubenswrapper[5184]: I0312 17:10:13.152942 5184 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 17:10:13 crc kubenswrapper[5184]: I0312 17:10:13.169350 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 12 17:10:13 crc kubenswrapper[5184]: I0312 17:10:13.250944 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 12 17:10:13 crc kubenswrapper[5184]: I0312 17:10:13.582222 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-869b7dc84d-67g2c" Mar 12 17:10:13 crc kubenswrapper[5184]: I0312 17:10:13.673410 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c4af3c8-3189-41d2-9709-336561190b17-combined-ca-bundle\") pod \"5c4af3c8-3189-41d2-9709-336561190b17\" (UID: \"5c4af3c8-3189-41d2-9709-336561190b17\") " Mar 12 17:10:13 crc kubenswrapper[5184]: I0312 17:10:13.673550 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c4af3c8-3189-41d2-9709-336561190b17-ovndb-tls-certs\") pod \"5c4af3c8-3189-41d2-9709-336561190b17\" (UID: \"5c4af3c8-3189-41d2-9709-336561190b17\") " Mar 12 17:10:13 crc kubenswrapper[5184]: I0312 17:10:13.673586 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5c4af3c8-3189-41d2-9709-336561190b17-config\") pod \"5c4af3c8-3189-41d2-9709-336561190b17\" (UID: \"5c4af3c8-3189-41d2-9709-336561190b17\") " Mar 12 17:10:13 crc kubenswrapper[5184]: I0312 17:10:13.673646 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzz4j\" (UniqueName: \"kubernetes.io/projected/5c4af3c8-3189-41d2-9709-336561190b17-kube-api-access-kzz4j\") pod \"5c4af3c8-3189-41d2-9709-336561190b17\" (UID: \"5c4af3c8-3189-41d2-9709-336561190b17\") " Mar 12 17:10:13 crc kubenswrapper[5184]: I0312 17:10:13.673676 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5c4af3c8-3189-41d2-9709-336561190b17-httpd-config\") pod \"5c4af3c8-3189-41d2-9709-336561190b17\" (UID: \"5c4af3c8-3189-41d2-9709-336561190b17\") " Mar 12 17:10:13 crc kubenswrapper[5184]: I0312 17:10:13.687335 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/5c4af3c8-3189-41d2-9709-336561190b17-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "5c4af3c8-3189-41d2-9709-336561190b17" (UID: "5c4af3c8-3189-41d2-9709-336561190b17"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:10:13 crc kubenswrapper[5184]: I0312 17:10:13.690585 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c4af3c8-3189-41d2-9709-336561190b17-kube-api-access-kzz4j" (OuterVolumeSpecName: "kube-api-access-kzz4j") pod "5c4af3c8-3189-41d2-9709-336561190b17" (UID: "5c4af3c8-3189-41d2-9709-336561190b17"). InnerVolumeSpecName "kube-api-access-kzz4j". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:10:13 crc kubenswrapper[5184]: I0312 17:10:13.717025 5184 generic.go:358] "Generic (PLEG): container finished" podID="5c4af3c8-3189-41d2-9709-336561190b17" containerID="98cb3d5f89afecd999fa979da011c3f50f1c506fdb104ae79338eee3eb7a8bbb" exitCode=0 Mar 12 17:10:13 crc kubenswrapper[5184]: I0312 17:10:13.718467 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-869b7dc84d-67g2c" event={"ID":"5c4af3c8-3189-41d2-9709-336561190b17","Type":"ContainerDied","Data":"98cb3d5f89afecd999fa979da011c3f50f1c506fdb104ae79338eee3eb7a8bbb"} Mar 12 17:10:13 crc kubenswrapper[5184]: I0312 17:10:13.720792 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-869b7dc84d-67g2c" event={"ID":"5c4af3c8-3189-41d2-9709-336561190b17","Type":"ContainerDied","Data":"69fa3abf61438d4466c6d5dc0bf7516773400dafab1b9851080e68dedc08a11e"} Mar 12 17:10:13 crc kubenswrapper[5184]: I0312 17:10:13.720811 5184 scope.go:117] "RemoveContainer" containerID="2f85ea6d5b15c9b9f11ee07607c123482b5a31b6e87f37b2cf8ccf74c187539d" Mar 12 17:10:13 crc kubenswrapper[5184]: I0312 17:10:13.718582 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-869b7dc84d-67g2c" Mar 12 17:10:13 crc kubenswrapper[5184]: I0312 17:10:13.755104 5184 generic.go:358] "Generic (PLEG): container finished" podID="9118cf9a-dccb-4e2b-8438-de0d717382a1" containerID="9eb64916596a65399b125bdddf16d978075ffdcc78ce9370906ae117e986ae01" exitCode=143 Mar 12 17:10:13 crc kubenswrapper[5184]: I0312 17:10:13.755207 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f484d5cc6-qld48" event={"ID":"9118cf9a-dccb-4e2b-8438-de0d717382a1","Type":"ContainerDied","Data":"9eb64916596a65399b125bdddf16d978075ffdcc78ce9370906ae117e986ae01"} Mar 12 17:10:13 crc kubenswrapper[5184]: I0312 17:10:13.764766 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa611206-15cc-42c0-9025-17c42d22ec36","Type":"ContainerStarted","Data":"2b7eb2a62f817ccd9091415e17f126e1376837a4a44c97038bc9ee25663c32e8"} Mar 12 17:10:13 crc kubenswrapper[5184]: I0312 17:10:13.770713 5184 scope.go:117] "RemoveContainer" containerID="98cb3d5f89afecd999fa979da011c3f50f1c506fdb104ae79338eee3eb7a8bbb" Mar 12 17:10:13 crc kubenswrapper[5184]: I0312 17:10:13.776215 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kzz4j\" (UniqueName: \"kubernetes.io/projected/5c4af3c8-3189-41d2-9709-336561190b17-kube-api-access-kzz4j\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:13 crc kubenswrapper[5184]: I0312 17:10:13.776449 5184 reconciler_common.go:299] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5c4af3c8-3189-41d2-9709-336561190b17-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:13 crc kubenswrapper[5184]: I0312 17:10:13.779519 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c4af3c8-3189-41d2-9709-336561190b17-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c4af3c8-3189-41d2-9709-336561190b17" (UID: 
"5c4af3c8-3189-41d2-9709-336561190b17"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:10:13 crc kubenswrapper[5184]: I0312 17:10:13.801794 5184 scope.go:117] "RemoveContainer" containerID="2f85ea6d5b15c9b9f11ee07607c123482b5a31b6e87f37b2cf8ccf74c187539d" Mar 12 17:10:13 crc kubenswrapper[5184]: E0312 17:10:13.802484 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f85ea6d5b15c9b9f11ee07607c123482b5a31b6e87f37b2cf8ccf74c187539d\": container with ID starting with 2f85ea6d5b15c9b9f11ee07607c123482b5a31b6e87f37b2cf8ccf74c187539d not found: ID does not exist" containerID="2f85ea6d5b15c9b9f11ee07607c123482b5a31b6e87f37b2cf8ccf74c187539d" Mar 12 17:10:13 crc kubenswrapper[5184]: I0312 17:10:13.802523 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f85ea6d5b15c9b9f11ee07607c123482b5a31b6e87f37b2cf8ccf74c187539d"} err="failed to get container status \"2f85ea6d5b15c9b9f11ee07607c123482b5a31b6e87f37b2cf8ccf74c187539d\": rpc error: code = NotFound desc = could not find container \"2f85ea6d5b15c9b9f11ee07607c123482b5a31b6e87f37b2cf8ccf74c187539d\": container with ID starting with 2f85ea6d5b15c9b9f11ee07607c123482b5a31b6e87f37b2cf8ccf74c187539d not found: ID does not exist" Mar 12 17:10:13 crc kubenswrapper[5184]: I0312 17:10:13.802550 5184 scope.go:117] "RemoveContainer" containerID="98cb3d5f89afecd999fa979da011c3f50f1c506fdb104ae79338eee3eb7a8bbb" Mar 12 17:10:13 crc kubenswrapper[5184]: E0312 17:10:13.802866 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98cb3d5f89afecd999fa979da011c3f50f1c506fdb104ae79338eee3eb7a8bbb\": container with ID starting with 98cb3d5f89afecd999fa979da011c3f50f1c506fdb104ae79338eee3eb7a8bbb not found: ID does not exist" 
containerID="98cb3d5f89afecd999fa979da011c3f50f1c506fdb104ae79338eee3eb7a8bbb" Mar 12 17:10:13 crc kubenswrapper[5184]: I0312 17:10:13.802888 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98cb3d5f89afecd999fa979da011c3f50f1c506fdb104ae79338eee3eb7a8bbb"} err="failed to get container status \"98cb3d5f89afecd999fa979da011c3f50f1c506fdb104ae79338eee3eb7a8bbb\": rpc error: code = NotFound desc = could not find container \"98cb3d5f89afecd999fa979da011c3f50f1c506fdb104ae79338eee3eb7a8bbb\": container with ID starting with 98cb3d5f89afecd999fa979da011c3f50f1c506fdb104ae79338eee3eb7a8bbb not found: ID does not exist" Mar 12 17:10:13 crc kubenswrapper[5184]: I0312 17:10:13.817825 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c4af3c8-3189-41d2-9709-336561190b17-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "5c4af3c8-3189-41d2-9709-336561190b17" (UID: "5c4af3c8-3189-41d2-9709-336561190b17"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:10:13 crc kubenswrapper[5184]: I0312 17:10:13.828603 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c4af3c8-3189-41d2-9709-336561190b17-config" (OuterVolumeSpecName: "config") pod "5c4af3c8-3189-41d2-9709-336561190b17" (UID: "5c4af3c8-3189-41d2-9709-336561190b17"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:10:13 crc kubenswrapper[5184]: I0312 17:10:13.879145 5184 reconciler_common.go:299] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c4af3c8-3189-41d2-9709-336561190b17-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:13 crc kubenswrapper[5184]: I0312 17:10:13.879365 5184 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/5c4af3c8-3189-41d2-9709-336561190b17-config\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:13 crc kubenswrapper[5184]: I0312 17:10:13.879457 5184 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c4af3c8-3189-41d2-9709-336561190b17-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:13 crc kubenswrapper[5184]: I0312 17:10:13.927620 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 12 17:10:14 crc kubenswrapper[5184]: I0312 17:10:14.080422 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/neutron-869b7dc84d-67g2c"] Mar 12 17:10:14 crc kubenswrapper[5184]: I0312 17:10:14.088275 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-869b7dc84d-67g2c"] Mar 12 17:10:14 crc kubenswrapper[5184]: I0312 17:10:14.409902 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c4af3c8-3189-41d2-9709-336561190b17" path="/var/lib/kubelet/pods/5c4af3c8-3189-41d2-9709-336561190b17/volumes" Mar 12 17:10:15 crc kubenswrapper[5184]: I0312 17:10:15.813746 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fa611206-15cc-42c0-9025-17c42d22ec36" containerName="ceilometer-central-agent" containerID="cri-o://49facc8e8c7372d9984aac5853546c161cbf7f1f7f6dc93f3f61d45cce3fb8c1" gracePeriod=30 Mar 12 17:10:15 crc kubenswrapper[5184]: I0312 17:10:15.813798 
5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa611206-15cc-42c0-9025-17c42d22ec36","Type":"ContainerStarted","Data":"95b039e5c1b91875219ac53a759c88829e08d85fcf553ebc25457a265d797126"} Mar 12 17:10:15 crc kubenswrapper[5184]: I0312 17:10:15.814883 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/ceilometer-0" Mar 12 17:10:15 crc kubenswrapper[5184]: I0312 17:10:15.813956 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fa611206-15cc-42c0-9025-17c42d22ec36" containerName="sg-core" containerID="cri-o://2b7eb2a62f817ccd9091415e17f126e1376837a4a44c97038bc9ee25663c32e8" gracePeriod=30 Mar 12 17:10:15 crc kubenswrapper[5184]: I0312 17:10:15.813974 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fa611206-15cc-42c0-9025-17c42d22ec36" containerName="ceilometer-notification-agent" containerID="cri-o://e5d3970b58bb3a27638255dc947f626628750d38127691e7b30950b4b08bbce0" gracePeriod=30 Mar 12 17:10:15 crc kubenswrapper[5184]: I0312 17:10:15.813903 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fa611206-15cc-42c0-9025-17c42d22ec36" containerName="proxy-httpd" containerID="cri-o://95b039e5c1b91875219ac53a759c88829e08d85fcf553ebc25457a265d797126" gracePeriod=30 Mar 12 17:10:15 crc kubenswrapper[5184]: I0312 17:10:15.847779 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.271359491 podStartE2EDuration="12.847758402s" podCreationTimestamp="2026-03-12 17:10:03 +0000 UTC" firstStartedPulling="2026-03-12 17:10:04.53867963 +0000 UTC m=+1147.079990969" lastFinishedPulling="2026-03-12 17:10:15.115078541 +0000 UTC m=+1157.656389880" observedRunningTime="2026-03-12 17:10:15.842003412 +0000 UTC m=+1158.383314771" 
watchObservedRunningTime="2026-03-12 17:10:15.847758402 +0000 UTC m=+1158.389069741" Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.725835 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.768355 5184 scope.go:117] "RemoveContainer" containerID="bf586c7654695d0f359c042740b6e675c9b3800d76d81f9cc06c09a27af9c298" Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.839295 5184 generic.go:358] "Generic (PLEG): container finished" podID="9118cf9a-dccb-4e2b-8438-de0d717382a1" containerID="f6620ab7ebe09dc3a4be223e1ac6ce9064ba8097dc3bd65a22531a714f802fab" exitCode=0 Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.839873 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f484d5cc6-qld48" event={"ID":"9118cf9a-dccb-4e2b-8438-de0d717382a1","Type":"ContainerDied","Data":"f6620ab7ebe09dc3a4be223e1ac6ce9064ba8097dc3bd65a22531a714f802fab"} Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.843336 5184 generic.go:358] "Generic (PLEG): container finished" podID="fa611206-15cc-42c0-9025-17c42d22ec36" containerID="95b039e5c1b91875219ac53a759c88829e08d85fcf553ebc25457a265d797126" exitCode=0 Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.843367 5184 generic.go:358] "Generic (PLEG): container finished" podID="fa611206-15cc-42c0-9025-17c42d22ec36" containerID="2b7eb2a62f817ccd9091415e17f126e1376837a4a44c97038bc9ee25663c32e8" exitCode=2 Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.843390 5184 generic.go:358] "Generic (PLEG): container finished" podID="fa611206-15cc-42c0-9025-17c42d22ec36" containerID="e5d3970b58bb3a27638255dc947f626628750d38127691e7b30950b4b08bbce0" exitCode=0 Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.843396 5184 generic.go:358] "Generic (PLEG): container finished" podID="fa611206-15cc-42c0-9025-17c42d22ec36" 
containerID="49facc8e8c7372d9984aac5853546c161cbf7f1f7f6dc93f3f61d45cce3fb8c1" exitCode=0 Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.843886 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa611206-15cc-42c0-9025-17c42d22ec36","Type":"ContainerDied","Data":"95b039e5c1b91875219ac53a759c88829e08d85fcf553ebc25457a265d797126"} Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.843939 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa611206-15cc-42c0-9025-17c42d22ec36","Type":"ContainerDied","Data":"2b7eb2a62f817ccd9091415e17f126e1376837a4a44c97038bc9ee25663c32e8"} Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.843951 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa611206-15cc-42c0-9025-17c42d22ec36","Type":"ContainerDied","Data":"e5d3970b58bb3a27638255dc947f626628750d38127691e7b30950b4b08bbce0"} Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.843961 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa611206-15cc-42c0-9025-17c42d22ec36","Type":"ContainerDied","Data":"49facc8e8c7372d9984aac5853546c161cbf7f1f7f6dc93f3f61d45cce3fb8c1"} Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.843969 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa611206-15cc-42c0-9025-17c42d22ec36","Type":"ContainerDied","Data":"85eb5e851809de2ac6b44f9c13fcb8b99158d0f0f3128aa21bd1c51545b921cf"} Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.843985 5184 scope.go:117] "RemoveContainer" containerID="95b039e5c1b91875219ac53a759c88829e08d85fcf553ebc25457a265d797126" Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.844193 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.868066 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fa611206-15cc-42c0-9025-17c42d22ec36-sg-core-conf-yaml\") pod \"fa611206-15cc-42c0-9025-17c42d22ec36\" (UID: \"fa611206-15cc-42c0-9025-17c42d22ec36\") " Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.868229 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8m6w\" (UniqueName: \"kubernetes.io/projected/fa611206-15cc-42c0-9025-17c42d22ec36-kube-api-access-f8m6w\") pod \"fa611206-15cc-42c0-9025-17c42d22ec36\" (UID: \"fa611206-15cc-42c0-9025-17c42d22ec36\") " Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.868297 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa611206-15cc-42c0-9025-17c42d22ec36-combined-ca-bundle\") pod \"fa611206-15cc-42c0-9025-17c42d22ec36\" (UID: \"fa611206-15cc-42c0-9025-17c42d22ec36\") " Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.868437 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa611206-15cc-42c0-9025-17c42d22ec36-scripts\") pod \"fa611206-15cc-42c0-9025-17c42d22ec36\" (UID: \"fa611206-15cc-42c0-9025-17c42d22ec36\") " Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.868497 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa611206-15cc-42c0-9025-17c42d22ec36-log-httpd\") pod \"fa611206-15cc-42c0-9025-17c42d22ec36\" (UID: \"fa611206-15cc-42c0-9025-17c42d22ec36\") " Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.869533 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/fa611206-15cc-42c0-9025-17c42d22ec36-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fa611206-15cc-42c0-9025-17c42d22ec36" (UID: "fa611206-15cc-42c0-9025-17c42d22ec36"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.870266 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa611206-15cc-42c0-9025-17c42d22ec36-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fa611206-15cc-42c0-9025-17c42d22ec36" (UID: "fa611206-15cc-42c0-9025-17c42d22ec36"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.869662 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa611206-15cc-42c0-9025-17c42d22ec36-run-httpd\") pod \"fa611206-15cc-42c0-9025-17c42d22ec36\" (UID: \"fa611206-15cc-42c0-9025-17c42d22ec36\") " Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.870356 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa611206-15cc-42c0-9025-17c42d22ec36-config-data\") pod \"fa611206-15cc-42c0-9025-17c42d22ec36\" (UID: \"fa611206-15cc-42c0-9025-17c42d22ec36\") " Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.871227 5184 reconciler_common.go:299] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa611206-15cc-42c0-9025-17c42d22ec36-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.871248 5184 reconciler_common.go:299] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa611206-15cc-42c0-9025-17c42d22ec36-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.873229 5184 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f484d5cc6-qld48" Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.874555 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa611206-15cc-42c0-9025-17c42d22ec36-scripts" (OuterVolumeSpecName: "scripts") pod "fa611206-15cc-42c0-9025-17c42d22ec36" (UID: "fa611206-15cc-42c0-9025-17c42d22ec36"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.875852 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa611206-15cc-42c0-9025-17c42d22ec36-kube-api-access-f8m6w" (OuterVolumeSpecName: "kube-api-access-f8m6w") pod "fa611206-15cc-42c0-9025-17c42d22ec36" (UID: "fa611206-15cc-42c0-9025-17c42d22ec36"). InnerVolumeSpecName "kube-api-access-f8m6w". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.894270 5184 scope.go:117] "RemoveContainer" containerID="2b7eb2a62f817ccd9091415e17f126e1376837a4a44c97038bc9ee25663c32e8" Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.928016 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa611206-15cc-42c0-9025-17c42d22ec36-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fa611206-15cc-42c0-9025-17c42d22ec36" (UID: "fa611206-15cc-42c0-9025-17c42d22ec36"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.937798 5184 scope.go:117] "RemoveContainer" containerID="e5d3970b58bb3a27638255dc947f626628750d38127691e7b30950b4b08bbce0" Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.961370 5184 scope.go:117] "RemoveContainer" containerID="49facc8e8c7372d9984aac5853546c161cbf7f1f7f6dc93f3f61d45cce3fb8c1" Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.971793 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9118cf9a-dccb-4e2b-8438-de0d717382a1-config-data\") pod \"9118cf9a-dccb-4e2b-8438-de0d717382a1\" (UID: \"9118cf9a-dccb-4e2b-8438-de0d717382a1\") " Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.971860 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9118cf9a-dccb-4e2b-8438-de0d717382a1-public-tls-certs\") pod \"9118cf9a-dccb-4e2b-8438-de0d717382a1\" (UID: \"9118cf9a-dccb-4e2b-8438-de0d717382a1\") " Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.971898 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9118cf9a-dccb-4e2b-8438-de0d717382a1-scripts\") pod \"9118cf9a-dccb-4e2b-8438-de0d717382a1\" (UID: \"9118cf9a-dccb-4e2b-8438-de0d717382a1\") " Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.972003 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9118cf9a-dccb-4e2b-8438-de0d717382a1-logs\") pod \"9118cf9a-dccb-4e2b-8438-de0d717382a1\" (UID: \"9118cf9a-dccb-4e2b-8438-de0d717382a1\") " Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.972059 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9118cf9a-dccb-4e2b-8438-de0d717382a1-combined-ca-bundle\") pod \"9118cf9a-dccb-4e2b-8438-de0d717382a1\" (UID: \"9118cf9a-dccb-4e2b-8438-de0d717382a1\") " Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.972198 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9118cf9a-dccb-4e2b-8438-de0d717382a1-internal-tls-certs\") pod \"9118cf9a-dccb-4e2b-8438-de0d717382a1\" (UID: \"9118cf9a-dccb-4e2b-8438-de0d717382a1\") " Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.972227 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmtxn\" (UniqueName: \"kubernetes.io/projected/9118cf9a-dccb-4e2b-8438-de0d717382a1-kube-api-access-kmtxn\") pod \"9118cf9a-dccb-4e2b-8438-de0d717382a1\" (UID: \"9118cf9a-dccb-4e2b-8438-de0d717382a1\") " Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.972549 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f8m6w\" (UniqueName: \"kubernetes.io/projected/fa611206-15cc-42c0-9025-17c42d22ec36-kube-api-access-f8m6w\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.972562 5184 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa611206-15cc-42c0-9025-17c42d22ec36-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.972571 5184 reconciler_common.go:299] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fa611206-15cc-42c0-9025-17c42d22ec36-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.973331 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9118cf9a-dccb-4e2b-8438-de0d717382a1-logs" (OuterVolumeSpecName: "logs") pod "9118cf9a-dccb-4e2b-8438-de0d717382a1" (UID: 
"9118cf9a-dccb-4e2b-8438-de0d717382a1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.976156 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9118cf9a-dccb-4e2b-8438-de0d717382a1-kube-api-access-kmtxn" (OuterVolumeSpecName: "kube-api-access-kmtxn") pod "9118cf9a-dccb-4e2b-8438-de0d717382a1" (UID: "9118cf9a-dccb-4e2b-8438-de0d717382a1"). InnerVolumeSpecName "kube-api-access-kmtxn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.978110 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9118cf9a-dccb-4e2b-8438-de0d717382a1-scripts" (OuterVolumeSpecName: "scripts") pod "9118cf9a-dccb-4e2b-8438-de0d717382a1" (UID: "9118cf9a-dccb-4e2b-8438-de0d717382a1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.992485 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa611206-15cc-42c0-9025-17c42d22ec36-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fa611206-15cc-42c0-9025-17c42d22ec36" (UID: "fa611206-15cc-42c0-9025-17c42d22ec36"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.997990 5184 scope.go:117] "RemoveContainer" containerID="95b039e5c1b91875219ac53a759c88829e08d85fcf553ebc25457a265d797126" Mar 12 17:10:16 crc kubenswrapper[5184]: E0312 17:10:16.998342 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95b039e5c1b91875219ac53a759c88829e08d85fcf553ebc25457a265d797126\": container with ID starting with 95b039e5c1b91875219ac53a759c88829e08d85fcf553ebc25457a265d797126 not found: ID does not exist" containerID="95b039e5c1b91875219ac53a759c88829e08d85fcf553ebc25457a265d797126" Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.998386 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95b039e5c1b91875219ac53a759c88829e08d85fcf553ebc25457a265d797126"} err="failed to get container status \"95b039e5c1b91875219ac53a759c88829e08d85fcf553ebc25457a265d797126\": rpc error: code = NotFound desc = could not find container \"95b039e5c1b91875219ac53a759c88829e08d85fcf553ebc25457a265d797126\": container with ID starting with 95b039e5c1b91875219ac53a759c88829e08d85fcf553ebc25457a265d797126 not found: ID does not exist" Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.998406 5184 scope.go:117] "RemoveContainer" containerID="2b7eb2a62f817ccd9091415e17f126e1376837a4a44c97038bc9ee25663c32e8" Mar 12 17:10:16 crc kubenswrapper[5184]: E0312 17:10:16.998765 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b7eb2a62f817ccd9091415e17f126e1376837a4a44c97038bc9ee25663c32e8\": container with ID starting with 2b7eb2a62f817ccd9091415e17f126e1376837a4a44c97038bc9ee25663c32e8 not found: ID does not exist" containerID="2b7eb2a62f817ccd9091415e17f126e1376837a4a44c97038bc9ee25663c32e8" Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.998785 
5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b7eb2a62f817ccd9091415e17f126e1376837a4a44c97038bc9ee25663c32e8"} err="failed to get container status \"2b7eb2a62f817ccd9091415e17f126e1376837a4a44c97038bc9ee25663c32e8\": rpc error: code = NotFound desc = could not find container \"2b7eb2a62f817ccd9091415e17f126e1376837a4a44c97038bc9ee25663c32e8\": container with ID starting with 2b7eb2a62f817ccd9091415e17f126e1376837a4a44c97038bc9ee25663c32e8 not found: ID does not exist" Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.998796 5184 scope.go:117] "RemoveContainer" containerID="e5d3970b58bb3a27638255dc947f626628750d38127691e7b30950b4b08bbce0" Mar 12 17:10:16 crc kubenswrapper[5184]: E0312 17:10:16.999606 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5d3970b58bb3a27638255dc947f626628750d38127691e7b30950b4b08bbce0\": container with ID starting with e5d3970b58bb3a27638255dc947f626628750d38127691e7b30950b4b08bbce0 not found: ID does not exist" containerID="e5d3970b58bb3a27638255dc947f626628750d38127691e7b30950b4b08bbce0" Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.999628 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5d3970b58bb3a27638255dc947f626628750d38127691e7b30950b4b08bbce0"} err="failed to get container status \"e5d3970b58bb3a27638255dc947f626628750d38127691e7b30950b4b08bbce0\": rpc error: code = NotFound desc = could not find container \"e5d3970b58bb3a27638255dc947f626628750d38127691e7b30950b4b08bbce0\": container with ID starting with e5d3970b58bb3a27638255dc947f626628750d38127691e7b30950b4b08bbce0 not found: ID does not exist" Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.999642 5184 scope.go:117] "RemoveContainer" containerID="49facc8e8c7372d9984aac5853546c161cbf7f1f7f6dc93f3f61d45cce3fb8c1" Mar 12 17:10:16 crc kubenswrapper[5184]: E0312 
17:10:16.999842 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49facc8e8c7372d9984aac5853546c161cbf7f1f7f6dc93f3f61d45cce3fb8c1\": container with ID starting with 49facc8e8c7372d9984aac5853546c161cbf7f1f7f6dc93f3f61d45cce3fb8c1 not found: ID does not exist" containerID="49facc8e8c7372d9984aac5853546c161cbf7f1f7f6dc93f3f61d45cce3fb8c1" Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.999867 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49facc8e8c7372d9984aac5853546c161cbf7f1f7f6dc93f3f61d45cce3fb8c1"} err="failed to get container status \"49facc8e8c7372d9984aac5853546c161cbf7f1f7f6dc93f3f61d45cce3fb8c1\": rpc error: code = NotFound desc = could not find container \"49facc8e8c7372d9984aac5853546c161cbf7f1f7f6dc93f3f61d45cce3fb8c1\": container with ID starting with 49facc8e8c7372d9984aac5853546c161cbf7f1f7f6dc93f3f61d45cce3fb8c1 not found: ID does not exist" Mar 12 17:10:16 crc kubenswrapper[5184]: I0312 17:10:16.999881 5184 scope.go:117] "RemoveContainer" containerID="95b039e5c1b91875219ac53a759c88829e08d85fcf553ebc25457a265d797126" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.000652 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95b039e5c1b91875219ac53a759c88829e08d85fcf553ebc25457a265d797126"} err="failed to get container status \"95b039e5c1b91875219ac53a759c88829e08d85fcf553ebc25457a265d797126\": rpc error: code = NotFound desc = could not find container \"95b039e5c1b91875219ac53a759c88829e08d85fcf553ebc25457a265d797126\": container with ID starting with 95b039e5c1b91875219ac53a759c88829e08d85fcf553ebc25457a265d797126 not found: ID does not exist" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.000691 5184 scope.go:117] "RemoveContainer" containerID="2b7eb2a62f817ccd9091415e17f126e1376837a4a44c97038bc9ee25663c32e8" Mar 12 17:10:17 crc 
kubenswrapper[5184]: I0312 17:10:17.000961 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b7eb2a62f817ccd9091415e17f126e1376837a4a44c97038bc9ee25663c32e8"} err="failed to get container status \"2b7eb2a62f817ccd9091415e17f126e1376837a4a44c97038bc9ee25663c32e8\": rpc error: code = NotFound desc = could not find container \"2b7eb2a62f817ccd9091415e17f126e1376837a4a44c97038bc9ee25663c32e8\": container with ID starting with 2b7eb2a62f817ccd9091415e17f126e1376837a4a44c97038bc9ee25663c32e8 not found: ID does not exist" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.000984 5184 scope.go:117] "RemoveContainer" containerID="e5d3970b58bb3a27638255dc947f626628750d38127691e7b30950b4b08bbce0" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.001455 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5d3970b58bb3a27638255dc947f626628750d38127691e7b30950b4b08bbce0"} err="failed to get container status \"e5d3970b58bb3a27638255dc947f626628750d38127691e7b30950b4b08bbce0\": rpc error: code = NotFound desc = could not find container \"e5d3970b58bb3a27638255dc947f626628750d38127691e7b30950b4b08bbce0\": container with ID starting with e5d3970b58bb3a27638255dc947f626628750d38127691e7b30950b4b08bbce0 not found: ID does not exist" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.001495 5184 scope.go:117] "RemoveContainer" containerID="49facc8e8c7372d9984aac5853546c161cbf7f1f7f6dc93f3f61d45cce3fb8c1" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.003968 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49facc8e8c7372d9984aac5853546c161cbf7f1f7f6dc93f3f61d45cce3fb8c1"} err="failed to get container status \"49facc8e8c7372d9984aac5853546c161cbf7f1f7f6dc93f3f61d45cce3fb8c1\": rpc error: code = NotFound desc = could not find container \"49facc8e8c7372d9984aac5853546c161cbf7f1f7f6dc93f3f61d45cce3fb8c1\": container 
with ID starting with 49facc8e8c7372d9984aac5853546c161cbf7f1f7f6dc93f3f61d45cce3fb8c1 not found: ID does not exist" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.003996 5184 scope.go:117] "RemoveContainer" containerID="95b039e5c1b91875219ac53a759c88829e08d85fcf553ebc25457a265d797126" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.004488 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95b039e5c1b91875219ac53a759c88829e08d85fcf553ebc25457a265d797126"} err="failed to get container status \"95b039e5c1b91875219ac53a759c88829e08d85fcf553ebc25457a265d797126\": rpc error: code = NotFound desc = could not find container \"95b039e5c1b91875219ac53a759c88829e08d85fcf553ebc25457a265d797126\": container with ID starting with 95b039e5c1b91875219ac53a759c88829e08d85fcf553ebc25457a265d797126 not found: ID does not exist" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.004505 5184 scope.go:117] "RemoveContainer" containerID="2b7eb2a62f817ccd9091415e17f126e1376837a4a44c97038bc9ee25663c32e8" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.004711 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b7eb2a62f817ccd9091415e17f126e1376837a4a44c97038bc9ee25663c32e8"} err="failed to get container status \"2b7eb2a62f817ccd9091415e17f126e1376837a4a44c97038bc9ee25663c32e8\": rpc error: code = NotFound desc = could not find container \"2b7eb2a62f817ccd9091415e17f126e1376837a4a44c97038bc9ee25663c32e8\": container with ID starting with 2b7eb2a62f817ccd9091415e17f126e1376837a4a44c97038bc9ee25663c32e8 not found: ID does not exist" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.004728 5184 scope.go:117] "RemoveContainer" containerID="e5d3970b58bb3a27638255dc947f626628750d38127691e7b30950b4b08bbce0" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.004900 5184 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e5d3970b58bb3a27638255dc947f626628750d38127691e7b30950b4b08bbce0"} err="failed to get container status \"e5d3970b58bb3a27638255dc947f626628750d38127691e7b30950b4b08bbce0\": rpc error: code = NotFound desc = could not find container \"e5d3970b58bb3a27638255dc947f626628750d38127691e7b30950b4b08bbce0\": container with ID starting with e5d3970b58bb3a27638255dc947f626628750d38127691e7b30950b4b08bbce0 not found: ID does not exist" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.004916 5184 scope.go:117] "RemoveContainer" containerID="49facc8e8c7372d9984aac5853546c161cbf7f1f7f6dc93f3f61d45cce3fb8c1" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.005100 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49facc8e8c7372d9984aac5853546c161cbf7f1f7f6dc93f3f61d45cce3fb8c1"} err="failed to get container status \"49facc8e8c7372d9984aac5853546c161cbf7f1f7f6dc93f3f61d45cce3fb8c1\": rpc error: code = NotFound desc = could not find container \"49facc8e8c7372d9984aac5853546c161cbf7f1f7f6dc93f3f61d45cce3fb8c1\": container with ID starting with 49facc8e8c7372d9984aac5853546c161cbf7f1f7f6dc93f3f61d45cce3fb8c1 not found: ID does not exist" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.005118 5184 scope.go:117] "RemoveContainer" containerID="95b039e5c1b91875219ac53a759c88829e08d85fcf553ebc25457a265d797126" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.005313 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95b039e5c1b91875219ac53a759c88829e08d85fcf553ebc25457a265d797126"} err="failed to get container status \"95b039e5c1b91875219ac53a759c88829e08d85fcf553ebc25457a265d797126\": rpc error: code = NotFound desc = could not find container \"95b039e5c1b91875219ac53a759c88829e08d85fcf553ebc25457a265d797126\": container with ID starting with 95b039e5c1b91875219ac53a759c88829e08d85fcf553ebc25457a265d797126 not found: ID does not 
exist" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.005328 5184 scope.go:117] "RemoveContainer" containerID="2b7eb2a62f817ccd9091415e17f126e1376837a4a44c97038bc9ee25663c32e8" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.005545 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b7eb2a62f817ccd9091415e17f126e1376837a4a44c97038bc9ee25663c32e8"} err="failed to get container status \"2b7eb2a62f817ccd9091415e17f126e1376837a4a44c97038bc9ee25663c32e8\": rpc error: code = NotFound desc = could not find container \"2b7eb2a62f817ccd9091415e17f126e1376837a4a44c97038bc9ee25663c32e8\": container with ID starting with 2b7eb2a62f817ccd9091415e17f126e1376837a4a44c97038bc9ee25663c32e8 not found: ID does not exist" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.005561 5184 scope.go:117] "RemoveContainer" containerID="e5d3970b58bb3a27638255dc947f626628750d38127691e7b30950b4b08bbce0" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.005946 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5d3970b58bb3a27638255dc947f626628750d38127691e7b30950b4b08bbce0"} err="failed to get container status \"e5d3970b58bb3a27638255dc947f626628750d38127691e7b30950b4b08bbce0\": rpc error: code = NotFound desc = could not find container \"e5d3970b58bb3a27638255dc947f626628750d38127691e7b30950b4b08bbce0\": container with ID starting with e5d3970b58bb3a27638255dc947f626628750d38127691e7b30950b4b08bbce0 not found: ID does not exist" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.005967 5184 scope.go:117] "RemoveContainer" containerID="49facc8e8c7372d9984aac5853546c161cbf7f1f7f6dc93f3f61d45cce3fb8c1" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.006174 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49facc8e8c7372d9984aac5853546c161cbf7f1f7f6dc93f3f61d45cce3fb8c1"} err="failed to get container status 
\"49facc8e8c7372d9984aac5853546c161cbf7f1f7f6dc93f3f61d45cce3fb8c1\": rpc error: code = NotFound desc = could not find container \"49facc8e8c7372d9984aac5853546c161cbf7f1f7f6dc93f3f61d45cce3fb8c1\": container with ID starting with 49facc8e8c7372d9984aac5853546c161cbf7f1f7f6dc93f3f61d45cce3fb8c1 not found: ID does not exist" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.019887 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa611206-15cc-42c0-9025-17c42d22ec36-config-data" (OuterVolumeSpecName: "config-data") pod "fa611206-15cc-42c0-9025-17c42d22ec36" (UID: "fa611206-15cc-42c0-9025-17c42d22ec36"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.026533 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9118cf9a-dccb-4e2b-8438-de0d717382a1-config-data" (OuterVolumeSpecName: "config-data") pod "9118cf9a-dccb-4e2b-8438-de0d717382a1" (UID: "9118cf9a-dccb-4e2b-8438-de0d717382a1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.027828 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9118cf9a-dccb-4e2b-8438-de0d717382a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9118cf9a-dccb-4e2b-8438-de0d717382a1" (UID: "9118cf9a-dccb-4e2b-8438-de0d717382a1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.074429 5184 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9118cf9a-dccb-4e2b-8438-de0d717382a1-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.074459 5184 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9118cf9a-dccb-4e2b-8438-de0d717382a1-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.074468 5184 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9118cf9a-dccb-4e2b-8438-de0d717382a1-logs\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.074475 5184 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9118cf9a-dccb-4e2b-8438-de0d717382a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.074485 5184 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa611206-15cc-42c0-9025-17c42d22ec36-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.074492 5184 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa611206-15cc-42c0-9025-17c42d22ec36-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.074500 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kmtxn\" (UniqueName: \"kubernetes.io/projected/9118cf9a-dccb-4e2b-8438-de0d717382a1-kube-api-access-kmtxn\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.092493 5184 operation_generator.go:781] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9118cf9a-dccb-4e2b-8438-de0d717382a1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9118cf9a-dccb-4e2b-8438-de0d717382a1" (UID: "9118cf9a-dccb-4e2b-8438-de0d717382a1"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.094016 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9118cf9a-dccb-4e2b-8438-de0d717382a1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9118cf9a-dccb-4e2b-8438-de0d717382a1" (UID: "9118cf9a-dccb-4e2b-8438-de0d717382a1"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.176430 5184 reconciler_common.go:299] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9118cf9a-dccb-4e2b-8438-de0d717382a1-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.176866 5184 reconciler_common.go:299] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9118cf9a-dccb-4e2b-8438-de0d717382a1-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.231015 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.238672 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.253400 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.254333 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="65b7c2ee-47aa-47cb-9360-432c7da6513c" containerName="oc" Mar 12 17:10:17 
crc kubenswrapper[5184]: I0312 17:10:17.254351 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="65b7c2ee-47aa-47cb-9360-432c7da6513c" containerName="oc" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.254387 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fa611206-15cc-42c0-9025-17c42d22ec36" containerName="ceilometer-central-agent" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.254393 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa611206-15cc-42c0-9025-17c42d22ec36" containerName="ceilometer-central-agent" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.254412 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c4af3c8-3189-41d2-9709-336561190b17" containerName="neutron-httpd" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.254419 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c4af3c8-3189-41d2-9709-336561190b17" containerName="neutron-httpd" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.254435 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9118cf9a-dccb-4e2b-8438-de0d717382a1" containerName="placement-api" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.254441 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="9118cf9a-dccb-4e2b-8438-de0d717382a1" containerName="placement-api" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.254465 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fa611206-15cc-42c0-9025-17c42d22ec36" containerName="proxy-httpd" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.254471 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa611206-15cc-42c0-9025-17c42d22ec36" containerName="proxy-httpd" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.254484 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c4af3c8-3189-41d2-9709-336561190b17" 
containerName="neutron-api" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.254489 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c4af3c8-3189-41d2-9709-336561190b17" containerName="neutron-api" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.254496 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fa611206-15cc-42c0-9025-17c42d22ec36" containerName="sg-core" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.254502 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa611206-15cc-42c0-9025-17c42d22ec36" containerName="sg-core" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.254512 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fa611206-15cc-42c0-9025-17c42d22ec36" containerName="ceilometer-notification-agent" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.254519 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa611206-15cc-42c0-9025-17c42d22ec36" containerName="ceilometer-notification-agent" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.254531 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9118cf9a-dccb-4e2b-8438-de0d717382a1" containerName="placement-log" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.254536 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="9118cf9a-dccb-4e2b-8438-de0d717382a1" containerName="placement-log" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.254677 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="65b7c2ee-47aa-47cb-9360-432c7da6513c" containerName="oc" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.254687 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="fa611206-15cc-42c0-9025-17c42d22ec36" containerName="proxy-httpd" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.254704 5184 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="5c4af3c8-3189-41d2-9709-336561190b17" containerName="neutron-httpd" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.254712 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="9118cf9a-dccb-4e2b-8438-de0d717382a1" containerName="placement-api" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.254721 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="fa611206-15cc-42c0-9025-17c42d22ec36" containerName="sg-core" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.254731 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="fa611206-15cc-42c0-9025-17c42d22ec36" containerName="ceilometer-notification-agent" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.254741 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="9118cf9a-dccb-4e2b-8438-de0d717382a1" containerName="placement-log" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.254751 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="fa611206-15cc-42c0-9025-17c42d22ec36" containerName="ceilometer-central-agent" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.254757 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="5c4af3c8-3189-41d2-9709-336561190b17" containerName="neutron-api" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.269781 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.269916 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.272016 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ceilometer-scripts\"" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.273702 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ceilometer-config-data\"" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.379961 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxhpz\" (UniqueName: \"kubernetes.io/projected/a419b263-d0db-40f1-a862-027595731917-kube-api-access-rxhpz\") pod \"ceilometer-0\" (UID: \"a419b263-d0db-40f1-a862-027595731917\") " pod="openstack/ceilometer-0" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.380134 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a419b263-d0db-40f1-a862-027595731917-run-httpd\") pod \"ceilometer-0\" (UID: \"a419b263-d0db-40f1-a862-027595731917\") " pod="openstack/ceilometer-0" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.380192 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a419b263-d0db-40f1-a862-027595731917-log-httpd\") pod \"ceilometer-0\" (UID: \"a419b263-d0db-40f1-a862-027595731917\") " pod="openstack/ceilometer-0" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.380330 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a419b263-d0db-40f1-a862-027595731917-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a419b263-d0db-40f1-a862-027595731917\") " pod="openstack/ceilometer-0" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.380577 5184 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a419b263-d0db-40f1-a862-027595731917-config-data\") pod \"ceilometer-0\" (UID: \"a419b263-d0db-40f1-a862-027595731917\") " pod="openstack/ceilometer-0" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.380703 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a419b263-d0db-40f1-a862-027595731917-scripts\") pod \"ceilometer-0\" (UID: \"a419b263-d0db-40f1-a862-027595731917\") " pod="openstack/ceilometer-0" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.380779 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a419b263-d0db-40f1-a862-027595731917-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a419b263-d0db-40f1-a862-027595731917\") " pod="openstack/ceilometer-0" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.482626 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a419b263-d0db-40f1-a862-027595731917-config-data\") pod \"ceilometer-0\" (UID: \"a419b263-d0db-40f1-a862-027595731917\") " pod="openstack/ceilometer-0" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.482711 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a419b263-d0db-40f1-a862-027595731917-scripts\") pod \"ceilometer-0\" (UID: \"a419b263-d0db-40f1-a862-027595731917\") " pod="openstack/ceilometer-0" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.482746 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a419b263-d0db-40f1-a862-027595731917-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"a419b263-d0db-40f1-a862-027595731917\") " pod="openstack/ceilometer-0" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.482823 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rxhpz\" (UniqueName: \"kubernetes.io/projected/a419b263-d0db-40f1-a862-027595731917-kube-api-access-rxhpz\") pod \"ceilometer-0\" (UID: \"a419b263-d0db-40f1-a862-027595731917\") " pod="openstack/ceilometer-0" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.482887 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a419b263-d0db-40f1-a862-027595731917-run-httpd\") pod \"ceilometer-0\" (UID: \"a419b263-d0db-40f1-a862-027595731917\") " pod="openstack/ceilometer-0" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.482921 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a419b263-d0db-40f1-a862-027595731917-log-httpd\") pod \"ceilometer-0\" (UID: \"a419b263-d0db-40f1-a862-027595731917\") " pod="openstack/ceilometer-0" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.482951 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a419b263-d0db-40f1-a862-027595731917-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a419b263-d0db-40f1-a862-027595731917\") " pod="openstack/ceilometer-0" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.484227 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a419b263-d0db-40f1-a862-027595731917-run-httpd\") pod \"ceilometer-0\" (UID: \"a419b263-d0db-40f1-a862-027595731917\") " pod="openstack/ceilometer-0" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.484793 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a419b263-d0db-40f1-a862-027595731917-log-httpd\") pod \"ceilometer-0\" (UID: \"a419b263-d0db-40f1-a862-027595731917\") " pod="openstack/ceilometer-0" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.489390 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a419b263-d0db-40f1-a862-027595731917-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a419b263-d0db-40f1-a862-027595731917\") " pod="openstack/ceilometer-0" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.491260 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a419b263-d0db-40f1-a862-027595731917-scripts\") pod \"ceilometer-0\" (UID: \"a419b263-d0db-40f1-a862-027595731917\") " pod="openstack/ceilometer-0" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.495900 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a419b263-d0db-40f1-a862-027595731917-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a419b263-d0db-40f1-a862-027595731917\") " pod="openstack/ceilometer-0" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.500444 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a419b263-d0db-40f1-a862-027595731917-config-data\") pod \"ceilometer-0\" (UID: \"a419b263-d0db-40f1-a862-027595731917\") " pod="openstack/ceilometer-0" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.505092 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxhpz\" (UniqueName: \"kubernetes.io/projected/a419b263-d0db-40f1-a862-027595731917-kube-api-access-rxhpz\") pod \"ceilometer-0\" (UID: \"a419b263-d0db-40f1-a862-027595731917\") " pod="openstack/ceilometer-0" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 
17:10:17.587517 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.863525 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f484d5cc6-qld48" event={"ID":"9118cf9a-dccb-4e2b-8438-de0d717382a1","Type":"ContainerDied","Data":"8f196bf1e6f8f9e6a0cecfff73c46a82c3e10879c280a61ac0709a7b2882a57c"} Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.863820 5184 scope.go:117] "RemoveContainer" containerID="f6620ab7ebe09dc3a4be223e1ac6ce9064ba8097dc3bd65a22531a714f802fab" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.863638 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f484d5cc6-qld48" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.913833 5184 scope.go:117] "RemoveContainer" containerID="9eb64916596a65399b125bdddf16d978075ffdcc78ce9370906ae117e986ae01" Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.926227 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/placement-f484d5cc6-qld48"] Mar 12 17:10:17 crc kubenswrapper[5184]: I0312 17:10:17.940781 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/placement-f484d5cc6-qld48"] Mar 12 17:10:18 crc kubenswrapper[5184]: I0312 17:10:18.072666 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 17:10:18 crc kubenswrapper[5184]: W0312 17:10:18.077313 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda419b263_d0db_40f1_a862_027595731917.slice/crio-5af9540be83eafa63a843c72682ba49d381a70c011fa1af740b019dd834a0220 WatchSource:0}: Error finding container 5af9540be83eafa63a843c72682ba49d381a70c011fa1af740b019dd834a0220: Status 404 returned error can't find the container with id 5af9540be83eafa63a843c72682ba49d381a70c011fa1af740b019dd834a0220 Mar 12 
17:10:18 crc kubenswrapper[5184]: I0312 17:10:18.437741 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9118cf9a-dccb-4e2b-8438-de0d717382a1" path="/var/lib/kubelet/pods/9118cf9a-dccb-4e2b-8438-de0d717382a1/volumes" Mar 12 17:10:18 crc kubenswrapper[5184]: I0312 17:10:18.440698 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa611206-15cc-42c0-9025-17c42d22ec36" path="/var/lib/kubelet/pods/fa611206-15cc-42c0-9025-17c42d22ec36/volumes" Mar 12 17:10:18 crc kubenswrapper[5184]: I0312 17:10:18.873704 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a419b263-d0db-40f1-a862-027595731917","Type":"ContainerStarted","Data":"5af9540be83eafa63a843c72682ba49d381a70c011fa1af740b019dd834a0220"} Mar 12 17:10:21 crc kubenswrapper[5184]: I0312 17:10:21.902788 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a419b263-d0db-40f1-a862-027595731917","Type":"ContainerStarted","Data":"98e2c6a350e30fed727d4b1decdc981377842856f1d2d3b532db5bf316fb7213"} Mar 12 17:10:21 crc kubenswrapper[5184]: I0312 17:10:21.903288 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a419b263-d0db-40f1-a862-027595731917","Type":"ContainerStarted","Data":"161634025e2e294517402973a94374132631ce713a732744d7cdb3fae6ae8772"} Mar 12 17:10:22 crc kubenswrapper[5184]: I0312 17:10:22.848604 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 17:10:22 crc kubenswrapper[5184]: I0312 17:10:22.925320 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a419b263-d0db-40f1-a862-027595731917","Type":"ContainerStarted","Data":"4da79caa3365e8991748944548b55986abb71f9127a45240530e849efd2cc26a"} Mar 12 17:10:24 crc kubenswrapper[5184]: I0312 17:10:24.953876 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"a419b263-d0db-40f1-a862-027595731917","Type":"ContainerStarted","Data":"096f5c84117191047fd06166c50e8b3dc3d0b9de76314786c76c79f524c452f1"} Mar 12 17:10:24 crc kubenswrapper[5184]: I0312 17:10:24.954394 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a419b263-d0db-40f1-a862-027595731917" containerName="ceilometer-central-agent" containerID="cri-o://161634025e2e294517402973a94374132631ce713a732744d7cdb3fae6ae8772" gracePeriod=30 Mar 12 17:10:24 crc kubenswrapper[5184]: I0312 17:10:24.954444 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/ceilometer-0" Mar 12 17:10:24 crc kubenswrapper[5184]: I0312 17:10:24.954564 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a419b263-d0db-40f1-a862-027595731917" containerName="proxy-httpd" containerID="cri-o://096f5c84117191047fd06166c50e8b3dc3d0b9de76314786c76c79f524c452f1" gracePeriod=30 Mar 12 17:10:24 crc kubenswrapper[5184]: I0312 17:10:24.954605 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a419b263-d0db-40f1-a862-027595731917" containerName="sg-core" containerID="cri-o://4da79caa3365e8991748944548b55986abb71f9127a45240530e849efd2cc26a" gracePeriod=30 Mar 12 17:10:24 crc kubenswrapper[5184]: I0312 17:10:24.954640 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a419b263-d0db-40f1-a862-027595731917" containerName="ceilometer-notification-agent" containerID="cri-o://98e2c6a350e30fed727d4b1decdc981377842856f1d2d3b532db5bf316fb7213" gracePeriod=30 Mar 12 17:10:24 crc kubenswrapper[5184]: I0312 17:10:24.985899 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.15940033 podStartE2EDuration="7.985877921s" podCreationTimestamp="2026-03-12 17:10:17 +0000 
UTC" firstStartedPulling="2026-03-12 17:10:18.080007619 +0000 UTC m=+1160.621318958" lastFinishedPulling="2026-03-12 17:10:23.90648521 +0000 UTC m=+1166.447796549" observedRunningTime="2026-03-12 17:10:24.978697255 +0000 UTC m=+1167.520008614" watchObservedRunningTime="2026-03-12 17:10:24.985877921 +0000 UTC m=+1167.527189280" Mar 12 17:10:25 crc kubenswrapper[5184]: I0312 17:10:25.964835 5184 generic.go:358] "Generic (PLEG): container finished" podID="2a7a82d0-151a-40b3-86b4-79aff3a3b0be" containerID="9b12b25da6650591dec1111324062f5144b3e13fa578769a3ae5102967141a09" exitCode=0 Mar 12 17:10:25 crc kubenswrapper[5184]: I0312 17:10:25.964946 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-n22b8" event={"ID":"2a7a82d0-151a-40b3-86b4-79aff3a3b0be","Type":"ContainerDied","Data":"9b12b25da6650591dec1111324062f5144b3e13fa578769a3ae5102967141a09"} Mar 12 17:10:25 crc kubenswrapper[5184]: I0312 17:10:25.968756 5184 generic.go:358] "Generic (PLEG): container finished" podID="a419b263-d0db-40f1-a862-027595731917" containerID="096f5c84117191047fd06166c50e8b3dc3d0b9de76314786c76c79f524c452f1" exitCode=0 Mar 12 17:10:25 crc kubenswrapper[5184]: I0312 17:10:25.968789 5184 generic.go:358] "Generic (PLEG): container finished" podID="a419b263-d0db-40f1-a862-027595731917" containerID="4da79caa3365e8991748944548b55986abb71f9127a45240530e849efd2cc26a" exitCode=2 Mar 12 17:10:25 crc kubenswrapper[5184]: I0312 17:10:25.968799 5184 generic.go:358] "Generic (PLEG): container finished" podID="a419b263-d0db-40f1-a862-027595731917" containerID="98e2c6a350e30fed727d4b1decdc981377842856f1d2d3b532db5bf316fb7213" exitCode=0 Mar 12 17:10:25 crc kubenswrapper[5184]: I0312 17:10:25.968991 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a419b263-d0db-40f1-a862-027595731917","Type":"ContainerDied","Data":"096f5c84117191047fd06166c50e8b3dc3d0b9de76314786c76c79f524c452f1"} Mar 12 17:10:25 crc 
kubenswrapper[5184]: I0312 17:10:25.969103 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a419b263-d0db-40f1-a862-027595731917","Type":"ContainerDied","Data":"4da79caa3365e8991748944548b55986abb71f9127a45240530e849efd2cc26a"} Mar 12 17:10:25 crc kubenswrapper[5184]: I0312 17:10:25.969173 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a419b263-d0db-40f1-a862-027595731917","Type":"ContainerDied","Data":"98e2c6a350e30fed727d4b1decdc981377842856f1d2d3b532db5bf316fb7213"} Mar 12 17:10:27 crc kubenswrapper[5184]: I0312 17:10:27.461251 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-n22b8" Mar 12 17:10:27 crc kubenswrapper[5184]: I0312 17:10:27.542595 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a7a82d0-151a-40b3-86b4-79aff3a3b0be-scripts\") pod \"2a7a82d0-151a-40b3-86b4-79aff3a3b0be\" (UID: \"2a7a82d0-151a-40b3-86b4-79aff3a3b0be\") " Mar 12 17:10:27 crc kubenswrapper[5184]: I0312 17:10:27.542728 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a7a82d0-151a-40b3-86b4-79aff3a3b0be-combined-ca-bundle\") pod \"2a7a82d0-151a-40b3-86b4-79aff3a3b0be\" (UID: \"2a7a82d0-151a-40b3-86b4-79aff3a3b0be\") " Mar 12 17:10:27 crc kubenswrapper[5184]: I0312 17:10:27.542789 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a7a82d0-151a-40b3-86b4-79aff3a3b0be-config-data\") pod \"2a7a82d0-151a-40b3-86b4-79aff3a3b0be\" (UID: \"2a7a82d0-151a-40b3-86b4-79aff3a3b0be\") " Mar 12 17:10:27 crc kubenswrapper[5184]: I0312 17:10:27.542821 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhdcg\" (UniqueName: 
\"kubernetes.io/projected/2a7a82d0-151a-40b3-86b4-79aff3a3b0be-kube-api-access-dhdcg\") pod \"2a7a82d0-151a-40b3-86b4-79aff3a3b0be\" (UID: \"2a7a82d0-151a-40b3-86b4-79aff3a3b0be\") " Mar 12 17:10:27 crc kubenswrapper[5184]: I0312 17:10:27.551214 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a7a82d0-151a-40b3-86b4-79aff3a3b0be-kube-api-access-dhdcg" (OuterVolumeSpecName: "kube-api-access-dhdcg") pod "2a7a82d0-151a-40b3-86b4-79aff3a3b0be" (UID: "2a7a82d0-151a-40b3-86b4-79aff3a3b0be"). InnerVolumeSpecName "kube-api-access-dhdcg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:10:27 crc kubenswrapper[5184]: I0312 17:10:27.551491 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a7a82d0-151a-40b3-86b4-79aff3a3b0be-scripts" (OuterVolumeSpecName: "scripts") pod "2a7a82d0-151a-40b3-86b4-79aff3a3b0be" (UID: "2a7a82d0-151a-40b3-86b4-79aff3a3b0be"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:10:27 crc kubenswrapper[5184]: I0312 17:10:27.570854 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a7a82d0-151a-40b3-86b4-79aff3a3b0be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a7a82d0-151a-40b3-86b4-79aff3a3b0be" (UID: "2a7a82d0-151a-40b3-86b4-79aff3a3b0be"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:10:27 crc kubenswrapper[5184]: I0312 17:10:27.588567 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a7a82d0-151a-40b3-86b4-79aff3a3b0be-config-data" (OuterVolumeSpecName: "config-data") pod "2a7a82d0-151a-40b3-86b4-79aff3a3b0be" (UID: "2a7a82d0-151a-40b3-86b4-79aff3a3b0be"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:10:27 crc kubenswrapper[5184]: I0312 17:10:27.645059 5184 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a7a82d0-151a-40b3-86b4-79aff3a3b0be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:27 crc kubenswrapper[5184]: I0312 17:10:27.645101 5184 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a7a82d0-151a-40b3-86b4-79aff3a3b0be-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:27 crc kubenswrapper[5184]: I0312 17:10:27.645117 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dhdcg\" (UniqueName: \"kubernetes.io/projected/2a7a82d0-151a-40b3-86b4-79aff3a3b0be-kube-api-access-dhdcg\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:27 crc kubenswrapper[5184]: I0312 17:10:27.645130 5184 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a7a82d0-151a-40b3-86b4-79aff3a3b0be-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:27 crc kubenswrapper[5184]: I0312 17:10:27.996009 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-n22b8" event={"ID":"2a7a82d0-151a-40b3-86b4-79aff3a3b0be","Type":"ContainerDied","Data":"d426490afa44bf22b0150a67e9f648e0151f34760602c63db56389dd83df25b9"} Mar 12 17:10:27 crc kubenswrapper[5184]: I0312 17:10:27.996600 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d426490afa44bf22b0150a67e9f648e0151f34760602c63db56389dd83df25b9" Mar 12 17:10:27 crc kubenswrapper[5184]: I0312 17:10:27.996956 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-n22b8" Mar 12 17:10:28 crc kubenswrapper[5184]: I0312 17:10:28.082742 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 12 17:10:28 crc kubenswrapper[5184]: I0312 17:10:28.084124 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2a7a82d0-151a-40b3-86b4-79aff3a3b0be" containerName="nova-cell0-conductor-db-sync" Mar 12 17:10:28 crc kubenswrapper[5184]: I0312 17:10:28.084145 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a7a82d0-151a-40b3-86b4-79aff3a3b0be" containerName="nova-cell0-conductor-db-sync" Mar 12 17:10:28 crc kubenswrapper[5184]: I0312 17:10:28.084393 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="2a7a82d0-151a-40b3-86b4-79aff3a3b0be" containerName="nova-cell0-conductor-db-sync" Mar 12 17:10:28 crc kubenswrapper[5184]: I0312 17:10:28.093614 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 12 17:10:28 crc kubenswrapper[5184]: I0312 17:10:28.097861 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 12 17:10:28 crc kubenswrapper[5184]: I0312 17:10:28.105313 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell0-conductor-config-data\"" Mar 12 17:10:28 crc kubenswrapper[5184]: I0312 17:10:28.105862 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-nova-dockercfg-lwlqf\"" Mar 12 17:10:28 crc kubenswrapper[5184]: I0312 17:10:28.155254 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d274afd-6ab1-4652-8093-c3941b617a98-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4d274afd-6ab1-4652-8093-c3941b617a98\") " pod="openstack/nova-cell0-conductor-0" Mar 12 
17:10:28 crc kubenswrapper[5184]: I0312 17:10:28.155305 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8qsd\" (UniqueName: \"kubernetes.io/projected/4d274afd-6ab1-4652-8093-c3941b617a98-kube-api-access-s8qsd\") pod \"nova-cell0-conductor-0\" (UID: \"4d274afd-6ab1-4652-8093-c3941b617a98\") " pod="openstack/nova-cell0-conductor-0" Mar 12 17:10:28 crc kubenswrapper[5184]: I0312 17:10:28.155477 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d274afd-6ab1-4652-8093-c3941b617a98-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4d274afd-6ab1-4652-8093-c3941b617a98\") " pod="openstack/nova-cell0-conductor-0" Mar 12 17:10:28 crc kubenswrapper[5184]: I0312 17:10:28.256822 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d274afd-6ab1-4652-8093-c3941b617a98-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4d274afd-6ab1-4652-8093-c3941b617a98\") " pod="openstack/nova-cell0-conductor-0" Mar 12 17:10:28 crc kubenswrapper[5184]: I0312 17:10:28.257298 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d274afd-6ab1-4652-8093-c3941b617a98-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4d274afd-6ab1-4652-8093-c3941b617a98\") " pod="openstack/nova-cell0-conductor-0" Mar 12 17:10:28 crc kubenswrapper[5184]: I0312 17:10:28.257322 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s8qsd\" (UniqueName: \"kubernetes.io/projected/4d274afd-6ab1-4652-8093-c3941b617a98-kube-api-access-s8qsd\") pod \"nova-cell0-conductor-0\" (UID: \"4d274afd-6ab1-4652-8093-c3941b617a98\") " pod="openstack/nova-cell0-conductor-0" Mar 12 17:10:28 crc kubenswrapper[5184]: I0312 
17:10:28.263759 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d274afd-6ab1-4652-8093-c3941b617a98-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4d274afd-6ab1-4652-8093-c3941b617a98\") " pod="openstack/nova-cell0-conductor-0" Mar 12 17:10:28 crc kubenswrapper[5184]: I0312 17:10:28.280973 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8qsd\" (UniqueName: \"kubernetes.io/projected/4d274afd-6ab1-4652-8093-c3941b617a98-kube-api-access-s8qsd\") pod \"nova-cell0-conductor-0\" (UID: \"4d274afd-6ab1-4652-8093-c3941b617a98\") " pod="openstack/nova-cell0-conductor-0" Mar 12 17:10:28 crc kubenswrapper[5184]: I0312 17:10:28.281145 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d274afd-6ab1-4652-8093-c3941b617a98-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4d274afd-6ab1-4652-8093-c3941b617a98\") " pod="openstack/nova-cell0-conductor-0" Mar 12 17:10:28 crc kubenswrapper[5184]: I0312 17:10:28.425604 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 12 17:10:28 crc kubenswrapper[5184]: I0312 17:10:28.926029 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 12 17:10:28 crc kubenswrapper[5184]: W0312 17:10:28.928783 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d274afd_6ab1_4652_8093_c3941b617a98.slice/crio-e4948389ec1d4ef331b24cc78e5aacf76dc8948fdc8a5eeb79321e2d031f1640 WatchSource:0}: Error finding container e4948389ec1d4ef331b24cc78e5aacf76dc8948fdc8a5eeb79321e2d031f1640: Status 404 returned error can't find the container with id e4948389ec1d4ef331b24cc78e5aacf76dc8948fdc8a5eeb79321e2d031f1640 Mar 12 17:10:29 crc kubenswrapper[5184]: I0312 17:10:29.013485 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"4d274afd-6ab1-4652-8093-c3941b617a98","Type":"ContainerStarted","Data":"e4948389ec1d4ef331b24cc78e5aacf76dc8948fdc8a5eeb79321e2d031f1640"} Mar 12 17:10:30 crc kubenswrapper[5184]: I0312 17:10:30.025735 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"4d274afd-6ab1-4652-8093-c3941b617a98","Type":"ContainerStarted","Data":"9a3f57e9ced7e11f118d449e732fc9b29ecb787878104ce28dfaf8acab8300d9"} Mar 12 17:10:30 crc kubenswrapper[5184]: I0312 17:10:30.026208 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-cell0-conductor-0" Mar 12 17:10:30 crc kubenswrapper[5184]: I0312 17:10:30.046346 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.046324089 podStartE2EDuration="2.046324089s" podCreationTimestamp="2026-03-12 17:10:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 
17:10:30.04088185 +0000 UTC m=+1172.582193199" watchObservedRunningTime="2026-03-12 17:10:30.046324089 +0000 UTC m=+1172.587635428" Mar 12 17:10:31 crc kubenswrapper[5184]: I0312 17:10:31.039779 5184 generic.go:358] "Generic (PLEG): container finished" podID="a419b263-d0db-40f1-a862-027595731917" containerID="161634025e2e294517402973a94374132631ce713a732744d7cdb3fae6ae8772" exitCode=0 Mar 12 17:10:31 crc kubenswrapper[5184]: I0312 17:10:31.039955 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a419b263-d0db-40f1-a862-027595731917","Type":"ContainerDied","Data":"161634025e2e294517402973a94374132631ce713a732744d7cdb3fae6ae8772"} Mar 12 17:10:31 crc kubenswrapper[5184]: I0312 17:10:31.223311 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 17:10:31 crc kubenswrapper[5184]: I0312 17:10:31.329148 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a419b263-d0db-40f1-a862-027595731917-run-httpd\") pod \"a419b263-d0db-40f1-a862-027595731917\" (UID: \"a419b263-d0db-40f1-a862-027595731917\") " Mar 12 17:10:31 crc kubenswrapper[5184]: I0312 17:10:31.329284 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxhpz\" (UniqueName: \"kubernetes.io/projected/a419b263-d0db-40f1-a862-027595731917-kube-api-access-rxhpz\") pod \"a419b263-d0db-40f1-a862-027595731917\" (UID: \"a419b263-d0db-40f1-a862-027595731917\") " Mar 12 17:10:31 crc kubenswrapper[5184]: I0312 17:10:31.329333 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a419b263-d0db-40f1-a862-027595731917-combined-ca-bundle\") pod \"a419b263-d0db-40f1-a862-027595731917\" (UID: \"a419b263-d0db-40f1-a862-027595731917\") " Mar 12 17:10:31 crc kubenswrapper[5184]: I0312 17:10:31.329398 5184 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a419b263-d0db-40f1-a862-027595731917-log-httpd\") pod \"a419b263-d0db-40f1-a862-027595731917\" (UID: \"a419b263-d0db-40f1-a862-027595731917\") " Mar 12 17:10:31 crc kubenswrapper[5184]: I0312 17:10:31.329555 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a419b263-d0db-40f1-a862-027595731917-scripts\") pod \"a419b263-d0db-40f1-a862-027595731917\" (UID: \"a419b263-d0db-40f1-a862-027595731917\") " Mar 12 17:10:31 crc kubenswrapper[5184]: I0312 17:10:31.329618 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a419b263-d0db-40f1-a862-027595731917-sg-core-conf-yaml\") pod \"a419b263-d0db-40f1-a862-027595731917\" (UID: \"a419b263-d0db-40f1-a862-027595731917\") " Mar 12 17:10:31 crc kubenswrapper[5184]: I0312 17:10:31.329657 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a419b263-d0db-40f1-a862-027595731917-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a419b263-d0db-40f1-a862-027595731917" (UID: "a419b263-d0db-40f1-a862-027595731917"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:10:31 crc kubenswrapper[5184]: I0312 17:10:31.329739 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a419b263-d0db-40f1-a862-027595731917-config-data\") pod \"a419b263-d0db-40f1-a862-027595731917\" (UID: \"a419b263-d0db-40f1-a862-027595731917\") " Mar 12 17:10:31 crc kubenswrapper[5184]: I0312 17:10:31.330478 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a419b263-d0db-40f1-a862-027595731917-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a419b263-d0db-40f1-a862-027595731917" (UID: "a419b263-d0db-40f1-a862-027595731917"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:10:31 crc kubenswrapper[5184]: I0312 17:10:31.331268 5184 reconciler_common.go:299] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a419b263-d0db-40f1-a862-027595731917-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:31 crc kubenswrapper[5184]: I0312 17:10:31.331306 5184 reconciler_common.go:299] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a419b263-d0db-40f1-a862-027595731917-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:31 crc kubenswrapper[5184]: I0312 17:10:31.335700 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a419b263-d0db-40f1-a862-027595731917-kube-api-access-rxhpz" (OuterVolumeSpecName: "kube-api-access-rxhpz") pod "a419b263-d0db-40f1-a862-027595731917" (UID: "a419b263-d0db-40f1-a862-027595731917"). InnerVolumeSpecName "kube-api-access-rxhpz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:10:31 crc kubenswrapper[5184]: I0312 17:10:31.342458 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a419b263-d0db-40f1-a862-027595731917-scripts" (OuterVolumeSpecName: "scripts") pod "a419b263-d0db-40f1-a862-027595731917" (UID: "a419b263-d0db-40f1-a862-027595731917"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:10:31 crc kubenswrapper[5184]: I0312 17:10:31.358035 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a419b263-d0db-40f1-a862-027595731917-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a419b263-d0db-40f1-a862-027595731917" (UID: "a419b263-d0db-40f1-a862-027595731917"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:10:31 crc kubenswrapper[5184]: I0312 17:10:31.432859 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rxhpz\" (UniqueName: \"kubernetes.io/projected/a419b263-d0db-40f1-a862-027595731917-kube-api-access-rxhpz\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:31 crc kubenswrapper[5184]: I0312 17:10:31.432893 5184 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a419b263-d0db-40f1-a862-027595731917-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:31 crc kubenswrapper[5184]: I0312 17:10:31.432907 5184 reconciler_common.go:299] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a419b263-d0db-40f1-a862-027595731917-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:31 crc kubenswrapper[5184]: I0312 17:10:31.445027 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a419b263-d0db-40f1-a862-027595731917-config-data" (OuterVolumeSpecName: "config-data") pod 
"a419b263-d0db-40f1-a862-027595731917" (UID: "a419b263-d0db-40f1-a862-027595731917"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:10:31 crc kubenswrapper[5184]: I0312 17:10:31.446574 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a419b263-d0db-40f1-a862-027595731917-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a419b263-d0db-40f1-a862-027595731917" (UID: "a419b263-d0db-40f1-a862-027595731917"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:10:31 crc kubenswrapper[5184]: I0312 17:10:31.535614 5184 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a419b263-d0db-40f1-a862-027595731917-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:31 crc kubenswrapper[5184]: I0312 17:10:31.536074 5184 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a419b263-d0db-40f1-a862-027595731917-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.055942 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.055933 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a419b263-d0db-40f1-a862-027595731917","Type":"ContainerDied","Data":"5af9540be83eafa63a843c72682ba49d381a70c011fa1af740b019dd834a0220"} Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.056310 5184 scope.go:117] "RemoveContainer" containerID="096f5c84117191047fd06166c50e8b3dc3d0b9de76314786c76c79f524c452f1" Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.088012 5184 scope.go:117] "RemoveContainer" containerID="4da79caa3365e8991748944548b55986abb71f9127a45240530e849efd2cc26a" Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.128345 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.140829 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.143834 5184 scope.go:117] "RemoveContainer" containerID="98e2c6a350e30fed727d4b1decdc981377842856f1d2d3b532db5bf316fb7213" Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.148270 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.149981 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a419b263-d0db-40f1-a862-027595731917" containerName="ceilometer-notification-agent" Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.150020 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="a419b263-d0db-40f1-a862-027595731917" containerName="ceilometer-notification-agent" Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.150080 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a419b263-d0db-40f1-a862-027595731917" containerName="sg-core" Mar 12 17:10:32 crc 
kubenswrapper[5184]: I0312 17:10:32.150093 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="a419b263-d0db-40f1-a862-027595731917" containerName="sg-core" Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.150135 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a419b263-d0db-40f1-a862-027595731917" containerName="ceilometer-central-agent" Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.150148 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="a419b263-d0db-40f1-a862-027595731917" containerName="ceilometer-central-agent" Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.150168 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a419b263-d0db-40f1-a862-027595731917" containerName="proxy-httpd" Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.150178 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="a419b263-d0db-40f1-a862-027595731917" containerName="proxy-httpd" Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.150506 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="a419b263-d0db-40f1-a862-027595731917" containerName="sg-core" Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.150544 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="a419b263-d0db-40f1-a862-027595731917" containerName="ceilometer-notification-agent" Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.150568 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="a419b263-d0db-40f1-a862-027595731917" containerName="ceilometer-central-agent" Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.150581 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="a419b263-d0db-40f1-a862-027595731917" containerName="proxy-httpd" Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.167979 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 17:10:32 crc 
kubenswrapper[5184]: I0312 17:10:32.168112 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.170958 5184 scope.go:117] "RemoveContainer" containerID="161634025e2e294517402973a94374132631ce713a732744d7cdb3fae6ae8772" Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.182062 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ceilometer-scripts\"" Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.183474 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ceilometer-config-data\"" Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.247622 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12ba21e1-9b66-4713-9374-97ec0c9dd749-scripts\") pod \"ceilometer-0\" (UID: \"12ba21e1-9b66-4713-9374-97ec0c9dd749\") " pod="openstack/ceilometer-0" Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.247755 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ba21e1-9b66-4713-9374-97ec0c9dd749-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"12ba21e1-9b66-4713-9374-97ec0c9dd749\") " pod="openstack/ceilometer-0" Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.247861 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlfvg\" (UniqueName: \"kubernetes.io/projected/12ba21e1-9b66-4713-9374-97ec0c9dd749-kube-api-access-xlfvg\") pod \"ceilometer-0\" (UID: \"12ba21e1-9b66-4713-9374-97ec0c9dd749\") " pod="openstack/ceilometer-0" Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.247895 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/12ba21e1-9b66-4713-9374-97ec0c9dd749-config-data\") pod \"ceilometer-0\" (UID: \"12ba21e1-9b66-4713-9374-97ec0c9dd749\") " pod="openstack/ceilometer-0" Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.247968 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/12ba21e1-9b66-4713-9374-97ec0c9dd749-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"12ba21e1-9b66-4713-9374-97ec0c9dd749\") " pod="openstack/ceilometer-0" Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.248021 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12ba21e1-9b66-4713-9374-97ec0c9dd749-log-httpd\") pod \"ceilometer-0\" (UID: \"12ba21e1-9b66-4713-9374-97ec0c9dd749\") " pod="openstack/ceilometer-0" Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.248194 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12ba21e1-9b66-4713-9374-97ec0c9dd749-run-httpd\") pod \"ceilometer-0\" (UID: \"12ba21e1-9b66-4713-9374-97ec0c9dd749\") " pod="openstack/ceilometer-0" Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.351159 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12ba21e1-9b66-4713-9374-97ec0c9dd749-scripts\") pod \"ceilometer-0\" (UID: \"12ba21e1-9b66-4713-9374-97ec0c9dd749\") " pod="openstack/ceilometer-0" Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.351345 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ba21e1-9b66-4713-9374-97ec0c9dd749-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"12ba21e1-9b66-4713-9374-97ec0c9dd749\") " 
pod="openstack/ceilometer-0" Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.351436 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xlfvg\" (UniqueName: \"kubernetes.io/projected/12ba21e1-9b66-4713-9374-97ec0c9dd749-kube-api-access-xlfvg\") pod \"ceilometer-0\" (UID: \"12ba21e1-9b66-4713-9374-97ec0c9dd749\") " pod="openstack/ceilometer-0" Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.351482 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ba21e1-9b66-4713-9374-97ec0c9dd749-config-data\") pod \"ceilometer-0\" (UID: \"12ba21e1-9b66-4713-9374-97ec0c9dd749\") " pod="openstack/ceilometer-0" Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.351571 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/12ba21e1-9b66-4713-9374-97ec0c9dd749-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"12ba21e1-9b66-4713-9374-97ec0c9dd749\") " pod="openstack/ceilometer-0" Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.351815 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12ba21e1-9b66-4713-9374-97ec0c9dd749-log-httpd\") pod \"ceilometer-0\" (UID: \"12ba21e1-9b66-4713-9374-97ec0c9dd749\") " pod="openstack/ceilometer-0" Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.352082 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12ba21e1-9b66-4713-9374-97ec0c9dd749-run-httpd\") pod \"ceilometer-0\" (UID: \"12ba21e1-9b66-4713-9374-97ec0c9dd749\") " pod="openstack/ceilometer-0" Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.352880 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/12ba21e1-9b66-4713-9374-97ec0c9dd749-log-httpd\") pod \"ceilometer-0\" (UID: \"12ba21e1-9b66-4713-9374-97ec0c9dd749\") " pod="openstack/ceilometer-0" Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.353059 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12ba21e1-9b66-4713-9374-97ec0c9dd749-run-httpd\") pod \"ceilometer-0\" (UID: \"12ba21e1-9b66-4713-9374-97ec0c9dd749\") " pod="openstack/ceilometer-0" Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.357480 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/12ba21e1-9b66-4713-9374-97ec0c9dd749-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"12ba21e1-9b66-4713-9374-97ec0c9dd749\") " pod="openstack/ceilometer-0" Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.359333 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ba21e1-9b66-4713-9374-97ec0c9dd749-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"12ba21e1-9b66-4713-9374-97ec0c9dd749\") " pod="openstack/ceilometer-0" Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.373006 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12ba21e1-9b66-4713-9374-97ec0c9dd749-scripts\") pod \"ceilometer-0\" (UID: \"12ba21e1-9b66-4713-9374-97ec0c9dd749\") " pod="openstack/ceilometer-0" Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.378172 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ba21e1-9b66-4713-9374-97ec0c9dd749-config-data\") pod \"ceilometer-0\" (UID: \"12ba21e1-9b66-4713-9374-97ec0c9dd749\") " pod="openstack/ceilometer-0" Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.379315 5184 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xlfvg\" (UniqueName: \"kubernetes.io/projected/12ba21e1-9b66-4713-9374-97ec0c9dd749-kube-api-access-xlfvg\") pod \"ceilometer-0\" (UID: \"12ba21e1-9b66-4713-9374-97ec0c9dd749\") " pod="openstack/ceilometer-0" Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.410955 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a419b263-d0db-40f1-a862-027595731917" path="/var/lib/kubelet/pods/a419b263-d0db-40f1-a862-027595731917/volumes" Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.512530 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 17:10:32 crc kubenswrapper[5184]: I0312 17:10:32.985913 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 17:10:33 crc kubenswrapper[5184]: W0312 17:10:33.000025 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12ba21e1_9b66_4713_9374_97ec0c9dd749.slice/crio-29386866818a11ade90cd289c89106ea472473d8b2b137159a780ed4181e27b1 WatchSource:0}: Error finding container 29386866818a11ade90cd289c89106ea472473d8b2b137159a780ed4181e27b1: Status 404 returned error can't find the container with id 29386866818a11ade90cd289c89106ea472473d8b2b137159a780ed4181e27b1 Mar 12 17:10:33 crc kubenswrapper[5184]: I0312 17:10:33.068055 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12ba21e1-9b66-4713-9374-97ec0c9dd749","Type":"ContainerStarted","Data":"29386866818a11ade90cd289c89106ea472473d8b2b137159a780ed4181e27b1"} Mar 12 17:10:34 crc kubenswrapper[5184]: I0312 17:10:34.080326 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12ba21e1-9b66-4713-9374-97ec0c9dd749","Type":"ContainerStarted","Data":"627a62c9e016b7d3032b119feaffb76b7523cdf6b46fcf7b023c8af26bdd3787"} Mar 12 17:10:35 crc 
kubenswrapper[5184]: I0312 17:10:35.092513 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12ba21e1-9b66-4713-9374-97ec0c9dd749","Type":"ContainerStarted","Data":"34247514e1a4e1de5dceca3077b32525b31839cd357cb12571bd1fcc7fc2bc56"} Mar 12 17:10:36 crc kubenswrapper[5184]: I0312 17:10:36.096516 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 12 17:10:36 crc kubenswrapper[5184]: I0312 17:10:36.609262 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-z94sz"] Mar 12 17:10:36 crc kubenswrapper[5184]: I0312 17:10:36.616628 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-z94sz" Mar 12 17:10:36 crc kubenswrapper[5184]: I0312 17:10:36.620207 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell0-manage-config-data\"" Mar 12 17:10:36 crc kubenswrapper[5184]: I0312 17:10:36.631544 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell0-manage-scripts\"" Mar 12 17:10:36 crc kubenswrapper[5184]: I0312 17:10:36.656831 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-z94sz"] Mar 12 17:10:36 crc kubenswrapper[5184]: I0312 17:10:36.740013 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf6td\" (UniqueName: \"kubernetes.io/projected/c9e6df0a-f516-4f34-bd84-d32cea82a0ed-kube-api-access-hf6td\") pod \"nova-cell0-cell-mapping-z94sz\" (UID: \"c9e6df0a-f516-4f34-bd84-d32cea82a0ed\") " pod="openstack/nova-cell0-cell-mapping-z94sz" Mar 12 17:10:36 crc kubenswrapper[5184]: I0312 17:10:36.740144 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c9e6df0a-f516-4f34-bd84-d32cea82a0ed-scripts\") pod \"nova-cell0-cell-mapping-z94sz\" (UID: \"c9e6df0a-f516-4f34-bd84-d32cea82a0ed\") " pod="openstack/nova-cell0-cell-mapping-z94sz" Mar 12 17:10:36 crc kubenswrapper[5184]: I0312 17:10:36.740239 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e6df0a-f516-4f34-bd84-d32cea82a0ed-config-data\") pod \"nova-cell0-cell-mapping-z94sz\" (UID: \"c9e6df0a-f516-4f34-bd84-d32cea82a0ed\") " pod="openstack/nova-cell0-cell-mapping-z94sz" Mar 12 17:10:36 crc kubenswrapper[5184]: I0312 17:10:36.740400 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e6df0a-f516-4f34-bd84-d32cea82a0ed-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-z94sz\" (UID: \"c9e6df0a-f516-4f34-bd84-d32cea82a0ed\") " pod="openstack/nova-cell0-cell-mapping-z94sz" Mar 12 17:10:36 crc kubenswrapper[5184]: I0312 17:10:36.771273 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 17:10:36 crc kubenswrapper[5184]: I0312 17:10:36.787539 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 12 17:10:36 crc kubenswrapper[5184]: I0312 17:10:36.791818 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-scheduler-config-data\"" Mar 12 17:10:36 crc kubenswrapper[5184]: I0312 17:10:36.813095 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 17:10:36 crc kubenswrapper[5184]: I0312 17:10:36.833593 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 12 17:10:36 crc kubenswrapper[5184]: I0312 17:10:36.844353 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9e6df0a-f516-4f34-bd84-d32cea82a0ed-scripts\") pod \"nova-cell0-cell-mapping-z94sz\" (UID: \"c9e6df0a-f516-4f34-bd84-d32cea82a0ed\") " pod="openstack/nova-cell0-cell-mapping-z94sz" Mar 12 17:10:36 crc kubenswrapper[5184]: I0312 17:10:36.844431 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e6df0a-f516-4f34-bd84-d32cea82a0ed-config-data\") pod \"nova-cell0-cell-mapping-z94sz\" (UID: \"c9e6df0a-f516-4f34-bd84-d32cea82a0ed\") " pod="openstack/nova-cell0-cell-mapping-z94sz" Mar 12 17:10:36 crc kubenswrapper[5184]: I0312 17:10:36.844488 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e6df0a-f516-4f34-bd84-d32cea82a0ed-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-z94sz\" (UID: \"c9e6df0a-f516-4f34-bd84-d32cea82a0ed\") " pod="openstack/nova-cell0-cell-mapping-z94sz" Mar 12 17:10:36 crc kubenswrapper[5184]: I0312 17:10:36.844573 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hf6td\" (UniqueName: \"kubernetes.io/projected/c9e6df0a-f516-4f34-bd84-d32cea82a0ed-kube-api-access-hf6td\") pod 
\"nova-cell0-cell-mapping-z94sz\" (UID: \"c9e6df0a-f516-4f34-bd84-d32cea82a0ed\") " pod="openstack/nova-cell0-cell-mapping-z94sz" Mar 12 17:10:36 crc kubenswrapper[5184]: I0312 17:10:36.857229 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e6df0a-f516-4f34-bd84-d32cea82a0ed-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-z94sz\" (UID: \"c9e6df0a-f516-4f34-bd84-d32cea82a0ed\") " pod="openstack/nova-cell0-cell-mapping-z94sz" Mar 12 17:10:36 crc kubenswrapper[5184]: I0312 17:10:36.857911 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e6df0a-f516-4f34-bd84-d32cea82a0ed-config-data\") pod \"nova-cell0-cell-mapping-z94sz\" (UID: \"c9e6df0a-f516-4f34-bd84-d32cea82a0ed\") " pod="openstack/nova-cell0-cell-mapping-z94sz" Mar 12 17:10:36 crc kubenswrapper[5184]: I0312 17:10:36.861627 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 17:10:36 crc kubenswrapper[5184]: I0312 17:10:36.869126 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-metadata-config-data\"" Mar 12 17:10:36 crc kubenswrapper[5184]: I0312 17:10:36.875969 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9e6df0a-f516-4f34-bd84-d32cea82a0ed-scripts\") pod \"nova-cell0-cell-mapping-z94sz\" (UID: \"c9e6df0a-f516-4f34-bd84-d32cea82a0ed\") " pod="openstack/nova-cell0-cell-mapping-z94sz" Mar 12 17:10:36 crc kubenswrapper[5184]: I0312 17:10:36.890974 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 17:10:36 crc kubenswrapper[5184]: I0312 17:10:36.909090 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 12 17:10:36 crc kubenswrapper[5184]: I0312 17:10:36.931250 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf6td\" (UniqueName: \"kubernetes.io/projected/c9e6df0a-f516-4f34-bd84-d32cea82a0ed-kube-api-access-hf6td\") pod \"nova-cell0-cell-mapping-z94sz\" (UID: \"c9e6df0a-f516-4f34-bd84-d32cea82a0ed\") " pod="openstack/nova-cell0-cell-mapping-z94sz" Mar 12 17:10:36 crc kubenswrapper[5184]: I0312 17:10:36.949572 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 12 17:10:36 crc kubenswrapper[5184]: I0312 17:10:36.949705 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 12 17:10:36 crc kubenswrapper[5184]: I0312 17:10:36.951922 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell1-novncproxy-config-data\"" Mar 12 17:10:36 crc kubenswrapper[5184]: I0312 17:10:36.956650 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-z94sz" Mar 12 17:10:36 crc kubenswrapper[5184]: I0312 17:10:36.957638 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d09e0a6f-89c0-466f-93e6-3831659f0613-config-data\") pod \"nova-scheduler-0\" (UID: \"d09e0a6f-89c0-466f-93e6-3831659f0613\") " pod="openstack/nova-scheduler-0" Mar 12 17:10:36 crc kubenswrapper[5184]: I0312 17:10:36.957744 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mb7j\" (UniqueName: \"kubernetes.io/projected/d09e0a6f-89c0-466f-93e6-3831659f0613-kube-api-access-7mb7j\") pod \"nova-scheduler-0\" (UID: \"d09e0a6f-89c0-466f-93e6-3831659f0613\") " pod="openstack/nova-scheduler-0" Mar 12 17:10:36 crc kubenswrapper[5184]: I0312 17:10:36.957784 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d09e0a6f-89c0-466f-93e6-3831659f0613-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d09e0a6f-89c0-466f-93e6-3831659f0613\") " pod="openstack/nova-scheduler-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.025226 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8458c54c8c-8c75q"] Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.065926 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19af59d7-befe-45e1-8ab0-33ed5b5ddbf7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"19af59d7-befe-45e1-8ab0-33ed5b5ddbf7\") " pod="openstack/nova-metadata-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.065969 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/19af59d7-befe-45e1-8ab0-33ed5b5ddbf7-config-data\") pod \"nova-metadata-0\" (UID: \"19af59d7-befe-45e1-8ab0-33ed5b5ddbf7\") " pod="openstack/nova-metadata-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.065994 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zfml\" (UniqueName: \"kubernetes.io/projected/19af59d7-befe-45e1-8ab0-33ed5b5ddbf7-kube-api-access-5zfml\") pod \"nova-metadata-0\" (UID: \"19af59d7-befe-45e1-8ab0-33ed5b5ddbf7\") " pod="openstack/nova-metadata-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.066025 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19af59d7-befe-45e1-8ab0-33ed5b5ddbf7-logs\") pod \"nova-metadata-0\" (UID: \"19af59d7-befe-45e1-8ab0-33ed5b5ddbf7\") " pod="openstack/nova-metadata-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.066047 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02994126-30bf-4b42-be17-a1fdb7ad571a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"02994126-30bf-4b42-be17-a1fdb7ad571a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.066082 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d09e0a6f-89c0-466f-93e6-3831659f0613-config-data\") pod \"nova-scheduler-0\" (UID: \"d09e0a6f-89c0-466f-93e6-3831659f0613\") " pod="openstack/nova-scheduler-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.066108 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgf8j\" (UniqueName: \"kubernetes.io/projected/02994126-30bf-4b42-be17-a1fdb7ad571a-kube-api-access-zgf8j\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"02994126-30bf-4b42-be17-a1fdb7ad571a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.066174 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7mb7j\" (UniqueName: \"kubernetes.io/projected/d09e0a6f-89c0-466f-93e6-3831659f0613-kube-api-access-7mb7j\") pod \"nova-scheduler-0\" (UID: \"d09e0a6f-89c0-466f-93e6-3831659f0613\") " pod="openstack/nova-scheduler-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.066193 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02994126-30bf-4b42-be17-a1fdb7ad571a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"02994126-30bf-4b42-be17-a1fdb7ad571a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.066222 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d09e0a6f-89c0-466f-93e6-3831659f0613-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d09e0a6f-89c0-466f-93e6-3831659f0613\") " pod="openstack/nova-scheduler-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.075852 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8458c54c8c-8c75q"] Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.076203 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8458c54c8c-8c75q" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.084910 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d09e0a6f-89c0-466f-93e6-3831659f0613-config-data\") pod \"nova-scheduler-0\" (UID: \"d09e0a6f-89c0-466f-93e6-3831659f0613\") " pod="openstack/nova-scheduler-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.091209 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d09e0a6f-89c0-466f-93e6-3831659f0613-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d09e0a6f-89c0-466f-93e6-3831659f0613\") " pod="openstack/nova-scheduler-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.104292 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.112938 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mb7j\" (UniqueName: \"kubernetes.io/projected/d09e0a6f-89c0-466f-93e6-3831659f0613-kube-api-access-7mb7j\") pod \"nova-scheduler-0\" (UID: \"d09e0a6f-89c0-466f-93e6-3831659f0613\") " pod="openstack/nova-scheduler-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.113285 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.167600 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19af59d7-befe-45e1-8ab0-33ed5b5ddbf7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"19af59d7-befe-45e1-8ab0-33ed5b5ddbf7\") " pod="openstack/nova-metadata-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.167640 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19af59d7-befe-45e1-8ab0-33ed5b5ddbf7-config-data\") pod \"nova-metadata-0\" (UID: \"19af59d7-befe-45e1-8ab0-33ed5b5ddbf7\") " pod="openstack/nova-metadata-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.167660 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5zfml\" (UniqueName: \"kubernetes.io/projected/19af59d7-befe-45e1-8ab0-33ed5b5ddbf7-kube-api-access-5zfml\") pod \"nova-metadata-0\" (UID: \"19af59d7-befe-45e1-8ab0-33ed5b5ddbf7\") " pod="openstack/nova-metadata-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.167694 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19af59d7-befe-45e1-8ab0-33ed5b5ddbf7-logs\") pod \"nova-metadata-0\" (UID: \"19af59d7-befe-45e1-8ab0-33ed5b5ddbf7\") " pod="openstack/nova-metadata-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.167717 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02994126-30bf-4b42-be17-a1fdb7ad571a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"02994126-30bf-4b42-be17-a1fdb7ad571a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.167760 5184 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zgf8j\" (UniqueName: \"kubernetes.io/projected/02994126-30bf-4b42-be17-a1fdb7ad571a-kube-api-access-zgf8j\") pod \"nova-cell1-novncproxy-0\" (UID: \"02994126-30bf-4b42-be17-a1fdb7ad571a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.167827 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02994126-30bf-4b42-be17-a1fdb7ad571a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"02994126-30bf-4b42-be17-a1fdb7ad571a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.178239 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19af59d7-befe-45e1-8ab0-33ed5b5ddbf7-logs\") pod \"nova-metadata-0\" (UID: \"19af59d7-befe-45e1-8ab0-33ed5b5ddbf7\") " pod="openstack/nova-metadata-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.178400 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02994126-30bf-4b42-be17-a1fdb7ad571a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"02994126-30bf-4b42-be17-a1fdb7ad571a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.188278 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02994126-30bf-4b42-be17-a1fdb7ad571a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"02994126-30bf-4b42-be17-a1fdb7ad571a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.196114 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19af59d7-befe-45e1-8ab0-33ed5b5ddbf7-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"19af59d7-befe-45e1-8ab0-33ed5b5ddbf7\") " pod="openstack/nova-metadata-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.215960 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19af59d7-befe-45e1-8ab0-33ed5b5ddbf7-config-data\") pod \"nova-metadata-0\" (UID: \"19af59d7-befe-45e1-8ab0-33ed5b5ddbf7\") " pod="openstack/nova-metadata-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.220277 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgf8j\" (UniqueName: \"kubernetes.io/projected/02994126-30bf-4b42-be17-a1fdb7ad571a-kube-api-access-zgf8j\") pod \"nova-cell1-novncproxy-0\" (UID: \"02994126-30bf-4b42-be17-a1fdb7ad571a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.223601 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zfml\" (UniqueName: \"kubernetes.io/projected/19af59d7-befe-45e1-8ab0-33ed5b5ddbf7-kube-api-access-5zfml\") pod \"nova-metadata-0\" (UID: \"19af59d7-befe-45e1-8ab0-33ed5b5ddbf7\") " pod="openstack/nova-metadata-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.258697 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12ba21e1-9b66-4713-9374-97ec0c9dd749","Type":"ContainerStarted","Data":"2aaa4626a5ecff9c133f8b359189360dc610f2e41153643462abb5bc0e2411b4"} Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.259059 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.258843 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.263801 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-api-config-data\"" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.271108 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/051f761e-6e70-40a2-a1ac-55d668527483-config\") pod \"dnsmasq-dns-8458c54c8c-8c75q\" (UID: \"051f761e-6e70-40a2-a1ac-55d668527483\") " pod="openstack/dnsmasq-dns-8458c54c8c-8c75q" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.271207 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntccv\" (UniqueName: \"kubernetes.io/projected/051f761e-6e70-40a2-a1ac-55d668527483-kube-api-access-ntccv\") pod \"dnsmasq-dns-8458c54c8c-8c75q\" (UID: \"051f761e-6e70-40a2-a1ac-55d668527483\") " pod="openstack/dnsmasq-dns-8458c54c8c-8c75q" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.271281 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/051f761e-6e70-40a2-a1ac-55d668527483-ovsdbserver-sb\") pod \"dnsmasq-dns-8458c54c8c-8c75q\" (UID: \"051f761e-6e70-40a2-a1ac-55d668527483\") " pod="openstack/dnsmasq-dns-8458c54c8c-8c75q" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.271938 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/051f761e-6e70-40a2-a1ac-55d668527483-dns-svc\") pod \"dnsmasq-dns-8458c54c8c-8c75q\" (UID: \"051f761e-6e70-40a2-a1ac-55d668527483\") " pod="openstack/dnsmasq-dns-8458c54c8c-8c75q" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.271962 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/051f761e-6e70-40a2-a1ac-55d668527483-ovsdbserver-nb\") pod \"dnsmasq-dns-8458c54c8c-8c75q\" (UID: \"051f761e-6e70-40a2-a1ac-55d668527483\") " pod="openstack/dnsmasq-dns-8458c54c8c-8c75q" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.272066 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/051f761e-6e70-40a2-a1ac-55d668527483-dns-swift-storage-0\") pod \"dnsmasq-dns-8458c54c8c-8c75q\" (UID: \"051f761e-6e70-40a2-a1ac-55d668527483\") " pod="openstack/dnsmasq-dns-8458c54c8c-8c75q" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.279791 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.296763 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.373622 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/051f761e-6e70-40a2-a1ac-55d668527483-ovsdbserver-sb\") pod \"dnsmasq-dns-8458c54c8c-8c75q\" (UID: \"051f761e-6e70-40a2-a1ac-55d668527483\") " pod="openstack/dnsmasq-dns-8458c54c8c-8c75q" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.373700 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmg7r\" (UniqueName: \"kubernetes.io/projected/63387085-af7f-404d-bfce-0df2471fbad4-kube-api-access-dmg7r\") pod \"nova-api-0\" (UID: \"63387085-af7f-404d-bfce-0df2471fbad4\") " pod="openstack/nova-api-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.373736 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/63387085-af7f-404d-bfce-0df2471fbad4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"63387085-af7f-404d-bfce-0df2471fbad4\") " pod="openstack/nova-api-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.373805 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63387085-af7f-404d-bfce-0df2471fbad4-logs\") pod \"nova-api-0\" (UID: \"63387085-af7f-404d-bfce-0df2471fbad4\") " pod="openstack/nova-api-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.373849 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/051f761e-6e70-40a2-a1ac-55d668527483-dns-svc\") pod \"dnsmasq-dns-8458c54c8c-8c75q\" (UID: \"051f761e-6e70-40a2-a1ac-55d668527483\") " pod="openstack/dnsmasq-dns-8458c54c8c-8c75q" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.373870 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/051f761e-6e70-40a2-a1ac-55d668527483-ovsdbserver-nb\") pod \"dnsmasq-dns-8458c54c8c-8c75q\" (UID: \"051f761e-6e70-40a2-a1ac-55d668527483\") " pod="openstack/dnsmasq-dns-8458c54c8c-8c75q" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.373933 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/051f761e-6e70-40a2-a1ac-55d668527483-dns-swift-storage-0\") pod \"dnsmasq-dns-8458c54c8c-8c75q\" (UID: \"051f761e-6e70-40a2-a1ac-55d668527483\") " pod="openstack/dnsmasq-dns-8458c54c8c-8c75q" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.373955 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63387085-af7f-404d-bfce-0df2471fbad4-config-data\") pod \"nova-api-0\" (UID: 
\"63387085-af7f-404d-bfce-0df2471fbad4\") " pod="openstack/nova-api-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.373996 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/051f761e-6e70-40a2-a1ac-55d668527483-config\") pod \"dnsmasq-dns-8458c54c8c-8c75q\" (UID: \"051f761e-6e70-40a2-a1ac-55d668527483\") " pod="openstack/dnsmasq-dns-8458c54c8c-8c75q" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.374037 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ntccv\" (UniqueName: \"kubernetes.io/projected/051f761e-6e70-40a2-a1ac-55d668527483-kube-api-access-ntccv\") pod \"dnsmasq-dns-8458c54c8c-8c75q\" (UID: \"051f761e-6e70-40a2-a1ac-55d668527483\") " pod="openstack/dnsmasq-dns-8458c54c8c-8c75q" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.375254 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/051f761e-6e70-40a2-a1ac-55d668527483-ovsdbserver-sb\") pod \"dnsmasq-dns-8458c54c8c-8c75q\" (UID: \"051f761e-6e70-40a2-a1ac-55d668527483\") " pod="openstack/dnsmasq-dns-8458c54c8c-8c75q" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.376117 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/051f761e-6e70-40a2-a1ac-55d668527483-dns-swift-storage-0\") pod \"dnsmasq-dns-8458c54c8c-8c75q\" (UID: \"051f761e-6e70-40a2-a1ac-55d668527483\") " pod="openstack/dnsmasq-dns-8458c54c8c-8c75q" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.376363 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/051f761e-6e70-40a2-a1ac-55d668527483-dns-svc\") pod \"dnsmasq-dns-8458c54c8c-8c75q\" (UID: \"051f761e-6e70-40a2-a1ac-55d668527483\") " pod="openstack/dnsmasq-dns-8458c54c8c-8c75q" Mar 12 
17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.376595 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/051f761e-6e70-40a2-a1ac-55d668527483-config\") pod \"dnsmasq-dns-8458c54c8c-8c75q\" (UID: \"051f761e-6e70-40a2-a1ac-55d668527483\") " pod="openstack/dnsmasq-dns-8458c54c8c-8c75q" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.376761 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/051f761e-6e70-40a2-a1ac-55d668527483-ovsdbserver-nb\") pod \"dnsmasq-dns-8458c54c8c-8c75q\" (UID: \"051f761e-6e70-40a2-a1ac-55d668527483\") " pod="openstack/dnsmasq-dns-8458c54c8c-8c75q" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.395527 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntccv\" (UniqueName: \"kubernetes.io/projected/051f761e-6e70-40a2-a1ac-55d668527483-kube-api-access-ntccv\") pod \"dnsmasq-dns-8458c54c8c-8c75q\" (UID: \"051f761e-6e70-40a2-a1ac-55d668527483\") " pod="openstack/dnsmasq-dns-8458c54c8c-8c75q" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.475626 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dmg7r\" (UniqueName: \"kubernetes.io/projected/63387085-af7f-404d-bfce-0df2471fbad4-kube-api-access-dmg7r\") pod \"nova-api-0\" (UID: \"63387085-af7f-404d-bfce-0df2471fbad4\") " pod="openstack/nova-api-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.475680 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63387085-af7f-404d-bfce-0df2471fbad4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"63387085-af7f-404d-bfce-0df2471fbad4\") " pod="openstack/nova-api-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.475718 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63387085-af7f-404d-bfce-0df2471fbad4-logs\") pod \"nova-api-0\" (UID: \"63387085-af7f-404d-bfce-0df2471fbad4\") " pod="openstack/nova-api-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.475811 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63387085-af7f-404d-bfce-0df2471fbad4-config-data\") pod \"nova-api-0\" (UID: \"63387085-af7f-404d-bfce-0df2471fbad4\") " pod="openstack/nova-api-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.487799 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63387085-af7f-404d-bfce-0df2471fbad4-logs\") pod \"nova-api-0\" (UID: \"63387085-af7f-404d-bfce-0df2471fbad4\") " pod="openstack/nova-api-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.513420 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63387085-af7f-404d-bfce-0df2471fbad4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"63387085-af7f-404d-bfce-0df2471fbad4\") " pod="openstack/nova-api-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.515822 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63387085-af7f-404d-bfce-0df2471fbad4-config-data\") pod \"nova-api-0\" (UID: \"63387085-af7f-404d-bfce-0df2471fbad4\") " pod="openstack/nova-api-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.517634 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmg7r\" (UniqueName: \"kubernetes.io/projected/63387085-af7f-404d-bfce-0df2471fbad4-kube-api-access-dmg7r\") pod \"nova-api-0\" (UID: \"63387085-af7f-404d-bfce-0df2471fbad4\") " pod="openstack/nova-api-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.557669 5184 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8458c54c8c-8c75q" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.581387 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.639993 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-z94sz"] Mar 12 17:10:37 crc kubenswrapper[5184]: W0312 17:10:37.665848 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9e6df0a_f516_4f34_bd84_d32cea82a0ed.slice/crio-3aa05f6f94c43966b89e44cd00dea8636ddc5a7906ee450b887c67acddd68b20 WatchSource:0}: Error finding container 3aa05f6f94c43966b89e44cd00dea8636ddc5a7906ee450b887c67acddd68b20: Status 404 returned error can't find the container with id 3aa05f6f94c43966b89e44cd00dea8636ddc5a7906ee450b887c67acddd68b20 Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.688653 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 17:10:37 crc kubenswrapper[5184]: W0312 17:10:37.689507 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd09e0a6f_89c0_466f_93e6_3831659f0613.slice/crio-aadd7a5c96a04e79659298f42da90d211813790206986ac7c030f700ab505ee5 WatchSource:0}: Error finding container aadd7a5c96a04e79659298f42da90d211813790206986ac7c030f700ab505ee5: Status 404 returned error can't find the container with id aadd7a5c96a04e79659298f42da90d211813790206986ac7c030f700ab505ee5 Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.880309 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zgwz8"] Mar 12 17:10:37 crc kubenswrapper[5184]: W0312 17:10:37.882666 5184 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19af59d7_befe_45e1_8ab0_33ed5b5ddbf7.slice/crio-28437eb7d8ca70a8760e230a673cd897f232259756be6583f4e654d2ab47ff01 WatchSource:0}: Error finding container 28437eb7d8ca70a8760e230a673cd897f232259756be6583f4e654d2ab47ff01: Status 404 returned error can't find the container with id 28437eb7d8ca70a8760e230a673cd897f232259756be6583f4e654d2ab47ff01 Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.900886 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.901082 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zgwz8" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.905088 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zgwz8"] Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.906756 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell1-conductor-scripts\"" Mar 12 17:10:37 crc kubenswrapper[5184]: I0312 17:10:37.906933 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell1-conductor-config-data\"" Mar 12 17:10:38 crc kubenswrapper[5184]: I0312 17:10:38.000751 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zgwz8\" (UID: \"95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad\") " pod="openstack/nova-cell1-conductor-db-sync-zgwz8" Mar 12 17:10:38 crc kubenswrapper[5184]: I0312 17:10:38.000798 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad-scripts\") pod 
\"nova-cell1-conductor-db-sync-zgwz8\" (UID: \"95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad\") " pod="openstack/nova-cell1-conductor-db-sync-zgwz8" Mar 12 17:10:38 crc kubenswrapper[5184]: I0312 17:10:38.000852 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sscwv\" (UniqueName: \"kubernetes.io/projected/95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad-kube-api-access-sscwv\") pod \"nova-cell1-conductor-db-sync-zgwz8\" (UID: \"95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad\") " pod="openstack/nova-cell1-conductor-db-sync-zgwz8" Mar 12 17:10:38 crc kubenswrapper[5184]: I0312 17:10:38.000879 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad-config-data\") pod \"nova-cell1-conductor-db-sync-zgwz8\" (UID: \"95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad\") " pod="openstack/nova-cell1-conductor-db-sync-zgwz8" Mar 12 17:10:38 crc kubenswrapper[5184]: I0312 17:10:38.008403 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 12 17:10:38 crc kubenswrapper[5184]: I0312 17:10:38.102771 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sscwv\" (UniqueName: \"kubernetes.io/projected/95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad-kube-api-access-sscwv\") pod \"nova-cell1-conductor-db-sync-zgwz8\" (UID: \"95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad\") " pod="openstack/nova-cell1-conductor-db-sync-zgwz8" Mar 12 17:10:38 crc kubenswrapper[5184]: I0312 17:10:38.102829 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad-config-data\") pod \"nova-cell1-conductor-db-sync-zgwz8\" (UID: \"95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad\") " pod="openstack/nova-cell1-conductor-db-sync-zgwz8" Mar 12 17:10:38 crc kubenswrapper[5184]: I0312 
17:10:38.102957 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zgwz8\" (UID: \"95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad\") " pod="openstack/nova-cell1-conductor-db-sync-zgwz8" Mar 12 17:10:38 crc kubenswrapper[5184]: I0312 17:10:38.102975 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad-scripts\") pod \"nova-cell1-conductor-db-sync-zgwz8\" (UID: \"95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad\") " pod="openstack/nova-cell1-conductor-db-sync-zgwz8" Mar 12 17:10:38 crc kubenswrapper[5184]: I0312 17:10:38.109113 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad-config-data\") pod \"nova-cell1-conductor-db-sync-zgwz8\" (UID: \"95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad\") " pod="openstack/nova-cell1-conductor-db-sync-zgwz8" Mar 12 17:10:38 crc kubenswrapper[5184]: I0312 17:10:38.122347 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad-scripts\") pod \"nova-cell1-conductor-db-sync-zgwz8\" (UID: \"95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad\") " pod="openstack/nova-cell1-conductor-db-sync-zgwz8" Mar 12 17:10:38 crc kubenswrapper[5184]: I0312 17:10:38.131032 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zgwz8\" (UID: \"95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad\") " pod="openstack/nova-cell1-conductor-db-sync-zgwz8" Mar 12 17:10:38 crc kubenswrapper[5184]: I0312 17:10:38.134041 5184 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-sscwv\" (UniqueName: \"kubernetes.io/projected/95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad-kube-api-access-sscwv\") pod \"nova-cell1-conductor-db-sync-zgwz8\" (UID: \"95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad\") " pod="openstack/nova-cell1-conductor-db-sync-zgwz8" Mar 12 17:10:38 crc kubenswrapper[5184]: I0312 17:10:38.196292 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-z94sz" event={"ID":"c9e6df0a-f516-4f34-bd84-d32cea82a0ed","Type":"ContainerStarted","Data":"6171b7e72d22708f031e33283a02382b94a60513db25d5c575c7fe34ea59191d"} Mar 12 17:10:38 crc kubenswrapper[5184]: I0312 17:10:38.196363 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-z94sz" event={"ID":"c9e6df0a-f516-4f34-bd84-d32cea82a0ed","Type":"ContainerStarted","Data":"3aa05f6f94c43966b89e44cd00dea8636ddc5a7906ee450b887c67acddd68b20"} Mar 12 17:10:38 crc kubenswrapper[5184]: I0312 17:10:38.197822 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8458c54c8c-8c75q"] Mar 12 17:10:38 crc kubenswrapper[5184]: I0312 17:10:38.206232 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d09e0a6f-89c0-466f-93e6-3831659f0613","Type":"ContainerStarted","Data":"aadd7a5c96a04e79659298f42da90d211813790206986ac7c030f700ab505ee5"} Mar 12 17:10:38 crc kubenswrapper[5184]: I0312 17:10:38.208213 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 12 17:10:38 crc kubenswrapper[5184]: I0312 17:10:38.218361 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"02994126-30bf-4b42-be17-a1fdb7ad571a","Type":"ContainerStarted","Data":"6cc88a833ecfc359cd015328b4d417715e56bc705b2944894a1c9c95e1c37d36"} Mar 12 17:10:38 crc kubenswrapper[5184]: I0312 17:10:38.219123 5184 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/nova-cell0-cell-mapping-z94sz" podStartSLOduration=2.219111574 podStartE2EDuration="2.219111574s" podCreationTimestamp="2026-03-12 17:10:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:10:38.218158094 +0000 UTC m=+1180.759469433" watchObservedRunningTime="2026-03-12 17:10:38.219111574 +0000 UTC m=+1180.760422913" Mar 12 17:10:38 crc kubenswrapper[5184]: I0312 17:10:38.228076 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"19af59d7-befe-45e1-8ab0-33ed5b5ddbf7","Type":"ContainerStarted","Data":"28437eb7d8ca70a8760e230a673cd897f232259756be6583f4e654d2ab47ff01"} Mar 12 17:10:38 crc kubenswrapper[5184]: I0312 17:10:38.231234 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zgwz8" Mar 12 17:10:38 crc kubenswrapper[5184]: I0312 17:10:38.784434 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zgwz8"] Mar 12 17:10:38 crc kubenswrapper[5184]: W0312 17:10:38.797123 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95c57d5a_7431_4ea7_b69b_f5f7ee50b3ad.slice/crio-c5a65c290102bec4b2a2dc999af3ab6c293eb3be93a86a6b4c34af8158c14022 WatchSource:0}: Error finding container c5a65c290102bec4b2a2dc999af3ab6c293eb3be93a86a6b4c34af8158c14022: Status 404 returned error can't find the container with id c5a65c290102bec4b2a2dc999af3ab6c293eb3be93a86a6b4c34af8158c14022 Mar 12 17:10:39 crc kubenswrapper[5184]: I0312 17:10:39.244028 5184 generic.go:358] "Generic (PLEG): container finished" podID="051f761e-6e70-40a2-a1ac-55d668527483" containerID="5644f2b28d31d310f0d59b4034dd49b4fdf04eba4ee022cdcdaff5b06136f268" exitCode=0 Mar 12 17:10:39 crc kubenswrapper[5184]: I0312 17:10:39.244153 5184 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-8458c54c8c-8c75q" event={"ID":"051f761e-6e70-40a2-a1ac-55d668527483","Type":"ContainerDied","Data":"5644f2b28d31d310f0d59b4034dd49b4fdf04eba4ee022cdcdaff5b06136f268"} Mar 12 17:10:39 crc kubenswrapper[5184]: I0312 17:10:39.249485 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8458c54c8c-8c75q" event={"ID":"051f761e-6e70-40a2-a1ac-55d668527483","Type":"ContainerStarted","Data":"ad7cdfa7a667d161e6487aac140d0b0bb662df15d7aa1b4d187d47a457b03608"} Mar 12 17:10:39 crc kubenswrapper[5184]: I0312 17:10:39.251268 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zgwz8" event={"ID":"95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad","Type":"ContainerStarted","Data":"c5a65c290102bec4b2a2dc999af3ab6c293eb3be93a86a6b4c34af8158c14022"} Mar 12 17:10:39 crc kubenswrapper[5184]: I0312 17:10:39.252802 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"63387085-af7f-404d-bfce-0df2471fbad4","Type":"ContainerStarted","Data":"4b12df73144d58187505415d2d7eb0b38621b6d974d9576b86848f55fe71ea3a"} Mar 12 17:10:39 crc kubenswrapper[5184]: I0312 17:10:39.264222 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12ba21e1-9b66-4713-9374-97ec0c9dd749","Type":"ContainerStarted","Data":"4e7b6885626b294ee38569a2d2efb980002fef1f150ce7488e61752ce28d2e6a"} Mar 12 17:10:39 crc kubenswrapper[5184]: I0312 17:10:39.265329 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/ceilometer-0" Mar 12 17:10:39 crc kubenswrapper[5184]: I0312 17:10:39.289714 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.798609832 podStartE2EDuration="7.289694725s" podCreationTimestamp="2026-03-12 17:10:32 +0000 UTC" firstStartedPulling="2026-03-12 17:10:33.001692791 +0000 UTC m=+1175.543004130" 
lastFinishedPulling="2026-03-12 17:10:38.492777684 +0000 UTC m=+1181.034089023" observedRunningTime="2026-03-12 17:10:39.279345994 +0000 UTC m=+1181.820657333" watchObservedRunningTime="2026-03-12 17:10:39.289694725 +0000 UTC m=+1181.831006064" Mar 12 17:10:40 crc kubenswrapper[5184]: I0312 17:10:40.434867 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 17:10:40 crc kubenswrapper[5184]: I0312 17:10:40.453841 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 12 17:10:41 crc kubenswrapper[5184]: I0312 17:10:41.292544 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zgwz8" event={"ID":"95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad","Type":"ContainerStarted","Data":"d436d59aca7fa024eb05f51de8407da845441cd17d53483239b41fef1e87b97a"} Mar 12 17:10:41 crc kubenswrapper[5184]: I0312 17:10:41.320398 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-zgwz8" podStartSLOduration=4.320358722 podStartE2EDuration="4.320358722s" podCreationTimestamp="2026-03-12 17:10:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:10:41.317188714 +0000 UTC m=+1183.858500073" watchObservedRunningTime="2026-03-12 17:10:41.320358722 +0000 UTC m=+1183.861670071" Mar 12 17:10:42 crc kubenswrapper[5184]: I0312 17:10:42.305417 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"02994126-30bf-4b42-be17-a1fdb7ad571a","Type":"ContainerStarted","Data":"0f57134c846eea015a8a8899f670d70d941602ffd3bea4efbec4f89dee571a88"} Mar 12 17:10:42 crc kubenswrapper[5184]: I0312 17:10:42.305500 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="02994126-30bf-4b42-be17-a1fdb7ad571a" 
containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://0f57134c846eea015a8a8899f670d70d941602ffd3bea4efbec4f89dee571a88" gracePeriod=30
Mar 12 17:10:42 crc kubenswrapper[5184]: I0312 17:10:42.325993 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"19af59d7-befe-45e1-8ab0-33ed5b5ddbf7","Type":"ContainerStarted","Data":"1b6c6f63b276a1ea3fbc4e983f185aaaa01a334b7d47508d0c0450fa8255b63d"}
Mar 12 17:10:42 crc kubenswrapper[5184]: I0312 17:10:42.331350 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8458c54c8c-8c75q" event={"ID":"051f761e-6e70-40a2-a1ac-55d668527483","Type":"ContainerStarted","Data":"18cdfeb1c9b8f3c0ce5b465139edc004b0b0133d44239b5e3d1d0f0f57966b9b"}
Mar 12 17:10:42 crc kubenswrapper[5184]: I0312 17:10:42.331437 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/dnsmasq-dns-8458c54c8c-8c75q"
Mar 12 17:10:42 crc kubenswrapper[5184]: I0312 17:10:42.333729 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"63387085-af7f-404d-bfce-0df2471fbad4","Type":"ContainerStarted","Data":"77f95d9aab4062457b2acaaccb5d419fb92830ec2e6f6cea676315a29c6b22cf"}
Mar 12 17:10:42 crc kubenswrapper[5184]: I0312 17:10:42.339407 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d09e0a6f-89c0-466f-93e6-3831659f0613","Type":"ContainerStarted","Data":"6d27b4ccdee5ad0e4b967ae1ea93f0b007f25aa89ae4261b4e5c7f01be9ed54f"}
Mar 12 17:10:42 crc kubenswrapper[5184]: I0312 17:10:42.346307 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.837157992 podStartE2EDuration="6.346285044s" podCreationTimestamp="2026-03-12 17:10:36 +0000 UTC" firstStartedPulling="2026-03-12 17:10:38.019508607 +0000 UTC m=+1180.560819946" lastFinishedPulling="2026-03-12 17:10:41.528635619 +0000 UTC m=+1184.069946998" observedRunningTime="2026-03-12 17:10:42.338811412 +0000 UTC m=+1184.880122751" watchObservedRunningTime="2026-03-12 17:10:42.346285044 +0000 UTC m=+1184.887596373"
Mar 12 17:10:42 crc kubenswrapper[5184]: I0312 17:10:42.373792 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8458c54c8c-8c75q" podStartSLOduration=6.37377216 podStartE2EDuration="6.37377216s" podCreationTimestamp="2026-03-12 17:10:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:10:42.366899276 +0000 UTC m=+1184.908210615" watchObservedRunningTime="2026-03-12 17:10:42.37377216 +0000 UTC m=+1184.915083499"
Mar 12 17:10:42 crc kubenswrapper[5184]: I0312 17:10:42.392796 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.570776381 podStartE2EDuration="6.392774391s" podCreationTimestamp="2026-03-12 17:10:36 +0000 UTC" firstStartedPulling="2026-03-12 17:10:37.706830914 +0000 UTC m=+1180.248142253" lastFinishedPulling="2026-03-12 17:10:41.528828924 +0000 UTC m=+1184.070140263" observedRunningTime="2026-03-12 17:10:42.385925417 +0000 UTC m=+1184.927236756" watchObservedRunningTime="2026-03-12 17:10:42.392774391 +0000 UTC m=+1184.934085730"
Mar 12 17:10:43 crc kubenswrapper[5184]: I0312 17:10:43.352828 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"19af59d7-befe-45e1-8ab0-33ed5b5ddbf7","Type":"ContainerStarted","Data":"404825ff6b8f4a0a1d6b217c8821e1a55051042e97b9e6c92ca887abc8d6df8f"}
Mar 12 17:10:43 crc kubenswrapper[5184]: I0312 17:10:43.352942 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="19af59d7-befe-45e1-8ab0-33ed5b5ddbf7" containerName="nova-metadata-log" containerID="cri-o://1b6c6f63b276a1ea3fbc4e983f185aaaa01a334b7d47508d0c0450fa8255b63d" gracePeriod=30
Mar 12 17:10:43 crc kubenswrapper[5184]: I0312 17:10:43.353116 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="19af59d7-befe-45e1-8ab0-33ed5b5ddbf7" containerName="nova-metadata-metadata" containerID="cri-o://404825ff6b8f4a0a1d6b217c8821e1a55051042e97b9e6c92ca887abc8d6df8f" gracePeriod=30
Mar 12 17:10:43 crc kubenswrapper[5184]: I0312 17:10:43.356810 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"63387085-af7f-404d-bfce-0df2471fbad4","Type":"ContainerStarted","Data":"3ae83fd738c85479041a117b03d4951b1f2f5a6ee0df6856d3de527b4299786c"}
Mar 12 17:10:43 crc kubenswrapper[5184]: I0312 17:10:43.380968 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.737532152 podStartE2EDuration="7.380945849s" podCreationTimestamp="2026-03-12 17:10:36 +0000 UTC" firstStartedPulling="2026-03-12 17:10:37.885134509 +0000 UTC m=+1180.426445848" lastFinishedPulling="2026-03-12 17:10:41.528548216 +0000 UTC m=+1184.069859545" observedRunningTime="2026-03-12 17:10:43.377784771 +0000 UTC m=+1185.919096120" watchObservedRunningTime="2026-03-12 17:10:43.380945849 +0000 UTC m=+1185.922257188"
Mar 12 17:10:43 crc kubenswrapper[5184]: I0312 17:10:43.407171 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.08183719 podStartE2EDuration="6.407150385s" podCreationTimestamp="2026-03-12 17:10:37 +0000 UTC" firstStartedPulling="2026-03-12 17:10:38.213055996 +0000 UTC m=+1180.754367335" lastFinishedPulling="2026-03-12 17:10:41.538369151 +0000 UTC m=+1184.079680530" observedRunningTime="2026-03-12 17:10:43.401589752 +0000 UTC m=+1185.942901101" watchObservedRunningTime="2026-03-12 17:10:43.407150385 +0000 UTC m=+1185.948461734"
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.002072 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.141238 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19af59d7-befe-45e1-8ab0-33ed5b5ddbf7-combined-ca-bundle\") pod \"19af59d7-befe-45e1-8ab0-33ed5b5ddbf7\" (UID: \"19af59d7-befe-45e1-8ab0-33ed5b5ddbf7\") "
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.141513 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19af59d7-befe-45e1-8ab0-33ed5b5ddbf7-config-data\") pod \"19af59d7-befe-45e1-8ab0-33ed5b5ddbf7\" (UID: \"19af59d7-befe-45e1-8ab0-33ed5b5ddbf7\") "
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.141605 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zfml\" (UniqueName: \"kubernetes.io/projected/19af59d7-befe-45e1-8ab0-33ed5b5ddbf7-kube-api-access-5zfml\") pod \"19af59d7-befe-45e1-8ab0-33ed5b5ddbf7\" (UID: \"19af59d7-befe-45e1-8ab0-33ed5b5ddbf7\") "
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.141636 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19af59d7-befe-45e1-8ab0-33ed5b5ddbf7-logs\") pod \"19af59d7-befe-45e1-8ab0-33ed5b5ddbf7\" (UID: \"19af59d7-befe-45e1-8ab0-33ed5b5ddbf7\") "
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.142459 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19af59d7-befe-45e1-8ab0-33ed5b5ddbf7-logs" (OuterVolumeSpecName: "logs") pod "19af59d7-befe-45e1-8ab0-33ed5b5ddbf7" (UID: "19af59d7-befe-45e1-8ab0-33ed5b5ddbf7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.147298 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19af59d7-befe-45e1-8ab0-33ed5b5ddbf7-kube-api-access-5zfml" (OuterVolumeSpecName: "kube-api-access-5zfml") pod "19af59d7-befe-45e1-8ab0-33ed5b5ddbf7" (UID: "19af59d7-befe-45e1-8ab0-33ed5b5ddbf7"). InnerVolumeSpecName "kube-api-access-5zfml". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.171396 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19af59d7-befe-45e1-8ab0-33ed5b5ddbf7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19af59d7-befe-45e1-8ab0-33ed5b5ddbf7" (UID: "19af59d7-befe-45e1-8ab0-33ed5b5ddbf7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.174002 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19af59d7-befe-45e1-8ab0-33ed5b5ddbf7-config-data" (OuterVolumeSpecName: "config-data") pod "19af59d7-befe-45e1-8ab0-33ed5b5ddbf7" (UID: "19af59d7-befe-45e1-8ab0-33ed5b5ddbf7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.243797 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5zfml\" (UniqueName: \"kubernetes.io/projected/19af59d7-befe-45e1-8ab0-33ed5b5ddbf7-kube-api-access-5zfml\") on node \"crc\" DevicePath \"\""
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.244192 5184 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19af59d7-befe-45e1-8ab0-33ed5b5ddbf7-logs\") on node \"crc\" DevicePath \"\""
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.244211 5184 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19af59d7-befe-45e1-8ab0-33ed5b5ddbf7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.244224 5184 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19af59d7-befe-45e1-8ab0-33ed5b5ddbf7-config-data\") on node \"crc\" DevicePath \"\""
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.372101 5184 generic.go:358] "Generic (PLEG): container finished" podID="19af59d7-befe-45e1-8ab0-33ed5b5ddbf7" containerID="404825ff6b8f4a0a1d6b217c8821e1a55051042e97b9e6c92ca887abc8d6df8f" exitCode=0
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.372165 5184 generic.go:358] "Generic (PLEG): container finished" podID="19af59d7-befe-45e1-8ab0-33ed5b5ddbf7" containerID="1b6c6f63b276a1ea3fbc4e983f185aaaa01a334b7d47508d0c0450fa8255b63d" exitCode=143
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.372173 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"19af59d7-befe-45e1-8ab0-33ed5b5ddbf7","Type":"ContainerDied","Data":"404825ff6b8f4a0a1d6b217c8821e1a55051042e97b9e6c92ca887abc8d6df8f"}
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.372214 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"19af59d7-befe-45e1-8ab0-33ed5b5ddbf7","Type":"ContainerDied","Data":"1b6c6f63b276a1ea3fbc4e983f185aaaa01a334b7d47508d0c0450fa8255b63d"}
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.372227 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"19af59d7-befe-45e1-8ab0-33ed5b5ddbf7","Type":"ContainerDied","Data":"28437eb7d8ca70a8760e230a673cd897f232259756be6583f4e654d2ab47ff01"}
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.372242 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.372246 5184 scope.go:117] "RemoveContainer" containerID="404825ff6b8f4a0a1d6b217c8821e1a55051042e97b9e6c92ca887abc8d6df8f"
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.400633 5184 scope.go:117] "RemoveContainer" containerID="1b6c6f63b276a1ea3fbc4e983f185aaaa01a334b7d47508d0c0450fa8255b63d"
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.415836 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.424449 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.453640 5184 scope.go:117] "RemoveContainer" containerID="404825ff6b8f4a0a1d6b217c8821e1a55051042e97b9e6c92ca887abc8d6df8f"
Mar 12 17:10:44 crc kubenswrapper[5184]: E0312 17:10:44.454682 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"404825ff6b8f4a0a1d6b217c8821e1a55051042e97b9e6c92ca887abc8d6df8f\": container with ID starting with 404825ff6b8f4a0a1d6b217c8821e1a55051042e97b9e6c92ca887abc8d6df8f not found: ID does not exist" containerID="404825ff6b8f4a0a1d6b217c8821e1a55051042e97b9e6c92ca887abc8d6df8f"
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.454810 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"404825ff6b8f4a0a1d6b217c8821e1a55051042e97b9e6c92ca887abc8d6df8f"} err="failed to get container status \"404825ff6b8f4a0a1d6b217c8821e1a55051042e97b9e6c92ca887abc8d6df8f\": rpc error: code = NotFound desc = could not find container \"404825ff6b8f4a0a1d6b217c8821e1a55051042e97b9e6c92ca887abc8d6df8f\": container with ID starting with 404825ff6b8f4a0a1d6b217c8821e1a55051042e97b9e6c92ca887abc8d6df8f not found: ID does not exist"
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.454919 5184 scope.go:117] "RemoveContainer" containerID="1b6c6f63b276a1ea3fbc4e983f185aaaa01a334b7d47508d0c0450fa8255b63d"
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.458483 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.459509 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="19af59d7-befe-45e1-8ab0-33ed5b5ddbf7" containerName="nova-metadata-metadata"
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.459622 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="19af59d7-befe-45e1-8ab0-33ed5b5ddbf7" containerName="nova-metadata-metadata"
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.459713 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="19af59d7-befe-45e1-8ab0-33ed5b5ddbf7" containerName="nova-metadata-log"
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.459783 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="19af59d7-befe-45e1-8ab0-33ed5b5ddbf7" containerName="nova-metadata-log"
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.460089 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="19af59d7-befe-45e1-8ab0-33ed5b5ddbf7" containerName="nova-metadata-log"
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.460219 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="19af59d7-befe-45e1-8ab0-33ed5b5ddbf7" containerName="nova-metadata-metadata"
Mar 12 17:10:44 crc kubenswrapper[5184]: E0312 17:10:44.459642 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b6c6f63b276a1ea3fbc4e983f185aaaa01a334b7d47508d0c0450fa8255b63d\": container with ID starting with 1b6c6f63b276a1ea3fbc4e983f185aaaa01a334b7d47508d0c0450fa8255b63d not found: ID does not exist" containerID="1b6c6f63b276a1ea3fbc4e983f185aaaa01a334b7d47508d0c0450fa8255b63d"
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.460981 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b6c6f63b276a1ea3fbc4e983f185aaaa01a334b7d47508d0c0450fa8255b63d"} err="failed to get container status \"1b6c6f63b276a1ea3fbc4e983f185aaaa01a334b7d47508d0c0450fa8255b63d\": rpc error: code = NotFound desc = could not find container \"1b6c6f63b276a1ea3fbc4e983f185aaaa01a334b7d47508d0c0450fa8255b63d\": container with ID starting with 1b6c6f63b276a1ea3fbc4e983f185aaaa01a334b7d47508d0c0450fa8255b63d not found: ID does not exist"
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.461030 5184 scope.go:117] "RemoveContainer" containerID="404825ff6b8f4a0a1d6b217c8821e1a55051042e97b9e6c92ca887abc8d6df8f"
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.462036 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"404825ff6b8f4a0a1d6b217c8821e1a55051042e97b9e6c92ca887abc8d6df8f"} err="failed to get container status \"404825ff6b8f4a0a1d6b217c8821e1a55051042e97b9e6c92ca887abc8d6df8f\": rpc error: code = NotFound desc = could not find container \"404825ff6b8f4a0a1d6b217c8821e1a55051042e97b9e6c92ca887abc8d6df8f\": container with ID starting with 404825ff6b8f4a0a1d6b217c8821e1a55051042e97b9e6c92ca887abc8d6df8f not found: ID does not exist"
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.462069 5184 scope.go:117] "RemoveContainer" containerID="1b6c6f63b276a1ea3fbc4e983f185aaaa01a334b7d47508d0c0450fa8255b63d"
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.462332 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b6c6f63b276a1ea3fbc4e983f185aaaa01a334b7d47508d0c0450fa8255b63d"} err="failed to get container status \"1b6c6f63b276a1ea3fbc4e983f185aaaa01a334b7d47508d0c0450fa8255b63d\": rpc error: code = NotFound desc = could not find container \"1b6c6f63b276a1ea3fbc4e983f185aaaa01a334b7d47508d0c0450fa8255b63d\": container with ID starting with 1b6c6f63b276a1ea3fbc4e983f185aaaa01a334b7d47508d0c0450fa8255b63d not found: ID does not exist"
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.467409 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.467658 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.469969 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-nova-metadata-internal-svc\""
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.470263 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-metadata-config-data\""
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.549935 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58669f7c-7cd8-4628-9a94-ef53ed4f6a29-config-data\") pod \"nova-metadata-0\" (UID: \"58669f7c-7cd8-4628-9a94-ef53ed4f6a29\") " pod="openstack/nova-metadata-0"
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.550339 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58669f7c-7cd8-4628-9a94-ef53ed4f6a29-logs\") pod \"nova-metadata-0\" (UID: \"58669f7c-7cd8-4628-9a94-ef53ed4f6a29\") " pod="openstack/nova-metadata-0"
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.550572 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58669f7c-7cd8-4628-9a94-ef53ed4f6a29-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"58669f7c-7cd8-4628-9a94-ef53ed4f6a29\") " pod="openstack/nova-metadata-0"
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.550739 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/58669f7c-7cd8-4628-9a94-ef53ed4f6a29-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"58669f7c-7cd8-4628-9a94-ef53ed4f6a29\") " pod="openstack/nova-metadata-0"
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.550872 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5v5p\" (UniqueName: \"kubernetes.io/projected/58669f7c-7cd8-4628-9a94-ef53ed4f6a29-kube-api-access-m5v5p\") pod \"nova-metadata-0\" (UID: \"58669f7c-7cd8-4628-9a94-ef53ed4f6a29\") " pod="openstack/nova-metadata-0"
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.652965 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58669f7c-7cd8-4628-9a94-ef53ed4f6a29-logs\") pod \"nova-metadata-0\" (UID: \"58669f7c-7cd8-4628-9a94-ef53ed4f6a29\") " pod="openstack/nova-metadata-0"
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.653293 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58669f7c-7cd8-4628-9a94-ef53ed4f6a29-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"58669f7c-7cd8-4628-9a94-ef53ed4f6a29\") " pod="openstack/nova-metadata-0"
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.653437 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/58669f7c-7cd8-4628-9a94-ef53ed4f6a29-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"58669f7c-7cd8-4628-9a94-ef53ed4f6a29\") " pod="openstack/nova-metadata-0"
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.653570 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m5v5p\" (UniqueName: \"kubernetes.io/projected/58669f7c-7cd8-4628-9a94-ef53ed4f6a29-kube-api-access-m5v5p\") pod \"nova-metadata-0\" (UID: \"58669f7c-7cd8-4628-9a94-ef53ed4f6a29\") " pod="openstack/nova-metadata-0"
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.653694 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58669f7c-7cd8-4628-9a94-ef53ed4f6a29-config-data\") pod \"nova-metadata-0\" (UID: \"58669f7c-7cd8-4628-9a94-ef53ed4f6a29\") " pod="openstack/nova-metadata-0"
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.654811 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58669f7c-7cd8-4628-9a94-ef53ed4f6a29-logs\") pod \"nova-metadata-0\" (UID: \"58669f7c-7cd8-4628-9a94-ef53ed4f6a29\") " pod="openstack/nova-metadata-0"
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.658824 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58669f7c-7cd8-4628-9a94-ef53ed4f6a29-config-data\") pod \"nova-metadata-0\" (UID: \"58669f7c-7cd8-4628-9a94-ef53ed4f6a29\") " pod="openstack/nova-metadata-0"
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.659228 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/58669f7c-7cd8-4628-9a94-ef53ed4f6a29-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"58669f7c-7cd8-4628-9a94-ef53ed4f6a29\") " pod="openstack/nova-metadata-0"
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.672592 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58669f7c-7cd8-4628-9a94-ef53ed4f6a29-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"58669f7c-7cd8-4628-9a94-ef53ed4f6a29\") " pod="openstack/nova-metadata-0"
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.674788 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5v5p\" (UniqueName: \"kubernetes.io/projected/58669f7c-7cd8-4628-9a94-ef53ed4f6a29-kube-api-access-m5v5p\") pod \"nova-metadata-0\" (UID: \"58669f7c-7cd8-4628-9a94-ef53ed4f6a29\") " pod="openstack/nova-metadata-0"
Mar 12 17:10:44 crc kubenswrapper[5184]: I0312 17:10:44.828654 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 12 17:10:45 crc kubenswrapper[5184]: I0312 17:10:45.330974 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 12 17:10:45 crc kubenswrapper[5184]: I0312 17:10:45.384066 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"58669f7c-7cd8-4628-9a94-ef53ed4f6a29","Type":"ContainerStarted","Data":"d9b2c9f101deb10a8fb834dfb0c68a7ffd6a4b4d839afb6228fe90c59b289d37"}
Mar 12 17:10:45 crc kubenswrapper[5184]: I0312 17:10:45.385643 5184 generic.go:358] "Generic (PLEG): container finished" podID="c9e6df0a-f516-4f34-bd84-d32cea82a0ed" containerID="6171b7e72d22708f031e33283a02382b94a60513db25d5c575c7fe34ea59191d" exitCode=0
Mar 12 17:10:45 crc kubenswrapper[5184]: I0312 17:10:45.385688 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-z94sz" event={"ID":"c9e6df0a-f516-4f34-bd84-d32cea82a0ed","Type":"ContainerDied","Data":"6171b7e72d22708f031e33283a02382b94a60513db25d5c575c7fe34ea59191d"}
Mar 12 17:10:46 crc kubenswrapper[5184]: I0312 17:10:46.433042 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19af59d7-befe-45e1-8ab0-33ed5b5ddbf7" path="/var/lib/kubelet/pods/19af59d7-befe-45e1-8ab0-33ed5b5ddbf7/volumes"
Mar 12 17:10:46 crc kubenswrapper[5184]: I0312 17:10:46.434656 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"58669f7c-7cd8-4628-9a94-ef53ed4f6a29","Type":"ContainerStarted","Data":"a1a16f59705d0daea0f9330fa98494b06efe61b61d387c52c5cdde8b09e7e681"}
Mar 12 17:10:46 crc kubenswrapper[5184]: I0312 17:10:46.434706 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"58669f7c-7cd8-4628-9a94-ef53ed4f6a29","Type":"ContainerStarted","Data":"296de82bba56fbcff533c1e296b8ef478c01cb3e656d6a83b8849404651914ce"}
Mar 12 17:10:46 crc kubenswrapper[5184]: I0312 17:10:46.456246 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.456221829 podStartE2EDuration="2.456221829s" podCreationTimestamp="2026-03-12 17:10:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:10:46.440628825 +0000 UTC m=+1188.981940254" watchObservedRunningTime="2026-03-12 17:10:46.456221829 +0000 UTC m=+1188.997533158"
Mar 12 17:10:46 crc kubenswrapper[5184]: I0312 17:10:46.851062 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-z94sz"
Mar 12 17:10:46 crc kubenswrapper[5184]: I0312 17:10:46.999716 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e6df0a-f516-4f34-bd84-d32cea82a0ed-config-data\") pod \"c9e6df0a-f516-4f34-bd84-d32cea82a0ed\" (UID: \"c9e6df0a-f516-4f34-bd84-d32cea82a0ed\") "
Mar 12 17:10:47 crc kubenswrapper[5184]: I0312 17:10:46.999888 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9e6df0a-f516-4f34-bd84-d32cea82a0ed-scripts\") pod \"c9e6df0a-f516-4f34-bd84-d32cea82a0ed\" (UID: \"c9e6df0a-f516-4f34-bd84-d32cea82a0ed\") "
Mar 12 17:10:47 crc kubenswrapper[5184]: I0312 17:10:47.000169 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e6df0a-f516-4f34-bd84-d32cea82a0ed-combined-ca-bundle\") pod \"c9e6df0a-f516-4f34-bd84-d32cea82a0ed\" (UID: \"c9e6df0a-f516-4f34-bd84-d32cea82a0ed\") "
Mar 12 17:10:47 crc kubenswrapper[5184]: I0312 17:10:47.000214 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hf6td\" (UniqueName: \"kubernetes.io/projected/c9e6df0a-f516-4f34-bd84-d32cea82a0ed-kube-api-access-hf6td\") pod \"c9e6df0a-f516-4f34-bd84-d32cea82a0ed\" (UID: \"c9e6df0a-f516-4f34-bd84-d32cea82a0ed\") "
Mar 12 17:10:47 crc kubenswrapper[5184]: I0312 17:10:47.005545 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9e6df0a-f516-4f34-bd84-d32cea82a0ed-scripts" (OuterVolumeSpecName: "scripts") pod "c9e6df0a-f516-4f34-bd84-d32cea82a0ed" (UID: "c9e6df0a-f516-4f34-bd84-d32cea82a0ed"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 17:10:47 crc kubenswrapper[5184]: I0312 17:10:47.010762 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9e6df0a-f516-4f34-bd84-d32cea82a0ed-kube-api-access-hf6td" (OuterVolumeSpecName: "kube-api-access-hf6td") pod "c9e6df0a-f516-4f34-bd84-d32cea82a0ed" (UID: "c9e6df0a-f516-4f34-bd84-d32cea82a0ed"). InnerVolumeSpecName "kube-api-access-hf6td". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:10:47 crc kubenswrapper[5184]: I0312 17:10:47.030728 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9e6df0a-f516-4f34-bd84-d32cea82a0ed-config-data" (OuterVolumeSpecName: "config-data") pod "c9e6df0a-f516-4f34-bd84-d32cea82a0ed" (UID: "c9e6df0a-f516-4f34-bd84-d32cea82a0ed"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 17:10:47 crc kubenswrapper[5184]: I0312 17:10:47.040557 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9e6df0a-f516-4f34-bd84-d32cea82a0ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9e6df0a-f516-4f34-bd84-d32cea82a0ed" (UID: "c9e6df0a-f516-4f34-bd84-d32cea82a0ed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 17:10:47 crc kubenswrapper[5184]: I0312 17:10:47.103153 5184 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e6df0a-f516-4f34-bd84-d32cea82a0ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 17:10:47 crc kubenswrapper[5184]: I0312 17:10:47.103238 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hf6td\" (UniqueName: \"kubernetes.io/projected/c9e6df0a-f516-4f34-bd84-d32cea82a0ed-kube-api-access-hf6td\") on node \"crc\" DevicePath \"\""
Mar 12 17:10:47 crc kubenswrapper[5184]: I0312 17:10:47.103259 5184 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e6df0a-f516-4f34-bd84-d32cea82a0ed-config-data\") on node \"crc\" DevicePath \"\""
Mar 12 17:10:47 crc kubenswrapper[5184]: I0312 17:10:47.103273 5184 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9e6df0a-f516-4f34-bd84-d32cea82a0ed-scripts\") on node \"crc\" DevicePath \"\""
Mar 12 17:10:47 crc kubenswrapper[5184]: I0312 17:10:47.114494 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 12 17:10:47 crc kubenswrapper[5184]: I0312 17:10:47.114550 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-scheduler-0"
Mar 12 17:10:47 crc kubenswrapper[5184]: I0312 17:10:47.140518 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 12 17:10:47 crc kubenswrapper[5184]: I0312 17:10:47.297632 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-cell1-novncproxy-0"
Mar 12 17:10:47 crc kubenswrapper[5184]: I0312 17:10:47.412775 5184 generic.go:358] "Generic (PLEG): container finished" podID="95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad" containerID="d436d59aca7fa024eb05f51de8407da845441cd17d53483239b41fef1e87b97a" exitCode=0
Mar 12 17:10:47 crc kubenswrapper[5184]: I0312 17:10:47.412863 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zgwz8" event={"ID":"95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad","Type":"ContainerDied","Data":"d436d59aca7fa024eb05f51de8407da845441cd17d53483239b41fef1e87b97a"}
Mar 12 17:10:47 crc kubenswrapper[5184]: I0312 17:10:47.415826 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-z94sz"
Mar 12 17:10:47 crc kubenswrapper[5184]: I0312 17:10:47.415866 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-z94sz" event={"ID":"c9e6df0a-f516-4f34-bd84-d32cea82a0ed","Type":"ContainerDied","Data":"3aa05f6f94c43966b89e44cd00dea8636ddc5a7906ee450b887c67acddd68b20"}
Mar 12 17:10:47 crc kubenswrapper[5184]: I0312 17:10:47.415908 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3aa05f6f94c43966b89e44cd00dea8636ddc5a7906ee450b887c67acddd68b20"
Mar 12 17:10:47 crc kubenswrapper[5184]: I0312 17:10:47.478128 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 12 17:10:47 crc kubenswrapper[5184]: I0312 17:10:47.741053 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 12 17:10:47 crc kubenswrapper[5184]: I0312 17:10:47.741104 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 12 17:10:47 crc kubenswrapper[5184]: I0312 17:10:47.765237 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 12 17:10:47 crc kubenswrapper[5184]: I0312 17:10:47.796766 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 12 17:10:47 crc kubenswrapper[5184]: I0312 17:10:47.983174 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 12 17:10:48 crc kubenswrapper[5184]: I0312 17:10:48.358579 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8458c54c8c-8c75q"
Mar 12 17:10:48 crc kubenswrapper[5184]: I0312 17:10:48.472666 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="63387085-af7f-404d-bfce-0df2471fbad4" containerName="nova-api-api" containerID="cri-o://3ae83fd738c85479041a117b03d4951b1f2f5a6ee0df6856d3de527b4299786c" gracePeriod=30
Mar 12 17:10:48 crc kubenswrapper[5184]: I0312 17:10:48.472707 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="58669f7c-7cd8-4628-9a94-ef53ed4f6a29" containerName="nova-metadata-log" containerID="cri-o://296de82bba56fbcff533c1e296b8ef478c01cb3e656d6a83b8849404651914ce" gracePeriod=30
Mar 12 17:10:48 crc kubenswrapper[5184]: I0312 17:10:48.472747 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="63387085-af7f-404d-bfce-0df2471fbad4" containerName="nova-api-log" containerID="cri-o://77f95d9aab4062457b2acaaccb5d419fb92830ec2e6f6cea676315a29c6b22cf" gracePeriod=30
Mar 12 17:10:48 crc kubenswrapper[5184]: I0312 17:10:48.472807 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="58669f7c-7cd8-4628-9a94-ef53ed4f6a29" containerName="nova-metadata-metadata" containerID="cri-o://a1a16f59705d0daea0f9330fa98494b06efe61b61d387c52c5cdde8b09e7e681" gracePeriod=30
Mar 12 17:10:48 crc kubenswrapper[5184]: I0312 17:10:48.477118 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9fc87f-mb797"]
Mar 12 17:10:48 crc kubenswrapper[5184]: I0312 17:10:48.477866 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9fc87f-mb797" podUID="7d33c92d-847e-48b5-bb1b-a8defe0756f7" containerName="dnsmasq-dns" containerID="cri-o://6abf4815e6024da2d2e5511d3112e2b6c2bb747058927306f1928e25cfd11214" gracePeriod=10
Mar 12 17:10:48 crc kubenswrapper[5184]: I0312 17:10:48.484581 5184 prober.go:120] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="63387085-af7f-404d-bfce-0df2471fbad4" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": EOF"
Mar 12 17:10:48 crc kubenswrapper[5184]: I0312 17:10:48.484546 5184 prober.go:120] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="63387085-af7f-404d-bfce-0df2471fbad4" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": EOF"
Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.171090 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zgwz8"
Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.180349 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sscwv\" (UniqueName: \"kubernetes.io/projected/95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad-kube-api-access-sscwv\") pod \"95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad\" (UID: \"95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad\") "
Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.180413 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad-config-data\") pod \"95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad\" (UID: \"95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad\") "
Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.180573 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad-combined-ca-bundle\") pod \"95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad\" (UID: \"95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad\") "
Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.180789 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad-scripts\") pod \"95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad\" (UID: \"95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad\") "
Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.187839 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad-kube-api-access-sscwv" (OuterVolumeSpecName: "kube-api-access-sscwv") pod "95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad" (UID: "95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad"). InnerVolumeSpecName "kube-api-access-sscwv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.189404 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad-scripts" (OuterVolumeSpecName: "scripts") pod "95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad" (UID: "95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.215228 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad" (UID: "95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad"). InnerVolumeSpecName "combined-ca-bundle".
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.240801 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad-config-data" (OuterVolumeSpecName: "config-data") pod "95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad" (UID: "95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.284781 5184 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.284827 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sscwv\" (UniqueName: \"kubernetes.io/projected/95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad-kube-api-access-sscwv\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.284851 5184 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.284862 5184 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.511838 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zgwz8" event={"ID":"95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad","Type":"ContainerDied","Data":"c5a65c290102bec4b2a2dc999af3ab6c293eb3be93a86a6b4c34af8158c14022"} Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.513243 5184 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="c5a65c290102bec4b2a2dc999af3ab6c293eb3be93a86a6b4c34af8158c14022" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.512435 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zgwz8" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.525714 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.526712 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad" containerName="nova-cell1-conductor-db-sync" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.526728 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad" containerName="nova-cell1-conductor-db-sync" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.526760 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c9e6df0a-f516-4f34-bd84-d32cea82a0ed" containerName="nova-manage" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.526765 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9e6df0a-f516-4f34-bd84-d32cea82a0ed" containerName="nova-manage" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.526989 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="c9e6df0a-f516-4f34-bd84-d32cea82a0ed" containerName="nova-manage" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.527006 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad" containerName="nova-cell1-conductor-db-sync" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.533967 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.536856 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.538562 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell1-conductor-config-data\"" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.539298 5184 generic.go:358] "Generic (PLEG): container finished" podID="58669f7c-7cd8-4628-9a94-ef53ed4f6a29" containerID="a1a16f59705d0daea0f9330fa98494b06efe61b61d387c52c5cdde8b09e7e681" exitCode=0 Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.539312 5184 generic.go:358] "Generic (PLEG): container finished" podID="58669f7c-7cd8-4628-9a94-ef53ed4f6a29" containerID="296de82bba56fbcff533c1e296b8ef478c01cb3e656d6a83b8849404651914ce" exitCode=143 Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.539399 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"58669f7c-7cd8-4628-9a94-ef53ed4f6a29","Type":"ContainerDied","Data":"a1a16f59705d0daea0f9330fa98494b06efe61b61d387c52c5cdde8b09e7e681"} Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.539418 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"58669f7c-7cd8-4628-9a94-ef53ed4f6a29","Type":"ContainerDied","Data":"296de82bba56fbcff533c1e296b8ef478c01cb3e656d6a83b8849404651914ce"} Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.556573 5184 generic.go:358] "Generic (PLEG): container finished" podID="7d33c92d-847e-48b5-bb1b-a8defe0756f7" containerID="6abf4815e6024da2d2e5511d3112e2b6c2bb747058927306f1928e25cfd11214" exitCode=0 Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.557198 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9fc87f-mb797" 
event={"ID":"7d33c92d-847e-48b5-bb1b-a8defe0756f7","Type":"ContainerDied","Data":"6abf4815e6024da2d2e5511d3112e2b6c2bb747058927306f1928e25cfd11214"} Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.559629 5184 generic.go:358] "Generic (PLEG): container finished" podID="63387085-af7f-404d-bfce-0df2471fbad4" containerID="77f95d9aab4062457b2acaaccb5d419fb92830ec2e6f6cea676315a29c6b22cf" exitCode=143 Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.559711 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"63387085-af7f-404d-bfce-0df2471fbad4","Type":"ContainerDied","Data":"77f95d9aab4062457b2acaaccb5d419fb92830ec2e6f6cea676315a29c6b22cf"} Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.560060 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="d09e0a6f-89c0-466f-93e6-3831659f0613" containerName="nova-scheduler-scheduler" containerID="cri-o://6d27b4ccdee5ad0e4b967ae1ea93f0b007f25aa89ae4261b4e5c7f01be9ed54f" gracePeriod=30 Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.606631 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75247eae-d2ef-43a9-a5fc-66bc1f351feb-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"75247eae-d2ef-43a9-a5fc-66bc1f351feb\") " pod="openstack/nova-cell1-conductor-0" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.606717 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75247eae-d2ef-43a9-a5fc-66bc1f351feb-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"75247eae-d2ef-43a9-a5fc-66bc1f351feb\") " pod="openstack/nova-cell1-conductor-0" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.606815 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-6l5fc\" (UniqueName: \"kubernetes.io/projected/75247eae-d2ef-43a9-a5fc-66bc1f351feb-kube-api-access-6l5fc\") pod \"nova-cell1-conductor-0\" (UID: \"75247eae-d2ef-43a9-a5fc-66bc1f351feb\") " pod="openstack/nova-cell1-conductor-0" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.695539 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.708191 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6l5fc\" (UniqueName: \"kubernetes.io/projected/75247eae-d2ef-43a9-a5fc-66bc1f351feb-kube-api-access-6l5fc\") pod \"nova-cell1-conductor-0\" (UID: \"75247eae-d2ef-43a9-a5fc-66bc1f351feb\") " pod="openstack/nova-cell1-conductor-0" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.708363 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75247eae-d2ef-43a9-a5fc-66bc1f351feb-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"75247eae-d2ef-43a9-a5fc-66bc1f351feb\") " pod="openstack/nova-cell1-conductor-0" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.708419 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75247eae-d2ef-43a9-a5fc-66bc1f351feb-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"75247eae-d2ef-43a9-a5fc-66bc1f351feb\") " pod="openstack/nova-cell1-conductor-0" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.712541 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9fc87f-mb797" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.717842 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75247eae-d2ef-43a9-a5fc-66bc1f351feb-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"75247eae-d2ef-43a9-a5fc-66bc1f351feb\") " pod="openstack/nova-cell1-conductor-0" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.723099 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75247eae-d2ef-43a9-a5fc-66bc1f351feb-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"75247eae-d2ef-43a9-a5fc-66bc1f351feb\") " pod="openstack/nova-cell1-conductor-0" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.726425 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l5fc\" (UniqueName: \"kubernetes.io/projected/75247eae-d2ef-43a9-a5fc-66bc1f351feb-kube-api-access-6l5fc\") pod \"nova-cell1-conductor-0\" (UID: \"75247eae-d2ef-43a9-a5fc-66bc1f351feb\") " pod="openstack/nova-cell1-conductor-0" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.809987 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58669f7c-7cd8-4628-9a94-ef53ed4f6a29-combined-ca-bundle\") pod \"58669f7c-7cd8-4628-9a94-ef53ed4f6a29\" (UID: \"58669f7c-7cd8-4628-9a94-ef53ed4f6a29\") " Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.810070 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/58669f7c-7cd8-4628-9a94-ef53ed4f6a29-nova-metadata-tls-certs\") pod \"58669f7c-7cd8-4628-9a94-ef53ed4f6a29\" (UID: \"58669f7c-7cd8-4628-9a94-ef53ed4f6a29\") " Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.810180 5184 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnt7x\" (UniqueName: \"kubernetes.io/projected/7d33c92d-847e-48b5-bb1b-a8defe0756f7-kube-api-access-bnt7x\") pod \"7d33c92d-847e-48b5-bb1b-a8defe0756f7\" (UID: \"7d33c92d-847e-48b5-bb1b-a8defe0756f7\") " Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.810248 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d33c92d-847e-48b5-bb1b-a8defe0756f7-ovsdbserver-nb\") pod \"7d33c92d-847e-48b5-bb1b-a8defe0756f7\" (UID: \"7d33c92d-847e-48b5-bb1b-a8defe0756f7\") " Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.810395 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d33c92d-847e-48b5-bb1b-a8defe0756f7-ovsdbserver-sb\") pod \"7d33c92d-847e-48b5-bb1b-a8defe0756f7\" (UID: \"7d33c92d-847e-48b5-bb1b-a8defe0756f7\") " Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.810548 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d33c92d-847e-48b5-bb1b-a8defe0756f7-dns-swift-storage-0\") pod \"7d33c92d-847e-48b5-bb1b-a8defe0756f7\" (UID: \"7d33c92d-847e-48b5-bb1b-a8defe0756f7\") " Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.810602 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58669f7c-7cd8-4628-9a94-ef53ed4f6a29-config-data\") pod \"58669f7c-7cd8-4628-9a94-ef53ed4f6a29\" (UID: \"58669f7c-7cd8-4628-9a94-ef53ed4f6a29\") " Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.810719 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58669f7c-7cd8-4628-9a94-ef53ed4f6a29-logs\") pod 
\"58669f7c-7cd8-4628-9a94-ef53ed4f6a29\" (UID: \"58669f7c-7cd8-4628-9a94-ef53ed4f6a29\") " Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.810747 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5v5p\" (UniqueName: \"kubernetes.io/projected/58669f7c-7cd8-4628-9a94-ef53ed4f6a29-kube-api-access-m5v5p\") pod \"58669f7c-7cd8-4628-9a94-ef53ed4f6a29\" (UID: \"58669f7c-7cd8-4628-9a94-ef53ed4f6a29\") " Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.810772 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d33c92d-847e-48b5-bb1b-a8defe0756f7-dns-svc\") pod \"7d33c92d-847e-48b5-bb1b-a8defe0756f7\" (UID: \"7d33c92d-847e-48b5-bb1b-a8defe0756f7\") " Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.810795 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d33c92d-847e-48b5-bb1b-a8defe0756f7-config\") pod \"7d33c92d-847e-48b5-bb1b-a8defe0756f7\" (UID: \"7d33c92d-847e-48b5-bb1b-a8defe0756f7\") " Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.811394 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58669f7c-7cd8-4628-9a94-ef53ed4f6a29-logs" (OuterVolumeSpecName: "logs") pod "58669f7c-7cd8-4628-9a94-ef53ed4f6a29" (UID: "58669f7c-7cd8-4628-9a94-ef53ed4f6a29"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.842588 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58669f7c-7cd8-4628-9a94-ef53ed4f6a29-kube-api-access-m5v5p" (OuterVolumeSpecName: "kube-api-access-m5v5p") pod "58669f7c-7cd8-4628-9a94-ef53ed4f6a29" (UID: "58669f7c-7cd8-4628-9a94-ef53ed4f6a29"). InnerVolumeSpecName "kube-api-access-m5v5p". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.842756 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d33c92d-847e-48b5-bb1b-a8defe0756f7-kube-api-access-bnt7x" (OuterVolumeSpecName: "kube-api-access-bnt7x") pod "7d33c92d-847e-48b5-bb1b-a8defe0756f7" (UID: "7d33c92d-847e-48b5-bb1b-a8defe0756f7"). InnerVolumeSpecName "kube-api-access-bnt7x". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.875987 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.887463 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58669f7c-7cd8-4628-9a94-ef53ed4f6a29-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58669f7c-7cd8-4628-9a94-ef53ed4f6a29" (UID: "58669f7c-7cd8-4628-9a94-ef53ed4f6a29"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.892820 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d33c92d-847e-48b5-bb1b-a8defe0756f7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7d33c92d-847e-48b5-bb1b-a8defe0756f7" (UID: "7d33c92d-847e-48b5-bb1b-a8defe0756f7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.896135 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58669f7c-7cd8-4628-9a94-ef53ed4f6a29-config-data" (OuterVolumeSpecName: "config-data") pod "58669f7c-7cd8-4628-9a94-ef53ed4f6a29" (UID: "58669f7c-7cd8-4628-9a94-ef53ed4f6a29"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.900011 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d33c92d-847e-48b5-bb1b-a8defe0756f7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7d33c92d-847e-48b5-bb1b-a8defe0756f7" (UID: "7d33c92d-847e-48b5-bb1b-a8defe0756f7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.913000 5184 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58669f7c-7cd8-4628-9a94-ef53ed4f6a29-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.913026 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bnt7x\" (UniqueName: \"kubernetes.io/projected/7d33c92d-847e-48b5-bb1b-a8defe0756f7-kube-api-access-bnt7x\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.913036 5184 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d33c92d-847e-48b5-bb1b-a8defe0756f7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.913046 5184 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58669f7c-7cd8-4628-9a94-ef53ed4f6a29-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.913055 5184 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58669f7c-7cd8-4628-9a94-ef53ed4f6a29-logs\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.913063 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m5v5p\" (UniqueName: 
\"kubernetes.io/projected/58669f7c-7cd8-4628-9a94-ef53ed4f6a29-kube-api-access-m5v5p\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.913072 5184 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d33c92d-847e-48b5-bb1b-a8defe0756f7-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.913214 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58669f7c-7cd8-4628-9a94-ef53ed4f6a29-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "58669f7c-7cd8-4628-9a94-ef53ed4f6a29" (UID: "58669f7c-7cd8-4628-9a94-ef53ed4f6a29"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.934815 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d33c92d-847e-48b5-bb1b-a8defe0756f7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7d33c92d-847e-48b5-bb1b-a8defe0756f7" (UID: "7d33c92d-847e-48b5-bb1b-a8defe0756f7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.935650 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d33c92d-847e-48b5-bb1b-a8defe0756f7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7d33c92d-847e-48b5-bb1b-a8defe0756f7" (UID: "7d33c92d-847e-48b5-bb1b-a8defe0756f7"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:10:49 crc kubenswrapper[5184]: I0312 17:10:49.964848 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d33c92d-847e-48b5-bb1b-a8defe0756f7-config" (OuterVolumeSpecName: "config") pod "7d33c92d-847e-48b5-bb1b-a8defe0756f7" (UID: "7d33c92d-847e-48b5-bb1b-a8defe0756f7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:10:50 crc kubenswrapper[5184]: I0312 17:10:50.015954 5184 reconciler_common.go:299] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d33c92d-847e-48b5-bb1b-a8defe0756f7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:50 crc kubenswrapper[5184]: I0312 17:10:50.015989 5184 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d33c92d-847e-48b5-bb1b-a8defe0756f7-config\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:50 crc kubenswrapper[5184]: I0312 17:10:50.016000 5184 reconciler_common.go:299] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/58669f7c-7cd8-4628-9a94-ef53ed4f6a29-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:50 crc kubenswrapper[5184]: I0312 17:10:50.016011 5184 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d33c92d-847e-48b5-bb1b-a8defe0756f7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 17:10:50 crc kubenswrapper[5184]: E0312 17:10:50.128228 5184 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1792214 actualBytes=10240 Mar 12 17:10:50 crc kubenswrapper[5184]: I0312 17:10:50.372119 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 12 17:10:50 crc kubenswrapper[5184]: W0312 17:10:50.382124 5184 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75247eae_d2ef_43a9_a5fc_66bc1f351feb.slice/crio-91d15752205b9649f6b3cb68b28cc38acdad8d2927e6a286e45c40e5312e0401 WatchSource:0}: Error finding container 91d15752205b9649f6b3cb68b28cc38acdad8d2927e6a286e45c40e5312e0401: Status 404 returned error can't find the container with id 91d15752205b9649f6b3cb68b28cc38acdad8d2927e6a286e45c40e5312e0401 Mar 12 17:10:50 crc kubenswrapper[5184]: I0312 17:10:50.591939 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9fc87f-mb797" Mar 12 17:10:50 crc kubenswrapper[5184]: I0312 17:10:50.591988 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9fc87f-mb797" event={"ID":"7d33c92d-847e-48b5-bb1b-a8defe0756f7","Type":"ContainerDied","Data":"7f3cfe57d238fc767c2be16e2b9f20a80cb15cd0418b8f26048d3fb0b2127c52"} Mar 12 17:10:50 crc kubenswrapper[5184]: I0312 17:10:50.592502 5184 scope.go:117] "RemoveContainer" containerID="6abf4815e6024da2d2e5511d3112e2b6c2bb747058927306f1928e25cfd11214" Mar 12 17:10:50 crc kubenswrapper[5184]: I0312 17:10:50.595625 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"75247eae-d2ef-43a9-a5fc-66bc1f351feb","Type":"ContainerStarted","Data":"91d15752205b9649f6b3cb68b28cc38acdad8d2927e6a286e45c40e5312e0401"} Mar 12 17:10:50 crc kubenswrapper[5184]: I0312 17:10:50.609967 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"58669f7c-7cd8-4628-9a94-ef53ed4f6a29","Type":"ContainerDied","Data":"d9b2c9f101deb10a8fb834dfb0c68a7ffd6a4b4d839afb6228fe90c59b289d37"} Mar 12 17:10:50 crc kubenswrapper[5184]: I0312 17:10:50.610004 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0"
Mar 12 17:10:50 crc kubenswrapper[5184]: I0312 17:10:50.721840 5184 scope.go:117] "RemoveContainer" containerID="26910bc0aabc5a2a8df0698e2a81e502134cee7215f2181eaa42086f4f0b5a54"
Mar 12 17:10:50 crc kubenswrapper[5184]: I0312 17:10:50.741084 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 12 17:10:50 crc kubenswrapper[5184]: I0312 17:10:50.768433 5184 scope.go:117] "RemoveContainer" containerID="a1a16f59705d0daea0f9330fa98494b06efe61b61d387c52c5cdde8b09e7e681"
Mar 12 17:10:50 crc kubenswrapper[5184]: I0312 17:10:50.770881 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 12 17:10:50 crc kubenswrapper[5184]: I0312 17:10:50.799003 5184 scope.go:117] "RemoveContainer" containerID="296de82bba56fbcff533c1e296b8ef478c01cb3e656d6a83b8849404651914ce"
Mar 12 17:10:50 crc kubenswrapper[5184]: I0312 17:10:50.802305 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9fc87f-mb797"]
Mar 12 17:10:50 crc kubenswrapper[5184]: I0312 17:10:50.818187 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 12 17:10:50 crc kubenswrapper[5184]: I0312 17:10:50.820208 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="58669f7c-7cd8-4628-9a94-ef53ed4f6a29" containerName="nova-metadata-metadata"
Mar 12 17:10:50 crc kubenswrapper[5184]: I0312 17:10:50.820242 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="58669f7c-7cd8-4628-9a94-ef53ed4f6a29" containerName="nova-metadata-metadata"
Mar 12 17:10:50 crc kubenswrapper[5184]: I0312 17:10:50.820283 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="58669f7c-7cd8-4628-9a94-ef53ed4f6a29" containerName="nova-metadata-log"
Mar 12 17:10:50 crc kubenswrapper[5184]: I0312 17:10:50.820294 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="58669f7c-7cd8-4628-9a94-ef53ed4f6a29" containerName="nova-metadata-log"
Mar 12 17:10:50 crc kubenswrapper[5184]: I0312 17:10:50.820328 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7d33c92d-847e-48b5-bb1b-a8defe0756f7" containerName="dnsmasq-dns"
Mar 12 17:10:50 crc kubenswrapper[5184]: I0312 17:10:50.820339 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d33c92d-847e-48b5-bb1b-a8defe0756f7" containerName="dnsmasq-dns"
Mar 12 17:10:50 crc kubenswrapper[5184]: I0312 17:10:50.820365 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7d33c92d-847e-48b5-bb1b-a8defe0756f7" containerName="init"
Mar 12 17:10:50 crc kubenswrapper[5184]: I0312 17:10:50.820398 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d33c92d-847e-48b5-bb1b-a8defe0756f7" containerName="init"
Mar 12 17:10:50 crc kubenswrapper[5184]: I0312 17:10:50.820742 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="58669f7c-7cd8-4628-9a94-ef53ed4f6a29" containerName="nova-metadata-log"
Mar 12 17:10:50 crc kubenswrapper[5184]: I0312 17:10:50.820782 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="58669f7c-7cd8-4628-9a94-ef53ed4f6a29" containerName="nova-metadata-metadata"
Mar 12 17:10:50 crc kubenswrapper[5184]: I0312 17:10:50.820799 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="7d33c92d-847e-48b5-bb1b-a8defe0756f7" containerName="dnsmasq-dns"
Mar 12 17:10:50 crc kubenswrapper[5184]: I0312 17:10:50.831179 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9fc87f-mb797"]
Mar 12 17:10:50 crc kubenswrapper[5184]: I0312 17:10:50.831367 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 12 17:10:50 crc kubenswrapper[5184]: I0312 17:10:50.834048 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-nova-metadata-internal-svc\""
Mar 12 17:10:50 crc kubenswrapper[5184]: I0312 17:10:50.834089 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-metadata-config-data\""
Mar 12 17:10:50 crc kubenswrapper[5184]: I0312 17:10:50.840000 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 12 17:10:50 crc kubenswrapper[5184]: I0312 17:10:50.938601 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a0286e4-4295-4526-bcf3-f003b4766dec-config-data\") pod \"nova-metadata-0\" (UID: \"3a0286e4-4295-4526-bcf3-f003b4766dec\") " pod="openstack/nova-metadata-0"
Mar 12 17:10:50 crc kubenswrapper[5184]: I0312 17:10:50.938770 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqrlw\" (UniqueName: \"kubernetes.io/projected/3a0286e4-4295-4526-bcf3-f003b4766dec-kube-api-access-wqrlw\") pod \"nova-metadata-0\" (UID: \"3a0286e4-4295-4526-bcf3-f003b4766dec\") " pod="openstack/nova-metadata-0"
Mar 12 17:10:50 crc kubenswrapper[5184]: I0312 17:10:50.938804 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a0286e4-4295-4526-bcf3-f003b4766dec-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3a0286e4-4295-4526-bcf3-f003b4766dec\") " pod="openstack/nova-metadata-0"
Mar 12 17:10:50 crc kubenswrapper[5184]: I0312 17:10:50.938936 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a0286e4-4295-4526-bcf3-f003b4766dec-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3a0286e4-4295-4526-bcf3-f003b4766dec\") " pod="openstack/nova-metadata-0"
Mar 12 17:10:50 crc kubenswrapper[5184]: I0312 17:10:50.938972 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a0286e4-4295-4526-bcf3-f003b4766dec-logs\") pod \"nova-metadata-0\" (UID: \"3a0286e4-4295-4526-bcf3-f003b4766dec\") " pod="openstack/nova-metadata-0"
Mar 12 17:10:51 crc kubenswrapper[5184]: I0312 17:10:51.040089 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wqrlw\" (UniqueName: \"kubernetes.io/projected/3a0286e4-4295-4526-bcf3-f003b4766dec-kube-api-access-wqrlw\") pod \"nova-metadata-0\" (UID: \"3a0286e4-4295-4526-bcf3-f003b4766dec\") " pod="openstack/nova-metadata-0"
Mar 12 17:10:51 crc kubenswrapper[5184]: I0312 17:10:51.040460 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a0286e4-4295-4526-bcf3-f003b4766dec-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3a0286e4-4295-4526-bcf3-f003b4766dec\") " pod="openstack/nova-metadata-0"
Mar 12 17:10:51 crc kubenswrapper[5184]: I0312 17:10:51.040535 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a0286e4-4295-4526-bcf3-f003b4766dec-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3a0286e4-4295-4526-bcf3-f003b4766dec\") " pod="openstack/nova-metadata-0"
Mar 12 17:10:51 crc kubenswrapper[5184]: I0312 17:10:51.040556 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a0286e4-4295-4526-bcf3-f003b4766dec-logs\") pod \"nova-metadata-0\" (UID: \"3a0286e4-4295-4526-bcf3-f003b4766dec\") " pod="openstack/nova-metadata-0"
Mar 12 17:10:51 crc kubenswrapper[5184]: I0312 17:10:51.040605 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a0286e4-4295-4526-bcf3-f003b4766dec-config-data\") pod \"nova-metadata-0\" (UID: \"3a0286e4-4295-4526-bcf3-f003b4766dec\") " pod="openstack/nova-metadata-0"
Mar 12 17:10:51 crc kubenswrapper[5184]: I0312 17:10:51.041571 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a0286e4-4295-4526-bcf3-f003b4766dec-logs\") pod \"nova-metadata-0\" (UID: \"3a0286e4-4295-4526-bcf3-f003b4766dec\") " pod="openstack/nova-metadata-0"
Mar 12 17:10:51 crc kubenswrapper[5184]: I0312 17:10:51.048356 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a0286e4-4295-4526-bcf3-f003b4766dec-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3a0286e4-4295-4526-bcf3-f003b4766dec\") " pod="openstack/nova-metadata-0"
Mar 12 17:10:51 crc kubenswrapper[5184]: I0312 17:10:51.053550 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a0286e4-4295-4526-bcf3-f003b4766dec-config-data\") pod \"nova-metadata-0\" (UID: \"3a0286e4-4295-4526-bcf3-f003b4766dec\") " pod="openstack/nova-metadata-0"
Mar 12 17:10:51 crc kubenswrapper[5184]: I0312 17:10:51.058849 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqrlw\" (UniqueName: \"kubernetes.io/projected/3a0286e4-4295-4526-bcf3-f003b4766dec-kube-api-access-wqrlw\") pod \"nova-metadata-0\" (UID: \"3a0286e4-4295-4526-bcf3-f003b4766dec\") " pod="openstack/nova-metadata-0"
Mar 12 17:10:51 crc kubenswrapper[5184]: I0312 17:10:51.064615 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a0286e4-4295-4526-bcf3-f003b4766dec-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3a0286e4-4295-4526-bcf3-f003b4766dec\") " pod="openstack/nova-metadata-0"
Mar 12 17:10:51 crc kubenswrapper[5184]: I0312 17:10:51.150477 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 12 17:10:51 crc kubenswrapper[5184]: I0312 17:10:51.638274 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"75247eae-d2ef-43a9-a5fc-66bc1f351feb","Type":"ContainerStarted","Data":"23dde2d19bdfbc37257ce44520a2bcd1ed4e3663a70c4620387079c075381180"}
Mar 12 17:10:51 crc kubenswrapper[5184]: I0312 17:10:51.643287 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-cell1-conductor-0"
Mar 12 17:10:51 crc kubenswrapper[5184]: I0312 17:10:51.660482 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 12 17:10:52 crc kubenswrapper[5184]: I0312 17:10:52.412792 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58669f7c-7cd8-4628-9a94-ef53ed4f6a29" path="/var/lib/kubelet/pods/58669f7c-7cd8-4628-9a94-ef53ed4f6a29/volumes"
Mar 12 17:10:52 crc kubenswrapper[5184]: I0312 17:10:52.413728 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d33c92d-847e-48b5-bb1b-a8defe0756f7" path="/var/lib/kubelet/pods/7d33c92d-847e-48b5-bb1b-a8defe0756f7/volumes"
Mar 12 17:10:52 crc kubenswrapper[5184]: E0312 17:10:52.419130 5184 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6d27b4ccdee5ad0e4b967ae1ea93f0b007f25aa89ae4261b4e5c7f01be9ed54f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 12 17:10:52 crc kubenswrapper[5184]: E0312 17:10:52.421278 5184 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6d27b4ccdee5ad0e4b967ae1ea93f0b007f25aa89ae4261b4e5c7f01be9ed54f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 12 17:10:52 crc kubenswrapper[5184]: E0312 17:10:52.423247 5184 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6d27b4ccdee5ad0e4b967ae1ea93f0b007f25aa89ae4261b4e5c7f01be9ed54f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 12 17:10:52 crc kubenswrapper[5184]: E0312 17:10:52.423274 5184 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="d09e0a6f-89c0-466f-93e6-3831659f0613" containerName="nova-scheduler-scheduler" probeResult="unknown"
Mar 12 17:10:52 crc kubenswrapper[5184]: I0312 17:10:52.650224 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3a0286e4-4295-4526-bcf3-f003b4766dec","Type":"ContainerStarted","Data":"137f63f99f6dd3dcd9f68b5f5f59922c09a041048adb62de5102f87b91d6c680"}
Mar 12 17:10:52 crc kubenswrapper[5184]: I0312 17:10:52.650274 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3a0286e4-4295-4526-bcf3-f003b4766dec","Type":"ContainerStarted","Data":"1c6a9151a5b688ce8b550810af6ac9fc7b0c5c4db4296075e66e5b1d2d8e708e"}
Mar 12 17:10:52 crc kubenswrapper[5184]: I0312 17:10:52.650290 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3a0286e4-4295-4526-bcf3-f003b4766dec","Type":"ContainerStarted","Data":"35033e3783a6e00d5059ac920d67e51d4d35d43d624714707d26bd3287fc0254"}
Mar 12 17:10:52 crc kubenswrapper[5184]: I0312 17:10:52.666992 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=3.666968852 podStartE2EDuration="3.666968852s" podCreationTimestamp="2026-03-12 17:10:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:10:51.680555278 +0000 UTC m=+1194.221866657" watchObservedRunningTime="2026-03-12 17:10:52.666968852 +0000 UTC m=+1195.208280211"
Mar 12 17:10:52 crc kubenswrapper[5184]: I0312 17:10:52.668913 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.6688992320000002 podStartE2EDuration="2.668899232s" podCreationTimestamp="2026-03-12 17:10:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:10:52.666819918 +0000 UTC m=+1195.208131277" watchObservedRunningTime="2026-03-12 17:10:52.668899232 +0000 UTC m=+1195.210210581"
Mar 12 17:10:53 crc kubenswrapper[5184]: I0312 17:10:53.682242 5184 generic.go:358] "Generic (PLEG): container finished" podID="d09e0a6f-89c0-466f-93e6-3831659f0613" containerID="6d27b4ccdee5ad0e4b967ae1ea93f0b007f25aa89ae4261b4e5c7f01be9ed54f" exitCode=0
Mar 12 17:10:53 crc kubenswrapper[5184]: I0312 17:10:53.682308 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d09e0a6f-89c0-466f-93e6-3831659f0613","Type":"ContainerDied","Data":"6d27b4ccdee5ad0e4b967ae1ea93f0b007f25aa89ae4261b4e5c7f01be9ed54f"}
Mar 12 17:10:53 crc kubenswrapper[5184]: I0312 17:10:53.853682 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 12 17:10:54 crc kubenswrapper[5184]: I0312 17:10:54.008080 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mb7j\" (UniqueName: \"kubernetes.io/projected/d09e0a6f-89c0-466f-93e6-3831659f0613-kube-api-access-7mb7j\") pod \"d09e0a6f-89c0-466f-93e6-3831659f0613\" (UID: \"d09e0a6f-89c0-466f-93e6-3831659f0613\") "
Mar 12 17:10:54 crc kubenswrapper[5184]: I0312 17:10:54.008334 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d09e0a6f-89c0-466f-93e6-3831659f0613-combined-ca-bundle\") pod \"d09e0a6f-89c0-466f-93e6-3831659f0613\" (UID: \"d09e0a6f-89c0-466f-93e6-3831659f0613\") "
Mar 12 17:10:54 crc kubenswrapper[5184]: I0312 17:10:54.008498 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d09e0a6f-89c0-466f-93e6-3831659f0613-config-data\") pod \"d09e0a6f-89c0-466f-93e6-3831659f0613\" (UID: \"d09e0a6f-89c0-466f-93e6-3831659f0613\") "
Mar 12 17:10:54 crc kubenswrapper[5184]: I0312 17:10:54.016564 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d09e0a6f-89c0-466f-93e6-3831659f0613-kube-api-access-7mb7j" (OuterVolumeSpecName: "kube-api-access-7mb7j") pod "d09e0a6f-89c0-466f-93e6-3831659f0613" (UID: "d09e0a6f-89c0-466f-93e6-3831659f0613"). InnerVolumeSpecName "kube-api-access-7mb7j". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:10:54 crc kubenswrapper[5184]: I0312 17:10:54.050738 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d09e0a6f-89c0-466f-93e6-3831659f0613-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d09e0a6f-89c0-466f-93e6-3831659f0613" (UID: "d09e0a6f-89c0-466f-93e6-3831659f0613"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 17:10:54 crc kubenswrapper[5184]: I0312 17:10:54.054652 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d09e0a6f-89c0-466f-93e6-3831659f0613-config-data" (OuterVolumeSpecName: "config-data") pod "d09e0a6f-89c0-466f-93e6-3831659f0613" (UID: "d09e0a6f-89c0-466f-93e6-3831659f0613"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 17:10:54 crc kubenswrapper[5184]: I0312 17:10:54.110837 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7mb7j\" (UniqueName: \"kubernetes.io/projected/d09e0a6f-89c0-466f-93e6-3831659f0613-kube-api-access-7mb7j\") on node \"crc\" DevicePath \"\""
Mar 12 17:10:54 crc kubenswrapper[5184]: I0312 17:10:54.110889 5184 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d09e0a6f-89c0-466f-93e6-3831659f0613-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 17:10:54 crc kubenswrapper[5184]: I0312 17:10:54.110907 5184 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d09e0a6f-89c0-466f-93e6-3831659f0613-config-data\") on node \"crc\" DevicePath \"\""
Mar 12 17:10:54 crc kubenswrapper[5184]: I0312 17:10:54.694400 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d09e0a6f-89c0-466f-93e6-3831659f0613","Type":"ContainerDied","Data":"aadd7a5c96a04e79659298f42da90d211813790206986ac7c030f700ab505ee5"}
Mar 12 17:10:54 crc kubenswrapper[5184]: I0312 17:10:54.694693 5184 scope.go:117] "RemoveContainer" containerID="6d27b4ccdee5ad0e4b967ae1ea93f0b007f25aa89ae4261b4e5c7f01be9ed54f"
Mar 12 17:10:54 crc kubenswrapper[5184]: I0312 17:10:54.694444 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 12 17:10:54 crc kubenswrapper[5184]: I0312 17:10:54.718096 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 12 17:10:54 crc kubenswrapper[5184]: I0312 17:10:54.725789 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 12 17:10:54 crc kubenswrapper[5184]: I0312 17:10:54.745038 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 12 17:10:54 crc kubenswrapper[5184]: I0312 17:10:54.746418 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d09e0a6f-89c0-466f-93e6-3831659f0613" containerName="nova-scheduler-scheduler"
Mar 12 17:10:54 crc kubenswrapper[5184]: I0312 17:10:54.746438 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="d09e0a6f-89c0-466f-93e6-3831659f0613" containerName="nova-scheduler-scheduler"
Mar 12 17:10:54 crc kubenswrapper[5184]: I0312 17:10:54.746773 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="d09e0a6f-89c0-466f-93e6-3831659f0613" containerName="nova-scheduler-scheduler"
Mar 12 17:10:54 crc kubenswrapper[5184]: I0312 17:10:54.754061 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 12 17:10:54 crc kubenswrapper[5184]: I0312 17:10:54.756268 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-scheduler-config-data\""
Mar 12 17:10:54 crc kubenswrapper[5184]: I0312 17:10:54.758439 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 12 17:10:54 crc kubenswrapper[5184]: I0312 17:10:54.822596 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lf57\" (UniqueName: \"kubernetes.io/projected/ff1db6e5-9786-448c-bb54-82eb3ea089a6-kube-api-access-5lf57\") pod \"nova-scheduler-0\" (UID: \"ff1db6e5-9786-448c-bb54-82eb3ea089a6\") " pod="openstack/nova-scheduler-0"
Mar 12 17:10:54 crc kubenswrapper[5184]: I0312 17:10:54.822690 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff1db6e5-9786-448c-bb54-82eb3ea089a6-config-data\") pod \"nova-scheduler-0\" (UID: \"ff1db6e5-9786-448c-bb54-82eb3ea089a6\") " pod="openstack/nova-scheduler-0"
Mar 12 17:10:54 crc kubenswrapper[5184]: I0312 17:10:54.822809 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff1db6e5-9786-448c-bb54-82eb3ea089a6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ff1db6e5-9786-448c-bb54-82eb3ea089a6\") " pod="openstack/nova-scheduler-0"
Mar 12 17:10:54 crc kubenswrapper[5184]: I0312 17:10:54.924346 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5lf57\" (UniqueName: \"kubernetes.io/projected/ff1db6e5-9786-448c-bb54-82eb3ea089a6-kube-api-access-5lf57\") pod \"nova-scheduler-0\" (UID: \"ff1db6e5-9786-448c-bb54-82eb3ea089a6\") " pod="openstack/nova-scheduler-0"
Mar 12 17:10:54 crc kubenswrapper[5184]: I0312 17:10:54.924424 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff1db6e5-9786-448c-bb54-82eb3ea089a6-config-data\") pod \"nova-scheduler-0\" (UID: \"ff1db6e5-9786-448c-bb54-82eb3ea089a6\") " pod="openstack/nova-scheduler-0"
Mar 12 17:10:54 crc kubenswrapper[5184]: I0312 17:10:54.924510 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff1db6e5-9786-448c-bb54-82eb3ea089a6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ff1db6e5-9786-448c-bb54-82eb3ea089a6\") " pod="openstack/nova-scheduler-0"
Mar 12 17:10:54 crc kubenswrapper[5184]: I0312 17:10:54.928919 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff1db6e5-9786-448c-bb54-82eb3ea089a6-config-data\") pod \"nova-scheduler-0\" (UID: \"ff1db6e5-9786-448c-bb54-82eb3ea089a6\") " pod="openstack/nova-scheduler-0"
Mar 12 17:10:54 crc kubenswrapper[5184]: I0312 17:10:54.929664 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff1db6e5-9786-448c-bb54-82eb3ea089a6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ff1db6e5-9786-448c-bb54-82eb3ea089a6\") " pod="openstack/nova-scheduler-0"
Mar 12 17:10:54 crc kubenswrapper[5184]: I0312 17:10:54.949310 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lf57\" (UniqueName: \"kubernetes.io/projected/ff1db6e5-9786-448c-bb54-82eb3ea089a6-kube-api-access-5lf57\") pod \"nova-scheduler-0\" (UID: \"ff1db6e5-9786-448c-bb54-82eb3ea089a6\") " pod="openstack/nova-scheduler-0"
Mar 12 17:10:55 crc kubenswrapper[5184]: I0312 17:10:55.114936 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 12 17:10:55 crc kubenswrapper[5184]: I0312 17:10:55.334761 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 12 17:10:55 crc kubenswrapper[5184]: I0312 17:10:55.434562 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmg7r\" (UniqueName: \"kubernetes.io/projected/63387085-af7f-404d-bfce-0df2471fbad4-kube-api-access-dmg7r\") pod \"63387085-af7f-404d-bfce-0df2471fbad4\" (UID: \"63387085-af7f-404d-bfce-0df2471fbad4\") "
Mar 12 17:10:55 crc kubenswrapper[5184]: I0312 17:10:55.434723 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63387085-af7f-404d-bfce-0df2471fbad4-logs\") pod \"63387085-af7f-404d-bfce-0df2471fbad4\" (UID: \"63387085-af7f-404d-bfce-0df2471fbad4\") "
Mar 12 17:10:55 crc kubenswrapper[5184]: I0312 17:10:55.434793 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63387085-af7f-404d-bfce-0df2471fbad4-config-data\") pod \"63387085-af7f-404d-bfce-0df2471fbad4\" (UID: \"63387085-af7f-404d-bfce-0df2471fbad4\") "
Mar 12 17:10:55 crc kubenswrapper[5184]: I0312 17:10:55.434903 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63387085-af7f-404d-bfce-0df2471fbad4-combined-ca-bundle\") pod \"63387085-af7f-404d-bfce-0df2471fbad4\" (UID: \"63387085-af7f-404d-bfce-0df2471fbad4\") "
Mar 12 17:10:55 crc kubenswrapper[5184]: I0312 17:10:55.435548 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63387085-af7f-404d-bfce-0df2471fbad4-logs" (OuterVolumeSpecName: "logs") pod "63387085-af7f-404d-bfce-0df2471fbad4" (UID: "63387085-af7f-404d-bfce-0df2471fbad4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 12 17:10:55 crc kubenswrapper[5184]: I0312 17:10:55.440911 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63387085-af7f-404d-bfce-0df2471fbad4-kube-api-access-dmg7r" (OuterVolumeSpecName: "kube-api-access-dmg7r") pod "63387085-af7f-404d-bfce-0df2471fbad4" (UID: "63387085-af7f-404d-bfce-0df2471fbad4"). InnerVolumeSpecName "kube-api-access-dmg7r". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:10:55 crc kubenswrapper[5184]: I0312 17:10:55.462538 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63387085-af7f-404d-bfce-0df2471fbad4-config-data" (OuterVolumeSpecName: "config-data") pod "63387085-af7f-404d-bfce-0df2471fbad4" (UID: "63387085-af7f-404d-bfce-0df2471fbad4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 17:10:55 crc kubenswrapper[5184]: I0312 17:10:55.466650 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63387085-af7f-404d-bfce-0df2471fbad4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63387085-af7f-404d-bfce-0df2471fbad4" (UID: "63387085-af7f-404d-bfce-0df2471fbad4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 17:10:55 crc kubenswrapper[5184]: I0312 17:10:55.538429 5184 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63387085-af7f-404d-bfce-0df2471fbad4-config-data\") on node \"crc\" DevicePath \"\""
Mar 12 17:10:55 crc kubenswrapper[5184]: I0312 17:10:55.538482 5184 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63387085-af7f-404d-bfce-0df2471fbad4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 17:10:55 crc kubenswrapper[5184]: I0312 17:10:55.538503 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dmg7r\" (UniqueName: \"kubernetes.io/projected/63387085-af7f-404d-bfce-0df2471fbad4-kube-api-access-dmg7r\") on node \"crc\" DevicePath \"\""
Mar 12 17:10:55 crc kubenswrapper[5184]: I0312 17:10:55.538524 5184 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63387085-af7f-404d-bfce-0df2471fbad4-logs\") on node \"crc\" DevicePath \"\""
Mar 12 17:10:55 crc kubenswrapper[5184]: I0312 17:10:55.581480 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 12 17:10:55 crc kubenswrapper[5184]: W0312 17:10:55.588972 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff1db6e5_9786_448c_bb54_82eb3ea089a6.slice/crio-2297395eca0112918c9a4fc740dc0fd23ef62316ed60d953e911185a76858943 WatchSource:0}: Error finding container 2297395eca0112918c9a4fc740dc0fd23ef62316ed60d953e911185a76858943: Status 404 returned error can't find the container with id 2297395eca0112918c9a4fc740dc0fd23ef62316ed60d953e911185a76858943
Mar 12 17:10:55 crc kubenswrapper[5184]: I0312 17:10:55.715700 5184 generic.go:358] "Generic (PLEG): container finished" podID="63387085-af7f-404d-bfce-0df2471fbad4" containerID="3ae83fd738c85479041a117b03d4951b1f2f5a6ee0df6856d3de527b4299786c" exitCode=0
Mar 12 17:10:55 crc kubenswrapper[5184]: I0312 17:10:55.715752 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"63387085-af7f-404d-bfce-0df2471fbad4","Type":"ContainerDied","Data":"3ae83fd738c85479041a117b03d4951b1f2f5a6ee0df6856d3de527b4299786c"}
Mar 12 17:10:55 crc kubenswrapper[5184]: I0312 17:10:55.715805 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"63387085-af7f-404d-bfce-0df2471fbad4","Type":"ContainerDied","Data":"4b12df73144d58187505415d2d7eb0b38621b6d974d9576b86848f55fe71ea3a"}
Mar 12 17:10:55 crc kubenswrapper[5184]: I0312 17:10:55.715832 5184 scope.go:117] "RemoveContainer" containerID="3ae83fd738c85479041a117b03d4951b1f2f5a6ee0df6856d3de527b4299786c"
Mar 12 17:10:55 crc kubenswrapper[5184]: I0312 17:10:55.715875 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 12 17:10:55 crc kubenswrapper[5184]: I0312 17:10:55.719763 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ff1db6e5-9786-448c-bb54-82eb3ea089a6","Type":"ContainerStarted","Data":"2297395eca0112918c9a4fc740dc0fd23ef62316ed60d953e911185a76858943"}
Mar 12 17:10:55 crc kubenswrapper[5184]: I0312 17:10:55.769075 5184 scope.go:117] "RemoveContainer" containerID="77f95d9aab4062457b2acaaccb5d419fb92830ec2e6f6cea676315a29c6b22cf"
Mar 12 17:10:55 crc kubenswrapper[5184]: I0312 17:10:55.782651 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 12 17:10:55 crc kubenswrapper[5184]: I0312 17:10:55.799339 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 12 17:10:55 crc kubenswrapper[5184]: I0312 17:10:55.811844 5184 scope.go:117] "RemoveContainer" containerID="3ae83fd738c85479041a117b03d4951b1f2f5a6ee0df6856d3de527b4299786c"
Mar 12 17:10:55 crc kubenswrapper[5184]: E0312 17:10:55.812372 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ae83fd738c85479041a117b03d4951b1f2f5a6ee0df6856d3de527b4299786c\": container with ID starting with 3ae83fd738c85479041a117b03d4951b1f2f5a6ee0df6856d3de527b4299786c not found: ID does not exist" containerID="3ae83fd738c85479041a117b03d4951b1f2f5a6ee0df6856d3de527b4299786c"
Mar 12 17:10:55 crc kubenswrapper[5184]: I0312 17:10:55.812412 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ae83fd738c85479041a117b03d4951b1f2f5a6ee0df6856d3de527b4299786c"} err="failed to get container status \"3ae83fd738c85479041a117b03d4951b1f2f5a6ee0df6856d3de527b4299786c\": rpc error: code = NotFound desc = could not find container \"3ae83fd738c85479041a117b03d4951b1f2f5a6ee0df6856d3de527b4299786c\": container with ID starting with 3ae83fd738c85479041a117b03d4951b1f2f5a6ee0df6856d3de527b4299786c not found: ID does not exist"
Mar 12 17:10:55 crc kubenswrapper[5184]: I0312 17:10:55.812431 5184 scope.go:117] "RemoveContainer" containerID="77f95d9aab4062457b2acaaccb5d419fb92830ec2e6f6cea676315a29c6b22cf"
Mar 12 17:10:55 crc kubenswrapper[5184]: E0312 17:10:55.812749 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77f95d9aab4062457b2acaaccb5d419fb92830ec2e6f6cea676315a29c6b22cf\": container with ID starting with 77f95d9aab4062457b2acaaccb5d419fb92830ec2e6f6cea676315a29c6b22cf not found: ID does not exist" containerID="77f95d9aab4062457b2acaaccb5d419fb92830ec2e6f6cea676315a29c6b22cf"
Mar 12 17:10:55 crc kubenswrapper[5184]: I0312 17:10:55.812780 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77f95d9aab4062457b2acaaccb5d419fb92830ec2e6f6cea676315a29c6b22cf"} err="failed to get container status \"77f95d9aab4062457b2acaaccb5d419fb92830ec2e6f6cea676315a29c6b22cf\": rpc error: code = NotFound desc = could not find container \"77f95d9aab4062457b2acaaccb5d419fb92830ec2e6f6cea676315a29c6b22cf\": container with ID starting with 77f95d9aab4062457b2acaaccb5d419fb92830ec2e6f6cea676315a29c6b22cf not found: ID does not exist"
Mar 12 17:10:55 crc kubenswrapper[5184]: I0312 17:10:55.814293 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 12 17:10:55 crc kubenswrapper[5184]: I0312 17:10:55.815738 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="63387085-af7f-404d-bfce-0df2471fbad4" containerName="nova-api-api"
Mar 12 17:10:55 crc kubenswrapper[5184]: I0312 17:10:55.815767 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="63387085-af7f-404d-bfce-0df2471fbad4" containerName="nova-api-api"
Mar 12 17:10:55 crc kubenswrapper[5184]: I0312 17:10:55.815800 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="63387085-af7f-404d-bfce-0df2471fbad4" containerName="nova-api-log"
Mar 12 17:10:55 crc kubenswrapper[5184]: I0312 17:10:55.815810 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="63387085-af7f-404d-bfce-0df2471fbad4" containerName="nova-api-log"
Mar 12 17:10:55 crc kubenswrapper[5184]: I0312 17:10:55.816059 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="63387085-af7f-404d-bfce-0df2471fbad4" containerName="nova-api-api"
Mar 12 17:10:55 crc kubenswrapper[5184]: I0312 17:10:55.816088 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="63387085-af7f-404d-bfce-0df2471fbad4" containerName="nova-api-log"
Mar 12 17:10:55 crc kubenswrapper[5184]: I0312 17:10:55.820972 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 12 17:10:55 crc kubenswrapper[5184]: I0312 17:10:55.826982 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-api-config-data\""
Mar 12 17:10:55 crc kubenswrapper[5184]: I0312 17:10:55.843586 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 12 17:10:55 crc kubenswrapper[5184]: I0312 17:10:55.947064 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c285c573-43cc-4990-86ba-9b28a19e98aa-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c285c573-43cc-4990-86ba-9b28a19e98aa\") " pod="openstack/nova-api-0"
Mar 12 17:10:55 crc kubenswrapper[5184]: I0312 17:10:55.947525 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c285c573-43cc-4990-86ba-9b28a19e98aa-logs\") pod \"nova-api-0\" (UID: \"c285c573-43cc-4990-86ba-9b28a19e98aa\") " pod="openstack/nova-api-0"
Mar 12 17:10:55 crc kubenswrapper[5184]: I0312 17:10:55.947555 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c285c573-43cc-4990-86ba-9b28a19e98aa-config-data\") pod \"nova-api-0\" (UID: \"c285c573-43cc-4990-86ba-9b28a19e98aa\") " pod="openstack/nova-api-0"
Mar 12 17:10:55 crc kubenswrapper[5184]: I0312 17:10:55.947602 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmbst\" (UniqueName: \"kubernetes.io/projected/c285c573-43cc-4990-86ba-9b28a19e98aa-kube-api-access-xmbst\") pod \"nova-api-0\" (UID: \"c285c573-43cc-4990-86ba-9b28a19e98aa\") " pod="openstack/nova-api-0"
Mar 12 17:10:56 crc kubenswrapper[5184]: I0312 17:10:56.049666 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c285c573-43cc-4990-86ba-9b28a19e98aa-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c285c573-43cc-4990-86ba-9b28a19e98aa\") " pod="openstack/nova-api-0"
Mar 12 17:10:56 crc kubenswrapper[5184]: I0312 17:10:56.049751 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c285c573-43cc-4990-86ba-9b28a19e98aa-logs\") pod \"nova-api-0\" (UID: \"c285c573-43cc-4990-86ba-9b28a19e98aa\") " pod="openstack/nova-api-0"
Mar 12 17:10:56 crc kubenswrapper[5184]: I0312 17:10:56.049781 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c285c573-43cc-4990-86ba-9b28a19e98aa-config-data\") pod \"nova-api-0\" (UID: \"c285c573-43cc-4990-86ba-9b28a19e98aa\") " pod="openstack/nova-api-0"
Mar 12 17:10:56 crc kubenswrapper[5184]: I0312 17:10:56.049805 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xmbst\" (UniqueName: \"kubernetes.io/projected/c285c573-43cc-4990-86ba-9b28a19e98aa-kube-api-access-xmbst\") pod \"nova-api-0\" (UID: \"c285c573-43cc-4990-86ba-9b28a19e98aa\") " pod="openstack/nova-api-0"
Mar 12 17:10:56 crc kubenswrapper[5184]: I0312 17:10:56.053199 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c285c573-43cc-4990-86ba-9b28a19e98aa-logs\") pod \"nova-api-0\" (UID: \"c285c573-43cc-4990-86ba-9b28a19e98aa\") " pod="openstack/nova-api-0"
Mar 12 17:10:56 crc kubenswrapper[5184]: I0312 17:10:56.058512 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c285c573-43cc-4990-86ba-9b28a19e98aa-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c285c573-43cc-4990-86ba-9b28a19e98aa\") " pod="openstack/nova-api-0"
Mar 12 17:10:56 crc kubenswrapper[5184]: I0312 17:10:56.063507 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c285c573-43cc-4990-86ba-9b28a19e98aa-config-data\") pod \"nova-api-0\" (UID: \"c285c573-43cc-4990-86ba-9b28a19e98aa\") " pod="openstack/nova-api-0"
Mar 12 17:10:56 crc kubenswrapper[5184]: I0312 17:10:56.076233 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmbst\" (UniqueName: \"kubernetes.io/projected/c285c573-43cc-4990-86ba-9b28a19e98aa-kube-api-access-xmbst\") pod \"nova-api-0\" (UID: \"c285c573-43cc-4990-86ba-9b28a19e98aa\") " pod="openstack/nova-api-0"
Mar 12 17:10:56 crc kubenswrapper[5184]: I0312 17:10:56.151290 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-metadata-0"
Mar 12 17:10:56 crc kubenswrapper[5184]: I0312 17:10:56.151467 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-metadata-0"
Mar 12 17:10:56 crc kubenswrapper[5184]: I0312 17:10:56.154140 5184 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-api-0" Mar 12 17:10:56 crc kubenswrapper[5184]: I0312 17:10:56.412637 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63387085-af7f-404d-bfce-0df2471fbad4" path="/var/lib/kubelet/pods/63387085-af7f-404d-bfce-0df2471fbad4/volumes" Mar 12 17:10:56 crc kubenswrapper[5184]: I0312 17:10:56.414054 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d09e0a6f-89c0-466f-93e6-3831659f0613" path="/var/lib/kubelet/pods/d09e0a6f-89c0-466f-93e6-3831659f0613/volumes" Mar 12 17:10:56 crc kubenswrapper[5184]: I0312 17:10:56.593456 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 12 17:10:56 crc kubenswrapper[5184]: I0312 17:10:56.740298 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ff1db6e5-9786-448c-bb54-82eb3ea089a6","Type":"ContainerStarted","Data":"25a26e94050e8d02eb0980ea642e3c7bb251529c36792798ffa5eff13d5843b6"} Mar 12 17:10:56 crc kubenswrapper[5184]: I0312 17:10:56.750763 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c285c573-43cc-4990-86ba-9b28a19e98aa","Type":"ContainerStarted","Data":"12a9df724d91fa57eb7e2d40cb1afd793a3962d25ced14bc8289665090066586"} Mar 12 17:10:56 crc kubenswrapper[5184]: I0312 17:10:56.765097 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.765077819 podStartE2EDuration="2.765077819s" podCreationTimestamp="2026-03-12 17:10:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:10:56.759138324 +0000 UTC m=+1199.300449683" watchObservedRunningTime="2026-03-12 17:10:56.765077819 +0000 UTC m=+1199.306389158" Mar 12 17:10:57 crc kubenswrapper[5184]: I0312 17:10:57.779211 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"c285c573-43cc-4990-86ba-9b28a19e98aa","Type":"ContainerStarted","Data":"18b498a51d3670f6dcc5eebb73cd3de5b672b0c75c6816e32a7733dc199599d0"} Mar 12 17:10:57 crc kubenswrapper[5184]: I0312 17:10:57.779644 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c285c573-43cc-4990-86ba-9b28a19e98aa","Type":"ContainerStarted","Data":"ba46a47d9b20277c73a48b1fa3204af8ff958f85a4c540fda116ccf9fddb2910"} Mar 12 17:10:57 crc kubenswrapper[5184]: I0312 17:10:57.810289 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.810265921 podStartE2EDuration="2.810265921s" podCreationTimestamp="2026-03-12 17:10:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:10:57.796964328 +0000 UTC m=+1200.338275677" watchObservedRunningTime="2026-03-12 17:10:57.810265921 +0000 UTC m=+1200.351577270" Mar 12 17:10:58 crc kubenswrapper[5184]: I0312 17:10:58.721153 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 12 17:10:58 crc kubenswrapper[5184]: I0312 17:10:58.970512 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-99gtj_542903c2-fc88-4085-979a-db3766958392/kube-multus/0.log" Mar 12 17:10:58 crc kubenswrapper[5184]: I0312 17:10:58.970514 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-99gtj_542903c2-fc88-4085-979a-db3766958392/kube-multus/0.log" Mar 12 17:10:58 crc kubenswrapper[5184]: I0312 17:10:58.981083 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log" Mar 12 17:10:58 crc kubenswrapper[5184]: I0312 17:10:58.981093 5184 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log" Mar 12 17:11:00 crc kubenswrapper[5184]: I0312 17:11:00.115746 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-scheduler-0" Mar 12 17:11:01 crc kubenswrapper[5184]: I0312 17:11:01.151618 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 12 17:11:01 crc kubenswrapper[5184]: I0312 17:11:01.152051 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 12 17:11:02 crc kubenswrapper[5184]: I0312 17:11:02.161594 5184 prober.go:120] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3a0286e4-4295-4526-bcf3-f003b4766dec" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 12 17:11:02 crc kubenswrapper[5184]: I0312 17:11:02.161629 5184 prober.go:120] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3a0286e4-4295-4526-bcf3-f003b4766dec" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 12 17:11:05 crc kubenswrapper[5184]: I0312 17:11:05.115711 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 12 17:11:05 crc kubenswrapper[5184]: I0312 17:11:05.151626 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 12 17:11:05 crc kubenswrapper[5184]: I0312 17:11:05.928658 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 12 17:11:06 crc kubenswrapper[5184]: I0312 17:11:06.153568 5184 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 12 17:11:06 crc kubenswrapper[5184]: I0312 17:11:06.153655 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 12 17:11:07 crc kubenswrapper[5184]: I0312 17:11:07.194597 5184 prober.go:120] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c285c573-43cc-4990-86ba-9b28a19e98aa" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.201:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 12 17:11:07 crc kubenswrapper[5184]: I0312 17:11:07.194602 5184 prober.go:120] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c285c573-43cc-4990-86ba-9b28a19e98aa" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.201:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 12 17:11:10 crc kubenswrapper[5184]: I0312 17:11:10.288322 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 12 17:11:11 crc kubenswrapper[5184]: I0312 17:11:11.155909 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 12 17:11:11 crc kubenswrapper[5184]: I0312 17:11:11.164130 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 12 17:11:11 crc kubenswrapper[5184]: I0312 17:11:11.168062 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 12 17:11:11 crc kubenswrapper[5184]: I0312 17:11:11.965744 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 12 17:11:12 crc kubenswrapper[5184]: I0312 17:11:12.814624 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 12 17:11:12 crc kubenswrapper[5184]: I0312 17:11:12.830008 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02994126-30bf-4b42-be17-a1fdb7ad571a-config-data\") pod \"02994126-30bf-4b42-be17-a1fdb7ad571a\" (UID: \"02994126-30bf-4b42-be17-a1fdb7ad571a\") " Mar 12 17:11:12 crc kubenswrapper[5184]: I0312 17:11:12.830130 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgf8j\" (UniqueName: \"kubernetes.io/projected/02994126-30bf-4b42-be17-a1fdb7ad571a-kube-api-access-zgf8j\") pod \"02994126-30bf-4b42-be17-a1fdb7ad571a\" (UID: \"02994126-30bf-4b42-be17-a1fdb7ad571a\") " Mar 12 17:11:12 crc kubenswrapper[5184]: I0312 17:11:12.830393 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02994126-30bf-4b42-be17-a1fdb7ad571a-combined-ca-bundle\") pod \"02994126-30bf-4b42-be17-a1fdb7ad571a\" (UID: \"02994126-30bf-4b42-be17-a1fdb7ad571a\") " Mar 12 17:11:12 crc kubenswrapper[5184]: I0312 17:11:12.847566 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02994126-30bf-4b42-be17-a1fdb7ad571a-kube-api-access-zgf8j" (OuterVolumeSpecName: "kube-api-access-zgf8j") pod "02994126-30bf-4b42-be17-a1fdb7ad571a" (UID: "02994126-30bf-4b42-be17-a1fdb7ad571a"). InnerVolumeSpecName "kube-api-access-zgf8j". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:11:12 crc kubenswrapper[5184]: I0312 17:11:12.892782 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02994126-30bf-4b42-be17-a1fdb7ad571a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02994126-30bf-4b42-be17-a1fdb7ad571a" (UID: "02994126-30bf-4b42-be17-a1fdb7ad571a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:11:12 crc kubenswrapper[5184]: I0312 17:11:12.933100 5184 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02994126-30bf-4b42-be17-a1fdb7ad571a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 17:11:12 crc kubenswrapper[5184]: I0312 17:11:12.933146 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zgf8j\" (UniqueName: \"kubernetes.io/projected/02994126-30bf-4b42-be17-a1fdb7ad571a-kube-api-access-zgf8j\") on node \"crc\" DevicePath \"\"" Mar 12 17:11:12 crc kubenswrapper[5184]: I0312 17:11:12.950511 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02994126-30bf-4b42-be17-a1fdb7ad571a-config-data" (OuterVolumeSpecName: "config-data") pod "02994126-30bf-4b42-be17-a1fdb7ad571a" (UID: "02994126-30bf-4b42-be17-a1fdb7ad571a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:11:12 crc kubenswrapper[5184]: I0312 17:11:12.977916 5184 generic.go:358] "Generic (PLEG): container finished" podID="02994126-30bf-4b42-be17-a1fdb7ad571a" containerID="0f57134c846eea015a8a8899f670d70d941602ffd3bea4efbec4f89dee571a88" exitCode=137 Mar 12 17:11:12 crc kubenswrapper[5184]: I0312 17:11:12.977986 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 12 17:11:12 crc kubenswrapper[5184]: I0312 17:11:12.978472 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"02994126-30bf-4b42-be17-a1fdb7ad571a","Type":"ContainerDied","Data":"0f57134c846eea015a8a8899f670d70d941602ffd3bea4efbec4f89dee571a88"} Mar 12 17:11:12 crc kubenswrapper[5184]: I0312 17:11:12.978511 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"02994126-30bf-4b42-be17-a1fdb7ad571a","Type":"ContainerDied","Data":"6cc88a833ecfc359cd015328b4d417715e56bc705b2944894a1c9c95e1c37d36"} Mar 12 17:11:12 crc kubenswrapper[5184]: I0312 17:11:12.978528 5184 scope.go:117] "RemoveContainer" containerID="0f57134c846eea015a8a8899f670d70d941602ffd3bea4efbec4f89dee571a88" Mar 12 17:11:13 crc kubenswrapper[5184]: I0312 17:11:13.011521 5184 scope.go:117] "RemoveContainer" containerID="0f57134c846eea015a8a8899f670d70d941602ffd3bea4efbec4f89dee571a88" Mar 12 17:11:13 crc kubenswrapper[5184]: E0312 17:11:13.012580 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f57134c846eea015a8a8899f670d70d941602ffd3bea4efbec4f89dee571a88\": container with ID starting with 0f57134c846eea015a8a8899f670d70d941602ffd3bea4efbec4f89dee571a88 not found: ID does not exist" containerID="0f57134c846eea015a8a8899f670d70d941602ffd3bea4efbec4f89dee571a88" Mar 12 17:11:13 crc kubenswrapper[5184]: I0312 17:11:13.012618 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f57134c846eea015a8a8899f670d70d941602ffd3bea4efbec4f89dee571a88"} err="failed to get container status \"0f57134c846eea015a8a8899f670d70d941602ffd3bea4efbec4f89dee571a88\": rpc error: code = NotFound desc = could not find container \"0f57134c846eea015a8a8899f670d70d941602ffd3bea4efbec4f89dee571a88\": container with ID starting with 
0f57134c846eea015a8a8899f670d70d941602ffd3bea4efbec4f89dee571a88 not found: ID does not exist" Mar 12 17:11:13 crc kubenswrapper[5184]: I0312 17:11:13.015729 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 12 17:11:13 crc kubenswrapper[5184]: I0312 17:11:13.028543 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 12 17:11:13 crc kubenswrapper[5184]: I0312 17:11:13.035174 5184 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02994126-30bf-4b42-be17-a1fdb7ad571a-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 17:11:13 crc kubenswrapper[5184]: I0312 17:11:13.038189 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 12 17:11:13 crc kubenswrapper[5184]: I0312 17:11:13.039341 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="02994126-30bf-4b42-be17-a1fdb7ad571a" containerName="nova-cell1-novncproxy-novncproxy" Mar 12 17:11:13 crc kubenswrapper[5184]: I0312 17:11:13.039394 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="02994126-30bf-4b42-be17-a1fdb7ad571a" containerName="nova-cell1-novncproxy-novncproxy" Mar 12 17:11:13 crc kubenswrapper[5184]: I0312 17:11:13.039623 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="02994126-30bf-4b42-be17-a1fdb7ad571a" containerName="nova-cell1-novncproxy-novncproxy" Mar 12 17:11:13 crc kubenswrapper[5184]: I0312 17:11:13.045241 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 12 17:11:13 crc kubenswrapper[5184]: I0312 17:11:13.050910 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-nova-novncproxy-cell1-public-svc\"" Mar 12 17:11:13 crc kubenswrapper[5184]: I0312 17:11:13.051554 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-nova-novncproxy-cell1-vencrypt\"" Mar 12 17:11:13 crc kubenswrapper[5184]: I0312 17:11:13.051571 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell1-novncproxy-config-data\"" Mar 12 17:11:13 crc kubenswrapper[5184]: I0312 17:11:13.068096 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 12 17:11:13 crc kubenswrapper[5184]: I0312 17:11:13.137437 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0e5f35a-1622-4ec0-84e7-56af1f798978-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d0e5f35a-1622-4ec0-84e7-56af1f798978\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 17:11:13 crc kubenswrapper[5184]: I0312 17:11:13.137508 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0e5f35a-1622-4ec0-84e7-56af1f798978-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d0e5f35a-1622-4ec0-84e7-56af1f798978\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 17:11:13 crc kubenswrapper[5184]: I0312 17:11:13.137612 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e5f35a-1622-4ec0-84e7-56af1f798978-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d0e5f35a-1622-4ec0-84e7-56af1f798978\") " 
pod="openstack/nova-cell1-novncproxy-0" Mar 12 17:11:13 crc kubenswrapper[5184]: I0312 17:11:13.137643 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf94b\" (UniqueName: \"kubernetes.io/projected/d0e5f35a-1622-4ec0-84e7-56af1f798978-kube-api-access-zf94b\") pod \"nova-cell1-novncproxy-0\" (UID: \"d0e5f35a-1622-4ec0-84e7-56af1f798978\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 17:11:13 crc kubenswrapper[5184]: I0312 17:11:13.137661 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0e5f35a-1622-4ec0-84e7-56af1f798978-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d0e5f35a-1622-4ec0-84e7-56af1f798978\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 17:11:13 crc kubenswrapper[5184]: I0312 17:11:13.239258 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0e5f35a-1622-4ec0-84e7-56af1f798978-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d0e5f35a-1622-4ec0-84e7-56af1f798978\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 17:11:13 crc kubenswrapper[5184]: I0312 17:11:13.239363 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e5f35a-1622-4ec0-84e7-56af1f798978-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d0e5f35a-1622-4ec0-84e7-56af1f798978\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 17:11:13 crc kubenswrapper[5184]: I0312 17:11:13.239405 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zf94b\" (UniqueName: \"kubernetes.io/projected/d0e5f35a-1622-4ec0-84e7-56af1f798978-kube-api-access-zf94b\") pod \"nova-cell1-novncproxy-0\" (UID: \"d0e5f35a-1622-4ec0-84e7-56af1f798978\") " 
pod="openstack/nova-cell1-novncproxy-0" Mar 12 17:11:13 crc kubenswrapper[5184]: I0312 17:11:13.239422 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0e5f35a-1622-4ec0-84e7-56af1f798978-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d0e5f35a-1622-4ec0-84e7-56af1f798978\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 17:11:13 crc kubenswrapper[5184]: I0312 17:11:13.239511 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0e5f35a-1622-4ec0-84e7-56af1f798978-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d0e5f35a-1622-4ec0-84e7-56af1f798978\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 17:11:13 crc kubenswrapper[5184]: I0312 17:11:13.245861 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0e5f35a-1622-4ec0-84e7-56af1f798978-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d0e5f35a-1622-4ec0-84e7-56af1f798978\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 17:11:13 crc kubenswrapper[5184]: I0312 17:11:13.246340 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e5f35a-1622-4ec0-84e7-56af1f798978-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d0e5f35a-1622-4ec0-84e7-56af1f798978\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 17:11:13 crc kubenswrapper[5184]: I0312 17:11:13.246360 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0e5f35a-1622-4ec0-84e7-56af1f798978-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d0e5f35a-1622-4ec0-84e7-56af1f798978\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 17:11:13 crc kubenswrapper[5184]: I0312 17:11:13.246932 5184 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0e5f35a-1622-4ec0-84e7-56af1f798978-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d0e5f35a-1622-4ec0-84e7-56af1f798978\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 17:11:13 crc kubenswrapper[5184]: I0312 17:11:13.262931 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf94b\" (UniqueName: \"kubernetes.io/projected/d0e5f35a-1622-4ec0-84e7-56af1f798978-kube-api-access-zf94b\") pod \"nova-cell1-novncproxy-0\" (UID: \"d0e5f35a-1622-4ec0-84e7-56af1f798978\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 17:11:13 crc kubenswrapper[5184]: I0312 17:11:13.379260 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 12 17:11:13 crc kubenswrapper[5184]: I0312 17:11:13.855241 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 12 17:11:13 crc kubenswrapper[5184]: W0312 17:11:13.865852 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0e5f35a_1622_4ec0_84e7_56af1f798978.slice/crio-d26e0e2eeeb1d2ecd4593a1a65e9c4d4fec0ca3062c408b1e1cdd958a7ae09fe WatchSource:0}: Error finding container d26e0e2eeeb1d2ecd4593a1a65e9c4d4fec0ca3062c408b1e1cdd958a7ae09fe: Status 404 returned error can't find the container with id d26e0e2eeeb1d2ecd4593a1a65e9c4d4fec0ca3062c408b1e1cdd958a7ae09fe Mar 12 17:11:13 crc kubenswrapper[5184]: I0312 17:11:13.995244 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d0e5f35a-1622-4ec0-84e7-56af1f798978","Type":"ContainerStarted","Data":"d26e0e2eeeb1d2ecd4593a1a65e9c4d4fec0ca3062c408b1e1cdd958a7ae09fe"} Mar 12 17:11:14 crc kubenswrapper[5184]: I0312 17:11:14.261101 5184 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["openstack/kube-state-metrics-0"] Mar 12 17:11:14 crc kubenswrapper[5184]: I0312 17:11:14.261669 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="37dd5ca0-dd94-458b-93c2-393f9c4db4b7" containerName="kube-state-metrics" containerID="cri-o://8c920ffc50cc10a5d5ef5722f55a325a8cdb6bda7229ef5be374ce4cbb95f882" gracePeriod=30 Mar 12 17:11:14 crc kubenswrapper[5184]: I0312 17:11:14.417145 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02994126-30bf-4b42-be17-a1fdb7ad571a" path="/var/lib/kubelet/pods/02994126-30bf-4b42-be17-a1fdb7ad571a/volumes" Mar 12 17:11:14 crc kubenswrapper[5184]: I0312 17:11:14.780233 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 12 17:11:14 crc kubenswrapper[5184]: I0312 17:11:14.904188 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28n4s\" (UniqueName: \"kubernetes.io/projected/37dd5ca0-dd94-458b-93c2-393f9c4db4b7-kube-api-access-28n4s\") pod \"37dd5ca0-dd94-458b-93c2-393f9c4db4b7\" (UID: \"37dd5ca0-dd94-458b-93c2-393f9c4db4b7\") " Mar 12 17:11:14 crc kubenswrapper[5184]: I0312 17:11:14.917624 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37dd5ca0-dd94-458b-93c2-393f9c4db4b7-kube-api-access-28n4s" (OuterVolumeSpecName: "kube-api-access-28n4s") pod "37dd5ca0-dd94-458b-93c2-393f9c4db4b7" (UID: "37dd5ca0-dd94-458b-93c2-393f9c4db4b7"). InnerVolumeSpecName "kube-api-access-28n4s". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:11:15 crc kubenswrapper[5184]: I0312 17:11:15.006055 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-28n4s\" (UniqueName: \"kubernetes.io/projected/37dd5ca0-dd94-458b-93c2-393f9c4db4b7-kube-api-access-28n4s\") on node \"crc\" DevicePath \"\"" Mar 12 17:11:15 crc kubenswrapper[5184]: I0312 17:11:15.010739 5184 generic.go:358] "Generic (PLEG): container finished" podID="37dd5ca0-dd94-458b-93c2-393f9c4db4b7" containerID="8c920ffc50cc10a5d5ef5722f55a325a8cdb6bda7229ef5be374ce4cbb95f882" exitCode=2 Mar 12 17:11:15 crc kubenswrapper[5184]: I0312 17:11:15.010818 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"37dd5ca0-dd94-458b-93c2-393f9c4db4b7","Type":"ContainerDied","Data":"8c920ffc50cc10a5d5ef5722f55a325a8cdb6bda7229ef5be374ce4cbb95f882"} Mar 12 17:11:15 crc kubenswrapper[5184]: I0312 17:11:15.010903 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 12 17:11:15 crc kubenswrapper[5184]: I0312 17:11:15.010921 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"37dd5ca0-dd94-458b-93c2-393f9c4db4b7","Type":"ContainerDied","Data":"2d2706396f0780be9dfe3bf59d0670ad011709be1e6f4de1b9254ef23bc223e5"} Mar 12 17:11:15 crc kubenswrapper[5184]: I0312 17:11:15.010962 5184 scope.go:117] "RemoveContainer" containerID="8c920ffc50cc10a5d5ef5722f55a325a8cdb6bda7229ef5be374ce4cbb95f882" Mar 12 17:11:15 crc kubenswrapper[5184]: I0312 17:11:15.015542 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d0e5f35a-1622-4ec0-84e7-56af1f798978","Type":"ContainerStarted","Data":"4a30c5015943bb220447b4531bab20cdaffe4a73240ee8d9b4369f063c3a60ca"} Mar 12 17:11:15 crc kubenswrapper[5184]: I0312 17:11:15.034990 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.03497068 podStartE2EDuration="2.03497068s" podCreationTimestamp="2026-03-12 17:11:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:11:15.028665503 +0000 UTC m=+1217.569976842" watchObservedRunningTime="2026-03-12 17:11:15.03497068 +0000 UTC m=+1217.576282019" Mar 12 17:11:15 crc kubenswrapper[5184]: I0312 17:11:15.053416 5184 scope.go:117] "RemoveContainer" containerID="8c920ffc50cc10a5d5ef5722f55a325a8cdb6bda7229ef5be374ce4cbb95f882" Mar 12 17:11:15 crc kubenswrapper[5184]: E0312 17:11:15.053977 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c920ffc50cc10a5d5ef5722f55a325a8cdb6bda7229ef5be374ce4cbb95f882\": container with ID starting with 8c920ffc50cc10a5d5ef5722f55a325a8cdb6bda7229ef5be374ce4cbb95f882 not found: ID does not exist" 
containerID="8c920ffc50cc10a5d5ef5722f55a325a8cdb6bda7229ef5be374ce4cbb95f882" Mar 12 17:11:15 crc kubenswrapper[5184]: I0312 17:11:15.054122 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c920ffc50cc10a5d5ef5722f55a325a8cdb6bda7229ef5be374ce4cbb95f882"} err="failed to get container status \"8c920ffc50cc10a5d5ef5722f55a325a8cdb6bda7229ef5be374ce4cbb95f882\": rpc error: code = NotFound desc = could not find container \"8c920ffc50cc10a5d5ef5722f55a325a8cdb6bda7229ef5be374ce4cbb95f882\": container with ID starting with 8c920ffc50cc10a5d5ef5722f55a325a8cdb6bda7229ef5be374ce4cbb95f882 not found: ID does not exist" Mar 12 17:11:15 crc kubenswrapper[5184]: I0312 17:11:15.063235 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 12 17:11:15 crc kubenswrapper[5184]: I0312 17:11:15.076668 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 12 17:11:15 crc kubenswrapper[5184]: I0312 17:11:15.094739 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 12 17:11:15 crc kubenswrapper[5184]: I0312 17:11:15.095868 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="37dd5ca0-dd94-458b-93c2-393f9c4db4b7" containerName="kube-state-metrics" Mar 12 17:11:15 crc kubenswrapper[5184]: I0312 17:11:15.095886 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="37dd5ca0-dd94-458b-93c2-393f9c4db4b7" containerName="kube-state-metrics" Mar 12 17:11:15 crc kubenswrapper[5184]: I0312 17:11:15.096074 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="37dd5ca0-dd94-458b-93c2-393f9c4db4b7" containerName="kube-state-metrics" Mar 12 17:11:15 crc kubenswrapper[5184]: I0312 17:11:15.104709 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 12 17:11:15 crc kubenswrapper[5184]: I0312 17:11:15.107519 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 12 17:11:15 crc kubenswrapper[5184]: I0312 17:11:15.108393 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-kube-state-metrics-svc\"" Mar 12 17:11:15 crc kubenswrapper[5184]: I0312 17:11:15.108602 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"kube-state-metrics-tls-config\"" Mar 12 17:11:15 crc kubenswrapper[5184]: I0312 17:11:15.109078 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/418fa381-87e7-47c8-9136-6c5d8d8028ce-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"418fa381-87e7-47c8-9136-6c5d8d8028ce\") " pod="openstack/kube-state-metrics-0" Mar 12 17:11:15 crc kubenswrapper[5184]: I0312 17:11:15.109184 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/418fa381-87e7-47c8-9136-6c5d8d8028ce-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"418fa381-87e7-47c8-9136-6c5d8d8028ce\") " pod="openstack/kube-state-metrics-0" Mar 12 17:11:15 crc kubenswrapper[5184]: I0312 17:11:15.109208 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trs96\" (UniqueName: \"kubernetes.io/projected/418fa381-87e7-47c8-9136-6c5d8d8028ce-kube-api-access-trs96\") pod \"kube-state-metrics-0\" (UID: \"418fa381-87e7-47c8-9136-6c5d8d8028ce\") " pod="openstack/kube-state-metrics-0" Mar 12 17:11:15 crc kubenswrapper[5184]: I0312 17:11:15.109269 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/418fa381-87e7-47c8-9136-6c5d8d8028ce-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"418fa381-87e7-47c8-9136-6c5d8d8028ce\") " pod="openstack/kube-state-metrics-0" Mar 12 17:11:15 crc kubenswrapper[5184]: I0312 17:11:15.210656 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/418fa381-87e7-47c8-9136-6c5d8d8028ce-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"418fa381-87e7-47c8-9136-6c5d8d8028ce\") " pod="openstack/kube-state-metrics-0" Mar 12 17:11:15 crc kubenswrapper[5184]: I0312 17:11:15.210708 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-trs96\" (UniqueName: \"kubernetes.io/projected/418fa381-87e7-47c8-9136-6c5d8d8028ce-kube-api-access-trs96\") pod \"kube-state-metrics-0\" (UID: \"418fa381-87e7-47c8-9136-6c5d8d8028ce\") " pod="openstack/kube-state-metrics-0" Mar 12 17:11:15 crc kubenswrapper[5184]: I0312 17:11:15.211057 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/418fa381-87e7-47c8-9136-6c5d8d8028ce-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"418fa381-87e7-47c8-9136-6c5d8d8028ce\") " pod="openstack/kube-state-metrics-0" Mar 12 17:11:15 crc kubenswrapper[5184]: I0312 17:11:15.211509 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/418fa381-87e7-47c8-9136-6c5d8d8028ce-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"418fa381-87e7-47c8-9136-6c5d8d8028ce\") " pod="openstack/kube-state-metrics-0" Mar 12 17:11:15 crc kubenswrapper[5184]: I0312 17:11:15.215954 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/418fa381-87e7-47c8-9136-6c5d8d8028ce-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"418fa381-87e7-47c8-9136-6c5d8d8028ce\") " pod="openstack/kube-state-metrics-0" Mar 12 17:11:15 crc kubenswrapper[5184]: I0312 17:11:15.216853 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/418fa381-87e7-47c8-9136-6c5d8d8028ce-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"418fa381-87e7-47c8-9136-6c5d8d8028ce\") " pod="openstack/kube-state-metrics-0" Mar 12 17:11:15 crc kubenswrapper[5184]: I0312 17:11:15.216974 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/418fa381-87e7-47c8-9136-6c5d8d8028ce-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"418fa381-87e7-47c8-9136-6c5d8d8028ce\") " pod="openstack/kube-state-metrics-0" Mar 12 17:11:15 crc kubenswrapper[5184]: I0312 17:11:15.229766 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-trs96\" (UniqueName: \"kubernetes.io/projected/418fa381-87e7-47c8-9136-6c5d8d8028ce-kube-api-access-trs96\") pod \"kube-state-metrics-0\" (UID: \"418fa381-87e7-47c8-9136-6c5d8d8028ce\") " pod="openstack/kube-state-metrics-0" Mar 12 17:11:15 crc kubenswrapper[5184]: I0312 17:11:15.428031 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 12 17:11:15 crc kubenswrapper[5184]: W0312 17:11:15.915465 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod418fa381_87e7_47c8_9136_6c5d8d8028ce.slice/crio-c9e3efb6ae41fd94a3f6895da659c6a714634f00ed019abf426dd0ad929686a4 WatchSource:0}: Error finding container c9e3efb6ae41fd94a3f6895da659c6a714634f00ed019abf426dd0ad929686a4: Status 404 returned error can't find the container with id c9e3efb6ae41fd94a3f6895da659c6a714634f00ed019abf426dd0ad929686a4 Mar 12 17:11:15 crc kubenswrapper[5184]: I0312 17:11:15.919735 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 12 17:11:15 crc kubenswrapper[5184]: I0312 17:11:15.976108 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 17:11:15 crc kubenswrapper[5184]: I0312 17:11:15.976476 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="12ba21e1-9b66-4713-9374-97ec0c9dd749" containerName="ceilometer-central-agent" containerID="cri-o://627a62c9e016b7d3032b119feaffb76b7523cdf6b46fcf7b023c8af26bdd3787" gracePeriod=30 Mar 12 17:11:15 crc kubenswrapper[5184]: I0312 17:11:15.976647 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="12ba21e1-9b66-4713-9374-97ec0c9dd749" containerName="proxy-httpd" containerID="cri-o://4e7b6885626b294ee38569a2d2efb980002fef1f150ce7488e61752ce28d2e6a" gracePeriod=30 Mar 12 17:11:15 crc kubenswrapper[5184]: I0312 17:11:15.976702 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="12ba21e1-9b66-4713-9374-97ec0c9dd749" containerName="sg-core" containerID="cri-o://2aaa4626a5ecff9c133f8b359189360dc610f2e41153643462abb5bc0e2411b4" gracePeriod=30 Mar 12 17:11:15 crc kubenswrapper[5184]: I0312 
17:11:15.976749 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="12ba21e1-9b66-4713-9374-97ec0c9dd749" containerName="ceilometer-notification-agent" containerID="cri-o://34247514e1a4e1de5dceca3077b32525b31839cd357cb12571bd1fcc7fc2bc56" gracePeriod=30 Mar 12 17:11:16 crc kubenswrapper[5184]: I0312 17:11:16.025185 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"418fa381-87e7-47c8-9136-6c5d8d8028ce","Type":"ContainerStarted","Data":"c9e3efb6ae41fd94a3f6895da659c6a714634f00ed019abf426dd0ad929686a4"} Mar 12 17:11:16 crc kubenswrapper[5184]: I0312 17:11:16.156796 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 12 17:11:16 crc kubenswrapper[5184]: I0312 17:11:16.157420 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-api-0" Mar 12 17:11:16 crc kubenswrapper[5184]: I0312 17:11:16.162717 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 12 17:11:16 crc kubenswrapper[5184]: I0312 17:11:16.162774 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 12 17:11:16 crc kubenswrapper[5184]: I0312 17:11:16.414885 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37dd5ca0-dd94-458b-93c2-393f9c4db4b7" path="/var/lib/kubelet/pods/37dd5ca0-dd94-458b-93c2-393f9c4db4b7/volumes" Mar 12 17:11:17 crc kubenswrapper[5184]: I0312 17:11:17.046739 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"418fa381-87e7-47c8-9136-6c5d8d8028ce","Type":"ContainerStarted","Data":"a8f9b5fc7feb406866711dcf494c064e505ea6197ead312107b20f635028b574"} Mar 12 17:11:17 crc kubenswrapper[5184]: I0312 17:11:17.046878 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openstack/kube-state-metrics-0" Mar 12 17:11:17 crc kubenswrapper[5184]: I0312 17:11:17.061060 5184 generic.go:358] "Generic (PLEG): container finished" podID="12ba21e1-9b66-4713-9374-97ec0c9dd749" containerID="4e7b6885626b294ee38569a2d2efb980002fef1f150ce7488e61752ce28d2e6a" exitCode=0 Mar 12 17:11:17 crc kubenswrapper[5184]: I0312 17:11:17.061095 5184 generic.go:358] "Generic (PLEG): container finished" podID="12ba21e1-9b66-4713-9374-97ec0c9dd749" containerID="2aaa4626a5ecff9c133f8b359189360dc610f2e41153643462abb5bc0e2411b4" exitCode=2 Mar 12 17:11:17 crc kubenswrapper[5184]: I0312 17:11:17.061105 5184 generic.go:358] "Generic (PLEG): container finished" podID="12ba21e1-9b66-4713-9374-97ec0c9dd749" containerID="627a62c9e016b7d3032b119feaffb76b7523cdf6b46fcf7b023c8af26bdd3787" exitCode=0 Mar 12 17:11:17 crc kubenswrapper[5184]: I0312 17:11:17.061242 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12ba21e1-9b66-4713-9374-97ec0c9dd749","Type":"ContainerDied","Data":"4e7b6885626b294ee38569a2d2efb980002fef1f150ce7488e61752ce28d2e6a"} Mar 12 17:11:17 crc kubenswrapper[5184]: I0312 17:11:17.061293 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12ba21e1-9b66-4713-9374-97ec0c9dd749","Type":"ContainerDied","Data":"2aaa4626a5ecff9c133f8b359189360dc610f2e41153643462abb5bc0e2411b4"} Mar 12 17:11:17 crc kubenswrapper[5184]: I0312 17:11:17.061318 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12ba21e1-9b66-4713-9374-97ec0c9dd749","Type":"ContainerDied","Data":"627a62c9e016b7d3032b119feaffb76b7523cdf6b46fcf7b023c8af26bdd3787"} Mar 12 17:11:17 crc kubenswrapper[5184]: I0312 17:11:17.061695 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-api-0" Mar 12 17:11:17 crc kubenswrapper[5184]: I0312 17:11:17.066278 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-api-0" Mar 12 17:11:17 crc kubenswrapper[5184]: I0312 17:11:17.084770 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.701603567 podStartE2EDuration="2.084747231s" podCreationTimestamp="2026-03-12 17:11:15 +0000 UTC" firstStartedPulling="2026-03-12 17:11:15.917772102 +0000 UTC m=+1218.459083441" lastFinishedPulling="2026-03-12 17:11:16.300915766 +0000 UTC m=+1218.842227105" observedRunningTime="2026-03-12 17:11:17.069834597 +0000 UTC m=+1219.611146036" watchObservedRunningTime="2026-03-12 17:11:17.084747231 +0000 UTC m=+1219.626058580" Mar 12 17:11:17 crc kubenswrapper[5184]: I0312 17:11:17.282542 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5598b9c58f-84z9t"] Mar 12 17:11:17 crc kubenswrapper[5184]: I0312 17:11:17.288163 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5598b9c58f-84z9t" Mar 12 17:11:17 crc kubenswrapper[5184]: I0312 17:11:17.291970 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5598b9c58f-84z9t"] Mar 12 17:11:17 crc kubenswrapper[5184]: I0312 17:11:17.358987 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ea315f0f-053b-44e2-b1ea-1058f7a51635-dns-swift-storage-0\") pod \"dnsmasq-dns-5598b9c58f-84z9t\" (UID: \"ea315f0f-053b-44e2-b1ea-1058f7a51635\") " pod="openstack/dnsmasq-dns-5598b9c58f-84z9t" Mar 12 17:11:17 crc kubenswrapper[5184]: I0312 17:11:17.359040 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea315f0f-053b-44e2-b1ea-1058f7a51635-ovsdbserver-sb\") pod \"dnsmasq-dns-5598b9c58f-84z9t\" (UID: \"ea315f0f-053b-44e2-b1ea-1058f7a51635\") " pod="openstack/dnsmasq-dns-5598b9c58f-84z9t" Mar 12 17:11:17 crc 
kubenswrapper[5184]: I0312 17:11:17.359110 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmvrr\" (UniqueName: \"kubernetes.io/projected/ea315f0f-053b-44e2-b1ea-1058f7a51635-kube-api-access-fmvrr\") pod \"dnsmasq-dns-5598b9c58f-84z9t\" (UID: \"ea315f0f-053b-44e2-b1ea-1058f7a51635\") " pod="openstack/dnsmasq-dns-5598b9c58f-84z9t" Mar 12 17:11:17 crc kubenswrapper[5184]: I0312 17:11:17.359990 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea315f0f-053b-44e2-b1ea-1058f7a51635-config\") pod \"dnsmasq-dns-5598b9c58f-84z9t\" (UID: \"ea315f0f-053b-44e2-b1ea-1058f7a51635\") " pod="openstack/dnsmasq-dns-5598b9c58f-84z9t" Mar 12 17:11:17 crc kubenswrapper[5184]: I0312 17:11:17.360021 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea315f0f-053b-44e2-b1ea-1058f7a51635-dns-svc\") pod \"dnsmasq-dns-5598b9c58f-84z9t\" (UID: \"ea315f0f-053b-44e2-b1ea-1058f7a51635\") " pod="openstack/dnsmasq-dns-5598b9c58f-84z9t" Mar 12 17:11:17 crc kubenswrapper[5184]: I0312 17:11:17.360095 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea315f0f-053b-44e2-b1ea-1058f7a51635-ovsdbserver-nb\") pod \"dnsmasq-dns-5598b9c58f-84z9t\" (UID: \"ea315f0f-053b-44e2-b1ea-1058f7a51635\") " pod="openstack/dnsmasq-dns-5598b9c58f-84z9t" Mar 12 17:11:17 crc kubenswrapper[5184]: I0312 17:11:17.462280 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea315f0f-053b-44e2-b1ea-1058f7a51635-ovsdbserver-nb\") pod \"dnsmasq-dns-5598b9c58f-84z9t\" (UID: \"ea315f0f-053b-44e2-b1ea-1058f7a51635\") " pod="openstack/dnsmasq-dns-5598b9c58f-84z9t" Mar 12 
17:11:17 crc kubenswrapper[5184]: I0312 17:11:17.462702 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ea315f0f-053b-44e2-b1ea-1058f7a51635-dns-swift-storage-0\") pod \"dnsmasq-dns-5598b9c58f-84z9t\" (UID: \"ea315f0f-053b-44e2-b1ea-1058f7a51635\") " pod="openstack/dnsmasq-dns-5598b9c58f-84z9t" Mar 12 17:11:17 crc kubenswrapper[5184]: I0312 17:11:17.462802 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea315f0f-053b-44e2-b1ea-1058f7a51635-ovsdbserver-sb\") pod \"dnsmasq-dns-5598b9c58f-84z9t\" (UID: \"ea315f0f-053b-44e2-b1ea-1058f7a51635\") " pod="openstack/dnsmasq-dns-5598b9c58f-84z9t" Mar 12 17:11:17 crc kubenswrapper[5184]: I0312 17:11:17.463254 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fmvrr\" (UniqueName: \"kubernetes.io/projected/ea315f0f-053b-44e2-b1ea-1058f7a51635-kube-api-access-fmvrr\") pod \"dnsmasq-dns-5598b9c58f-84z9t\" (UID: \"ea315f0f-053b-44e2-b1ea-1058f7a51635\") " pod="openstack/dnsmasq-dns-5598b9c58f-84z9t" Mar 12 17:11:17 crc kubenswrapper[5184]: I0312 17:11:17.463340 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea315f0f-053b-44e2-b1ea-1058f7a51635-config\") pod \"dnsmasq-dns-5598b9c58f-84z9t\" (UID: \"ea315f0f-053b-44e2-b1ea-1058f7a51635\") " pod="openstack/dnsmasq-dns-5598b9c58f-84z9t" Mar 12 17:11:17 crc kubenswrapper[5184]: I0312 17:11:17.463367 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea315f0f-053b-44e2-b1ea-1058f7a51635-dns-svc\") pod \"dnsmasq-dns-5598b9c58f-84z9t\" (UID: \"ea315f0f-053b-44e2-b1ea-1058f7a51635\") " pod="openstack/dnsmasq-dns-5598b9c58f-84z9t" Mar 12 17:11:17 crc kubenswrapper[5184]: I0312 17:11:17.464212 5184 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea315f0f-053b-44e2-b1ea-1058f7a51635-config\") pod \"dnsmasq-dns-5598b9c58f-84z9t\" (UID: \"ea315f0f-053b-44e2-b1ea-1058f7a51635\") " pod="openstack/dnsmasq-dns-5598b9c58f-84z9t" Mar 12 17:11:17 crc kubenswrapper[5184]: I0312 17:11:17.464908 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea315f0f-053b-44e2-b1ea-1058f7a51635-dns-svc\") pod \"dnsmasq-dns-5598b9c58f-84z9t\" (UID: \"ea315f0f-053b-44e2-b1ea-1058f7a51635\") " pod="openstack/dnsmasq-dns-5598b9c58f-84z9t" Mar 12 17:11:17 crc kubenswrapper[5184]: I0312 17:11:17.465991 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ea315f0f-053b-44e2-b1ea-1058f7a51635-dns-swift-storage-0\") pod \"dnsmasq-dns-5598b9c58f-84z9t\" (UID: \"ea315f0f-053b-44e2-b1ea-1058f7a51635\") " pod="openstack/dnsmasq-dns-5598b9c58f-84z9t" Mar 12 17:11:17 crc kubenswrapper[5184]: I0312 17:11:17.466362 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea315f0f-053b-44e2-b1ea-1058f7a51635-ovsdbserver-sb\") pod \"dnsmasq-dns-5598b9c58f-84z9t\" (UID: \"ea315f0f-053b-44e2-b1ea-1058f7a51635\") " pod="openstack/dnsmasq-dns-5598b9c58f-84z9t" Mar 12 17:11:17 crc kubenswrapper[5184]: I0312 17:11:17.466655 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea315f0f-053b-44e2-b1ea-1058f7a51635-ovsdbserver-nb\") pod \"dnsmasq-dns-5598b9c58f-84z9t\" (UID: \"ea315f0f-053b-44e2-b1ea-1058f7a51635\") " pod="openstack/dnsmasq-dns-5598b9c58f-84z9t" Mar 12 17:11:17 crc kubenswrapper[5184]: I0312 17:11:17.486138 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmvrr\" (UniqueName: 
\"kubernetes.io/projected/ea315f0f-053b-44e2-b1ea-1058f7a51635-kube-api-access-fmvrr\") pod \"dnsmasq-dns-5598b9c58f-84z9t\" (UID: \"ea315f0f-053b-44e2-b1ea-1058f7a51635\") " pod="openstack/dnsmasq-dns-5598b9c58f-84z9t" Mar 12 17:11:17 crc kubenswrapper[5184]: I0312 17:11:17.612484 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5598b9c58f-84z9t" Mar 12 17:11:18 crc kubenswrapper[5184]: I0312 17:11:18.054935 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5598b9c58f-84z9t"] Mar 12 17:11:18 crc kubenswrapper[5184]: W0312 17:11:18.060169 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea315f0f_053b_44e2_b1ea_1058f7a51635.slice/crio-f61836aff649b61a28e56a4e4d8e4f38866d5fc630f597a81beaeb9f078f9f81 WatchSource:0}: Error finding container f61836aff649b61a28e56a4e4d8e4f38866d5fc630f597a81beaeb9f078f9f81: Status 404 returned error can't find the container with id f61836aff649b61a28e56a4e4d8e4f38866d5fc630f597a81beaeb9f078f9f81 Mar 12 17:11:18 crc kubenswrapper[5184]: I0312 17:11:18.077108 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5598b9c58f-84z9t" event={"ID":"ea315f0f-053b-44e2-b1ea-1058f7a51635","Type":"ContainerStarted","Data":"f61836aff649b61a28e56a4e4d8e4f38866d5fc630f597a81beaeb9f078f9f81"} Mar 12 17:11:18 crc kubenswrapper[5184]: I0312 17:11:18.386384 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-cell1-novncproxy-0" Mar 12 17:11:18 crc kubenswrapper[5184]: I0312 17:11:18.665561 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 12 17:11:18 crc kubenswrapper[5184]: I0312 17:11:18.691151 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/12ba21e1-9b66-4713-9374-97ec0c9dd749-sg-core-conf-yaml\") pod \"12ba21e1-9b66-4713-9374-97ec0c9dd749\" (UID: \"12ba21e1-9b66-4713-9374-97ec0c9dd749\") " Mar 12 17:11:18 crc kubenswrapper[5184]: I0312 17:11:18.691195 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12ba21e1-9b66-4713-9374-97ec0c9dd749-scripts\") pod \"12ba21e1-9b66-4713-9374-97ec0c9dd749\" (UID: \"12ba21e1-9b66-4713-9374-97ec0c9dd749\") " Mar 12 17:11:18 crc kubenswrapper[5184]: I0312 17:11:18.691385 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12ba21e1-9b66-4713-9374-97ec0c9dd749-run-httpd\") pod \"12ba21e1-9b66-4713-9374-97ec0c9dd749\" (UID: \"12ba21e1-9b66-4713-9374-97ec0c9dd749\") " Mar 12 17:11:18 crc kubenswrapper[5184]: I0312 17:11:18.691427 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ba21e1-9b66-4713-9374-97ec0c9dd749-config-data\") pod \"12ba21e1-9b66-4713-9374-97ec0c9dd749\" (UID: \"12ba21e1-9b66-4713-9374-97ec0c9dd749\") " Mar 12 17:11:18 crc kubenswrapper[5184]: I0312 17:11:18.691514 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlfvg\" (UniqueName: \"kubernetes.io/projected/12ba21e1-9b66-4713-9374-97ec0c9dd749-kube-api-access-xlfvg\") pod \"12ba21e1-9b66-4713-9374-97ec0c9dd749\" (UID: \"12ba21e1-9b66-4713-9374-97ec0c9dd749\") " Mar 12 17:11:18 crc kubenswrapper[5184]: I0312 17:11:18.691615 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/12ba21e1-9b66-4713-9374-97ec0c9dd749-combined-ca-bundle\") pod \"12ba21e1-9b66-4713-9374-97ec0c9dd749\" (UID: \"12ba21e1-9b66-4713-9374-97ec0c9dd749\") " Mar 12 17:11:18 crc kubenswrapper[5184]: I0312 17:11:18.691710 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12ba21e1-9b66-4713-9374-97ec0c9dd749-log-httpd\") pod \"12ba21e1-9b66-4713-9374-97ec0c9dd749\" (UID: \"12ba21e1-9b66-4713-9374-97ec0c9dd749\") " Mar 12 17:11:18 crc kubenswrapper[5184]: I0312 17:11:18.693549 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12ba21e1-9b66-4713-9374-97ec0c9dd749-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "12ba21e1-9b66-4713-9374-97ec0c9dd749" (UID: "12ba21e1-9b66-4713-9374-97ec0c9dd749"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:11:18 crc kubenswrapper[5184]: I0312 17:11:18.698351 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12ba21e1-9b66-4713-9374-97ec0c9dd749-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "12ba21e1-9b66-4713-9374-97ec0c9dd749" (UID: "12ba21e1-9b66-4713-9374-97ec0c9dd749"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:11:18 crc kubenswrapper[5184]: I0312 17:11:18.716071 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12ba21e1-9b66-4713-9374-97ec0c9dd749-scripts" (OuterVolumeSpecName: "scripts") pod "12ba21e1-9b66-4713-9374-97ec0c9dd749" (UID: "12ba21e1-9b66-4713-9374-97ec0c9dd749"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:11:18 crc kubenswrapper[5184]: I0312 17:11:18.732987 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12ba21e1-9b66-4713-9374-97ec0c9dd749-kube-api-access-xlfvg" (OuterVolumeSpecName: "kube-api-access-xlfvg") pod "12ba21e1-9b66-4713-9374-97ec0c9dd749" (UID: "12ba21e1-9b66-4713-9374-97ec0c9dd749"). InnerVolumeSpecName "kube-api-access-xlfvg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:11:18 crc kubenswrapper[5184]: I0312 17:11:18.743697 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12ba21e1-9b66-4713-9374-97ec0c9dd749-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "12ba21e1-9b66-4713-9374-97ec0c9dd749" (UID: "12ba21e1-9b66-4713-9374-97ec0c9dd749"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:11:18 crc kubenswrapper[5184]: I0312 17:11:18.794169 5184 reconciler_common.go:299] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12ba21e1-9b66-4713-9374-97ec0c9dd749-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 12 17:11:18 crc kubenswrapper[5184]: I0312 17:11:18.794197 5184 reconciler_common.go:299] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/12ba21e1-9b66-4713-9374-97ec0c9dd749-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 12 17:11:18 crc kubenswrapper[5184]: I0312 17:11:18.794207 5184 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12ba21e1-9b66-4713-9374-97ec0c9dd749-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 17:11:18 crc kubenswrapper[5184]: I0312 17:11:18.794217 5184 reconciler_common.go:299] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12ba21e1-9b66-4713-9374-97ec0c9dd749-run-httpd\") on node \"crc\" DevicePath \"\"" 
Mar 12 17:11:18 crc kubenswrapper[5184]: I0312 17:11:18.794225 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xlfvg\" (UniqueName: \"kubernetes.io/projected/12ba21e1-9b66-4713-9374-97ec0c9dd749-kube-api-access-xlfvg\") on node \"crc\" DevicePath \"\"" Mar 12 17:11:18 crc kubenswrapper[5184]: I0312 17:11:18.829567 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12ba21e1-9b66-4713-9374-97ec0c9dd749-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12ba21e1-9b66-4713-9374-97ec0c9dd749" (UID: "12ba21e1-9b66-4713-9374-97ec0c9dd749"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:11:18 crc kubenswrapper[5184]: I0312 17:11:18.831935 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12ba21e1-9b66-4713-9374-97ec0c9dd749-config-data" (OuterVolumeSpecName: "config-data") pod "12ba21e1-9b66-4713-9374-97ec0c9dd749" (UID: "12ba21e1-9b66-4713-9374-97ec0c9dd749"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:11:18 crc kubenswrapper[5184]: I0312 17:11:18.896203 5184 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ba21e1-9b66-4713-9374-97ec0c9dd749-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 17:11:18 crc kubenswrapper[5184]: I0312 17:11:18.896536 5184 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ba21e1-9b66-4713-9374-97ec0c9dd749-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.088014 5184 generic.go:358] "Generic (PLEG): container finished" podID="12ba21e1-9b66-4713-9374-97ec0c9dd749" containerID="34247514e1a4e1de5dceca3077b32525b31839cd357cb12571bd1fcc7fc2bc56" exitCode=0 Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.088338 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12ba21e1-9b66-4713-9374-97ec0c9dd749","Type":"ContainerDied","Data":"34247514e1a4e1de5dceca3077b32525b31839cd357cb12571bd1fcc7fc2bc56"} Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.088365 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12ba21e1-9b66-4713-9374-97ec0c9dd749","Type":"ContainerDied","Data":"29386866818a11ade90cd289c89106ea472473d8b2b137159a780ed4181e27b1"} Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.088399 5184 scope.go:117] "RemoveContainer" containerID="4e7b6885626b294ee38569a2d2efb980002fef1f150ce7488e61752ce28d2e6a" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.088545 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.091801 5184 generic.go:358] "Generic (PLEG): container finished" podID="ea315f0f-053b-44e2-b1ea-1058f7a51635" containerID="6851dae20c684086aaf272cf59a9d4fb940f217552dadcac0cd29575e2e6a5da" exitCode=0 Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.091896 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5598b9c58f-84z9t" event={"ID":"ea315f0f-053b-44e2-b1ea-1058f7a51635","Type":"ContainerDied","Data":"6851dae20c684086aaf272cf59a9d4fb940f217552dadcac0cd29575e2e6a5da"} Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.126355 5184 scope.go:117] "RemoveContainer" containerID="2aaa4626a5ecff9c133f8b359189360dc610f2e41153643462abb5bc0e2411b4" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.148762 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.158791 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.165299 5184 scope.go:117] "RemoveContainer" containerID="34247514e1a4e1de5dceca3077b32525b31839cd357cb12571bd1fcc7fc2bc56" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.182568 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.184542 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12ba21e1-9b66-4713-9374-97ec0c9dd749" containerName="sg-core" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.184569 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="12ba21e1-9b66-4713-9374-97ec0c9dd749" containerName="sg-core" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.184581 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="12ba21e1-9b66-4713-9374-97ec0c9dd749" containerName="ceilometer-notification-agent" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.184589 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="12ba21e1-9b66-4713-9374-97ec0c9dd749" containerName="ceilometer-notification-agent" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.184616 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12ba21e1-9b66-4713-9374-97ec0c9dd749" containerName="ceilometer-central-agent" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.184623 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="12ba21e1-9b66-4713-9374-97ec0c9dd749" containerName="ceilometer-central-agent" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.184632 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12ba21e1-9b66-4713-9374-97ec0c9dd749" containerName="proxy-httpd" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.184639 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="12ba21e1-9b66-4713-9374-97ec0c9dd749" containerName="proxy-httpd" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.184843 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="12ba21e1-9b66-4713-9374-97ec0c9dd749" containerName="ceilometer-central-agent" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.184859 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="12ba21e1-9b66-4713-9374-97ec0c9dd749" containerName="ceilometer-notification-agent" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.184866 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="12ba21e1-9b66-4713-9374-97ec0c9dd749" containerName="proxy-httpd" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.184876 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="12ba21e1-9b66-4713-9374-97ec0c9dd749" containerName="sg-core" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 
17:11:19.198643 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.198822 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.200945 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ceilometer-scripts\"" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.201134 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ceilometer-config-data\"" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.201319 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-ceilometer-internal-svc\"" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.224660 5184 scope.go:117] "RemoveContainer" containerID="627a62c9e016b7d3032b119feaffb76b7523cdf6b46fcf7b023c8af26bdd3787" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.246044 5184 scope.go:117] "RemoveContainer" containerID="4e7b6885626b294ee38569a2d2efb980002fef1f150ce7488e61752ce28d2e6a" Mar 12 17:11:19 crc kubenswrapper[5184]: E0312 17:11:19.246524 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e7b6885626b294ee38569a2d2efb980002fef1f150ce7488e61752ce28d2e6a\": container with ID starting with 4e7b6885626b294ee38569a2d2efb980002fef1f150ce7488e61752ce28d2e6a not found: ID does not exist" containerID="4e7b6885626b294ee38569a2d2efb980002fef1f150ce7488e61752ce28d2e6a" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.246564 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e7b6885626b294ee38569a2d2efb980002fef1f150ce7488e61752ce28d2e6a"} err="failed to get container status \"4e7b6885626b294ee38569a2d2efb980002fef1f150ce7488e61752ce28d2e6a\": rpc error: code = 
NotFound desc = could not find container \"4e7b6885626b294ee38569a2d2efb980002fef1f150ce7488e61752ce28d2e6a\": container with ID starting with 4e7b6885626b294ee38569a2d2efb980002fef1f150ce7488e61752ce28d2e6a not found: ID does not exist" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.246590 5184 scope.go:117] "RemoveContainer" containerID="2aaa4626a5ecff9c133f8b359189360dc610f2e41153643462abb5bc0e2411b4" Mar 12 17:11:19 crc kubenswrapper[5184]: E0312 17:11:19.246961 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2aaa4626a5ecff9c133f8b359189360dc610f2e41153643462abb5bc0e2411b4\": container with ID starting with 2aaa4626a5ecff9c133f8b359189360dc610f2e41153643462abb5bc0e2411b4 not found: ID does not exist" containerID="2aaa4626a5ecff9c133f8b359189360dc610f2e41153643462abb5bc0e2411b4" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.246983 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2aaa4626a5ecff9c133f8b359189360dc610f2e41153643462abb5bc0e2411b4"} err="failed to get container status \"2aaa4626a5ecff9c133f8b359189360dc610f2e41153643462abb5bc0e2411b4\": rpc error: code = NotFound desc = could not find container \"2aaa4626a5ecff9c133f8b359189360dc610f2e41153643462abb5bc0e2411b4\": container with ID starting with 2aaa4626a5ecff9c133f8b359189360dc610f2e41153643462abb5bc0e2411b4 not found: ID does not exist" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.247029 5184 scope.go:117] "RemoveContainer" containerID="34247514e1a4e1de5dceca3077b32525b31839cd357cb12571bd1fcc7fc2bc56" Mar 12 17:11:19 crc kubenswrapper[5184]: E0312 17:11:19.247275 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34247514e1a4e1de5dceca3077b32525b31839cd357cb12571bd1fcc7fc2bc56\": container with ID starting with 
34247514e1a4e1de5dceca3077b32525b31839cd357cb12571bd1fcc7fc2bc56 not found: ID does not exist" containerID="34247514e1a4e1de5dceca3077b32525b31839cd357cb12571bd1fcc7fc2bc56" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.247309 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34247514e1a4e1de5dceca3077b32525b31839cd357cb12571bd1fcc7fc2bc56"} err="failed to get container status \"34247514e1a4e1de5dceca3077b32525b31839cd357cb12571bd1fcc7fc2bc56\": rpc error: code = NotFound desc = could not find container \"34247514e1a4e1de5dceca3077b32525b31839cd357cb12571bd1fcc7fc2bc56\": container with ID starting with 34247514e1a4e1de5dceca3077b32525b31839cd357cb12571bd1fcc7fc2bc56 not found: ID does not exist" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.247330 5184 scope.go:117] "RemoveContainer" containerID="627a62c9e016b7d3032b119feaffb76b7523cdf6b46fcf7b023c8af26bdd3787" Mar 12 17:11:19 crc kubenswrapper[5184]: E0312 17:11:19.247554 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"627a62c9e016b7d3032b119feaffb76b7523cdf6b46fcf7b023c8af26bdd3787\": container with ID starting with 627a62c9e016b7d3032b119feaffb76b7523cdf6b46fcf7b023c8af26bdd3787 not found: ID does not exist" containerID="627a62c9e016b7d3032b119feaffb76b7523cdf6b46fcf7b023c8af26bdd3787" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.247578 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"627a62c9e016b7d3032b119feaffb76b7523cdf6b46fcf7b023c8af26bdd3787"} err="failed to get container status \"627a62c9e016b7d3032b119feaffb76b7523cdf6b46fcf7b023c8af26bdd3787\": rpc error: code = NotFound desc = could not find container \"627a62c9e016b7d3032b119feaffb76b7523cdf6b46fcf7b023c8af26bdd3787\": container with ID starting with 627a62c9e016b7d3032b119feaffb76b7523cdf6b46fcf7b023c8af26bdd3787 not found: ID does not 
exist" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.304142 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ac21ef93-e11b-4523-bc45-ca6b734b3a8d\") " pod="openstack/ceilometer-0" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.304257 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-config-data\") pod \"ceilometer-0\" (UID: \"ac21ef93-e11b-4523-bc45-ca6b734b3a8d\") " pod="openstack/ceilometer-0" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.304492 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-run-httpd\") pod \"ceilometer-0\" (UID: \"ac21ef93-e11b-4523-bc45-ca6b734b3a8d\") " pod="openstack/ceilometer-0" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.304618 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ac21ef93-e11b-4523-bc45-ca6b734b3a8d\") " pod="openstack/ceilometer-0" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.304693 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-log-httpd\") pod \"ceilometer-0\" (UID: \"ac21ef93-e11b-4523-bc45-ca6b734b3a8d\") " pod="openstack/ceilometer-0" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.304721 5184 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-scripts\") pod \"ceilometer-0\" (UID: \"ac21ef93-e11b-4523-bc45-ca6b734b3a8d\") " pod="openstack/ceilometer-0" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.304813 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r55k\" (UniqueName: \"kubernetes.io/projected/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-kube-api-access-5r55k\") pod \"ceilometer-0\" (UID: \"ac21ef93-e11b-4523-bc45-ca6b734b3a8d\") " pod="openstack/ceilometer-0" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.304988 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ac21ef93-e11b-4523-bc45-ca6b734b3a8d\") " pod="openstack/ceilometer-0" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.406623 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ac21ef93-e11b-4523-bc45-ca6b734b3a8d\") " pod="openstack/ceilometer-0" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.406699 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-config-data\") pod \"ceilometer-0\" (UID: \"ac21ef93-e11b-4523-bc45-ca6b734b3a8d\") " pod="openstack/ceilometer-0" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.407796 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-run-httpd\") pod 
\"ceilometer-0\" (UID: \"ac21ef93-e11b-4523-bc45-ca6b734b3a8d\") " pod="openstack/ceilometer-0" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.407975 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ac21ef93-e11b-4523-bc45-ca6b734b3a8d\") " pod="openstack/ceilometer-0" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.408145 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-log-httpd\") pod \"ceilometer-0\" (UID: \"ac21ef93-e11b-4523-bc45-ca6b734b3a8d\") " pod="openstack/ceilometer-0" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.408215 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-run-httpd\") pod \"ceilometer-0\" (UID: \"ac21ef93-e11b-4523-bc45-ca6b734b3a8d\") " pod="openstack/ceilometer-0" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.408237 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-scripts\") pod \"ceilometer-0\" (UID: \"ac21ef93-e11b-4523-bc45-ca6b734b3a8d\") " pod="openstack/ceilometer-0" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.408410 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5r55k\" (UniqueName: \"kubernetes.io/projected/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-kube-api-access-5r55k\") pod \"ceilometer-0\" (UID: \"ac21ef93-e11b-4523-bc45-ca6b734b3a8d\") " pod="openstack/ceilometer-0" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.408491 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-log-httpd\") pod \"ceilometer-0\" (UID: \"ac21ef93-e11b-4523-bc45-ca6b734b3a8d\") " pod="openstack/ceilometer-0" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.408592 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ac21ef93-e11b-4523-bc45-ca6b734b3a8d\") " pod="openstack/ceilometer-0" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.412717 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-scripts\") pod \"ceilometer-0\" (UID: \"ac21ef93-e11b-4523-bc45-ca6b734b3a8d\") " pod="openstack/ceilometer-0" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.413153 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ac21ef93-e11b-4523-bc45-ca6b734b3a8d\") " pod="openstack/ceilometer-0" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.413286 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ac21ef93-e11b-4523-bc45-ca6b734b3a8d\") " pod="openstack/ceilometer-0" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.413912 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-config-data\") pod \"ceilometer-0\" (UID: \"ac21ef93-e11b-4523-bc45-ca6b734b3a8d\") " pod="openstack/ceilometer-0" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 
17:11:19.414024 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ac21ef93-e11b-4523-bc45-ca6b734b3a8d\") " pod="openstack/ceilometer-0" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.439063 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r55k\" (UniqueName: \"kubernetes.io/projected/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-kube-api-access-5r55k\") pod \"ceilometer-0\" (UID: \"ac21ef93-e11b-4523-bc45-ca6b734b3a8d\") " pod="openstack/ceilometer-0" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.517529 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 17:11:19 crc kubenswrapper[5184]: I0312 17:11:19.777628 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 12 17:11:20 crc kubenswrapper[5184]: I0312 17:11:20.025643 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 17:11:20 crc kubenswrapper[5184]: I0312 17:11:20.054442 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 17:11:20 crc kubenswrapper[5184]: W0312 17:11:20.057772 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac21ef93_e11b_4523_bc45_ca6b734b3a8d.slice/crio-fb68dfe730a64af5736d28b624b5846ae3096e520e0b4e84d5a87c3bc7a96011 WatchSource:0}: Error finding container fb68dfe730a64af5736d28b624b5846ae3096e520e0b4e84d5a87c3bc7a96011: Status 404 returned error can't find the container with id fb68dfe730a64af5736d28b624b5846ae3096e520e0b4e84d5a87c3bc7a96011 Mar 12 17:11:20 crc kubenswrapper[5184]: I0312 17:11:20.104885 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5598b9c58f-84z9t" 
event={"ID":"ea315f0f-053b-44e2-b1ea-1058f7a51635","Type":"ContainerStarted","Data":"1e112018ab9300039cf4b79e01b3a014ba6715494ee4cff6548649db87d7c596"} Mar 12 17:11:20 crc kubenswrapper[5184]: I0312 17:11:20.105067 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/dnsmasq-dns-5598b9c58f-84z9t" Mar 12 17:11:20 crc kubenswrapper[5184]: I0312 17:11:20.106555 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ac21ef93-e11b-4523-bc45-ca6b734b3a8d","Type":"ContainerStarted","Data":"fb68dfe730a64af5736d28b624b5846ae3096e520e0b4e84d5a87c3bc7a96011"} Mar 12 17:11:20 crc kubenswrapper[5184]: I0312 17:11:20.109442 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c285c573-43cc-4990-86ba-9b28a19e98aa" containerName="nova-api-log" containerID="cri-o://ba46a47d9b20277c73a48b1fa3204af8ff958f85a4c540fda116ccf9fddb2910" gracePeriod=30 Mar 12 17:11:20 crc kubenswrapper[5184]: I0312 17:11:20.109451 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c285c573-43cc-4990-86ba-9b28a19e98aa" containerName="nova-api-api" containerID="cri-o://18b498a51d3670f6dcc5eebb73cd3de5b672b0c75c6816e32a7733dc199599d0" gracePeriod=30 Mar 12 17:11:20 crc kubenswrapper[5184]: I0312 17:11:20.130014 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5598b9c58f-84z9t" podStartSLOduration=3.129987137 podStartE2EDuration="3.129987137s" podCreationTimestamp="2026-03-12 17:11:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:11:20.122798954 +0000 UTC m=+1222.664110313" watchObservedRunningTime="2026-03-12 17:11:20.129987137 +0000 UTC m=+1222.671298476" Mar 12 17:11:20 crc kubenswrapper[5184]: I0312 17:11:20.413809 5184 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="12ba21e1-9b66-4713-9374-97ec0c9dd749" path="/var/lib/kubelet/pods/12ba21e1-9b66-4713-9374-97ec0c9dd749/volumes" Mar 12 17:11:21 crc kubenswrapper[5184]: I0312 17:11:21.119479 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ac21ef93-e11b-4523-bc45-ca6b734b3a8d","Type":"ContainerStarted","Data":"ddaf4697111ce1cc4cb4051d8a0af0b01d1b66b76a5937bc3697bb5a0a7be36e"} Mar 12 17:11:21 crc kubenswrapper[5184]: I0312 17:11:21.121608 5184 generic.go:358] "Generic (PLEG): container finished" podID="c285c573-43cc-4990-86ba-9b28a19e98aa" containerID="ba46a47d9b20277c73a48b1fa3204af8ff958f85a4c540fda116ccf9fddb2910" exitCode=143 Mar 12 17:11:21 crc kubenswrapper[5184]: I0312 17:11:21.121655 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c285c573-43cc-4990-86ba-9b28a19e98aa","Type":"ContainerDied","Data":"ba46a47d9b20277c73a48b1fa3204af8ff958f85a4c540fda116ccf9fddb2910"} Mar 12 17:11:22 crc kubenswrapper[5184]: I0312 17:11:22.134425 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ac21ef93-e11b-4523-bc45-ca6b734b3a8d","Type":"ContainerStarted","Data":"496b71038c00ee771db0a7a91843075dc55c91e395a3eedd946e133d09b8b87e"} Mar 12 17:11:23 crc kubenswrapper[5184]: I0312 17:11:23.149477 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ac21ef93-e11b-4523-bc45-ca6b734b3a8d","Type":"ContainerStarted","Data":"f2acdcc2c9a23af37b52dceeeb981a9988991051606ce8bada0dbc9f8a458c15"} Mar 12 17:11:23 crc kubenswrapper[5184]: I0312 17:11:23.380755 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 12 17:11:23 crc kubenswrapper[5184]: I0312 17:11:23.412042 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 12 17:11:23 crc kubenswrapper[5184]: I0312 
17:11:23.745075 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 12 17:11:23 crc kubenswrapper[5184]: I0312 17:11:23.798503 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmbst\" (UniqueName: \"kubernetes.io/projected/c285c573-43cc-4990-86ba-9b28a19e98aa-kube-api-access-xmbst\") pod \"c285c573-43cc-4990-86ba-9b28a19e98aa\" (UID: \"c285c573-43cc-4990-86ba-9b28a19e98aa\") " Mar 12 17:11:23 crc kubenswrapper[5184]: I0312 17:11:23.798815 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c285c573-43cc-4990-86ba-9b28a19e98aa-combined-ca-bundle\") pod \"c285c573-43cc-4990-86ba-9b28a19e98aa\" (UID: \"c285c573-43cc-4990-86ba-9b28a19e98aa\") " Mar 12 17:11:23 crc kubenswrapper[5184]: I0312 17:11:23.798915 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c285c573-43cc-4990-86ba-9b28a19e98aa-logs\") pod \"c285c573-43cc-4990-86ba-9b28a19e98aa\" (UID: \"c285c573-43cc-4990-86ba-9b28a19e98aa\") " Mar 12 17:11:23 crc kubenswrapper[5184]: I0312 17:11:23.798960 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c285c573-43cc-4990-86ba-9b28a19e98aa-config-data\") pod \"c285c573-43cc-4990-86ba-9b28a19e98aa\" (UID: \"c285c573-43cc-4990-86ba-9b28a19e98aa\") " Mar 12 17:11:23 crc kubenswrapper[5184]: I0312 17:11:23.806888 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c285c573-43cc-4990-86ba-9b28a19e98aa-logs" (OuterVolumeSpecName: "logs") pod "c285c573-43cc-4990-86ba-9b28a19e98aa" (UID: "c285c573-43cc-4990-86ba-9b28a19e98aa"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:11:23 crc kubenswrapper[5184]: I0312 17:11:23.831453 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c285c573-43cc-4990-86ba-9b28a19e98aa-kube-api-access-xmbst" (OuterVolumeSpecName: "kube-api-access-xmbst") pod "c285c573-43cc-4990-86ba-9b28a19e98aa" (UID: "c285c573-43cc-4990-86ba-9b28a19e98aa"). InnerVolumeSpecName "kube-api-access-xmbst". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:11:23 crc kubenswrapper[5184]: I0312 17:11:23.901152 5184 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c285c573-43cc-4990-86ba-9b28a19e98aa-logs\") on node \"crc\" DevicePath \"\"" Mar 12 17:11:23 crc kubenswrapper[5184]: I0312 17:11:23.901183 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xmbst\" (UniqueName: \"kubernetes.io/projected/c285c573-43cc-4990-86ba-9b28a19e98aa-kube-api-access-xmbst\") on node \"crc\" DevicePath \"\"" Mar 12 17:11:23 crc kubenswrapper[5184]: I0312 17:11:23.913293 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c285c573-43cc-4990-86ba-9b28a19e98aa-config-data" (OuterVolumeSpecName: "config-data") pod "c285c573-43cc-4990-86ba-9b28a19e98aa" (UID: "c285c573-43cc-4990-86ba-9b28a19e98aa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:23.999642 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c285c573-43cc-4990-86ba-9b28a19e98aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c285c573-43cc-4990-86ba-9b28a19e98aa" (UID: "c285c573-43cc-4990-86ba-9b28a19e98aa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.002689 5184 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c285c573-43cc-4990-86ba-9b28a19e98aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.002723 5184 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c285c573-43cc-4990-86ba-9b28a19e98aa-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.159325 5184 generic.go:358] "Generic (PLEG): container finished" podID="c285c573-43cc-4990-86ba-9b28a19e98aa" containerID="18b498a51d3670f6dcc5eebb73cd3de5b672b0c75c6816e32a7733dc199599d0" exitCode=0 Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.159418 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c285c573-43cc-4990-86ba-9b28a19e98aa","Type":"ContainerDied","Data":"18b498a51d3670f6dcc5eebb73cd3de5b672b0c75c6816e32a7733dc199599d0"} Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.159756 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c285c573-43cc-4990-86ba-9b28a19e98aa","Type":"ContainerDied","Data":"12a9df724d91fa57eb7e2d40cb1afd793a3962d25ced14bc8289665090066586"} Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.159780 5184 scope.go:117] "RemoveContainer" containerID="18b498a51d3670f6dcc5eebb73cd3de5b672b0c75c6816e32a7733dc199599d0" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.159456 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.189508 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.193803 5184 scope.go:117] "RemoveContainer" containerID="ba46a47d9b20277c73a48b1fa3204af8ff958f85a4c540fda116ccf9fddb2910" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.215649 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.236302 5184 scope.go:117] "RemoveContainer" containerID="18b498a51d3670f6dcc5eebb73cd3de5b672b0c75c6816e32a7733dc199599d0" Mar 12 17:11:24 crc kubenswrapper[5184]: E0312 17:11:24.236771 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18b498a51d3670f6dcc5eebb73cd3de5b672b0c75c6816e32a7733dc199599d0\": container with ID starting with 18b498a51d3670f6dcc5eebb73cd3de5b672b0c75c6816e32a7733dc199599d0 not found: ID does not exist" containerID="18b498a51d3670f6dcc5eebb73cd3de5b672b0c75c6816e32a7733dc199599d0" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.236811 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18b498a51d3670f6dcc5eebb73cd3de5b672b0c75c6816e32a7733dc199599d0"} err="failed to get container status \"18b498a51d3670f6dcc5eebb73cd3de5b672b0c75c6816e32a7733dc199599d0\": rpc error: code = NotFound desc = could not find container \"18b498a51d3670f6dcc5eebb73cd3de5b672b0c75c6816e32a7733dc199599d0\": container with ID starting with 18b498a51d3670f6dcc5eebb73cd3de5b672b0c75c6816e32a7733dc199599d0 not found: ID does not exist" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.236834 5184 scope.go:117] "RemoveContainer" containerID="ba46a47d9b20277c73a48b1fa3204af8ff958f85a4c540fda116ccf9fddb2910" Mar 12 17:11:24 crc 
kubenswrapper[5184]: E0312 17:11:24.237064 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba46a47d9b20277c73a48b1fa3204af8ff958f85a4c540fda116ccf9fddb2910\": container with ID starting with ba46a47d9b20277c73a48b1fa3204af8ff958f85a4c540fda116ccf9fddb2910 not found: ID does not exist" containerID="ba46a47d9b20277c73a48b1fa3204af8ff958f85a4c540fda116ccf9fddb2910" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.237085 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba46a47d9b20277c73a48b1fa3204af8ff958f85a4c540fda116ccf9fddb2910"} err="failed to get container status \"ba46a47d9b20277c73a48b1fa3204af8ff958f85a4c540fda116ccf9fddb2910\": rpc error: code = NotFound desc = could not find container \"ba46a47d9b20277c73a48b1fa3204af8ff958f85a4c540fda116ccf9fddb2910\": container with ID starting with ba46a47d9b20277c73a48b1fa3204af8ff958f85a4c540fda116ccf9fddb2910 not found: ID does not exist" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.240195 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.298752 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.299756 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c285c573-43cc-4990-86ba-9b28a19e98aa" containerName="nova-api-api" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.299775 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="c285c573-43cc-4990-86ba-9b28a19e98aa" containerName="nova-api-api" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.299795 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c285c573-43cc-4990-86ba-9b28a19e98aa" containerName="nova-api-log" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 
17:11:24.299801 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="c285c573-43cc-4990-86ba-9b28a19e98aa" containerName="nova-api-log" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.299980 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="c285c573-43cc-4990-86ba-9b28a19e98aa" containerName="nova-api-log" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.300008 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="c285c573-43cc-4990-86ba-9b28a19e98aa" containerName="nova-api-api" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.305841 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.306865 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.309873 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-nova-internal-svc\"" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.310037 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-nova-public-svc\"" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.310557 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-api-config-data\"" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.416000 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9pzj\" (UniqueName: \"kubernetes.io/projected/75715e9f-e72b-4b1c-9b69-870f1766ba64-kube-api-access-f9pzj\") pod \"nova-api-0\" (UID: \"75715e9f-e72b-4b1c-9b69-870f1766ba64\") " pod="openstack/nova-api-0" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.416559 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/75715e9f-e72b-4b1c-9b69-870f1766ba64-public-tls-certs\") pod \"nova-api-0\" (UID: \"75715e9f-e72b-4b1c-9b69-870f1766ba64\") " pod="openstack/nova-api-0" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.416675 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75715e9f-e72b-4b1c-9b69-870f1766ba64-logs\") pod \"nova-api-0\" (UID: \"75715e9f-e72b-4b1c-9b69-870f1766ba64\") " pod="openstack/nova-api-0" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.416856 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75715e9f-e72b-4b1c-9b69-870f1766ba64-config-data\") pod \"nova-api-0\" (UID: \"75715e9f-e72b-4b1c-9b69-870f1766ba64\") " pod="openstack/nova-api-0" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.416879 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75715e9f-e72b-4b1c-9b69-870f1766ba64-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"75715e9f-e72b-4b1c-9b69-870f1766ba64\") " pod="openstack/nova-api-0" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.416919 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/75715e9f-e72b-4b1c-9b69-870f1766ba64-internal-tls-certs\") pod \"nova-api-0\" (UID: \"75715e9f-e72b-4b1c-9b69-870f1766ba64\") " pod="openstack/nova-api-0" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.420310 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c285c573-43cc-4990-86ba-9b28a19e98aa" path="/var/lib/kubelet/pods/c285c573-43cc-4990-86ba-9b28a19e98aa/volumes" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.421192 5184 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["openstack/nova-cell1-cell-mapping-t8d4k"] Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.429702 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-t8d4k"] Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.429802 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-t8d4k" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.433986 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell1-manage-config-data\"" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.433991 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-cell1-manage-scripts\"" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.519095 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf7acbc2-7b1e-4156-b9d8-c4172585c2e1-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-t8d4k\" (UID: \"cf7acbc2-7b1e-4156-b9d8-c4172585c2e1\") " pod="openstack/nova-cell1-cell-mapping-t8d4k" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.519262 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f9pzj\" (UniqueName: \"kubernetes.io/projected/75715e9f-e72b-4b1c-9b69-870f1766ba64-kube-api-access-f9pzj\") pod \"nova-api-0\" (UID: \"75715e9f-e72b-4b1c-9b69-870f1766ba64\") " pod="openstack/nova-api-0" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.519302 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s95kh\" (UniqueName: \"kubernetes.io/projected/cf7acbc2-7b1e-4156-b9d8-c4172585c2e1-kube-api-access-s95kh\") pod \"nova-cell1-cell-mapping-t8d4k\" (UID: \"cf7acbc2-7b1e-4156-b9d8-c4172585c2e1\") " pod="openstack/nova-cell1-cell-mapping-t8d4k" Mar 
12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.519324 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/75715e9f-e72b-4b1c-9b69-870f1766ba64-public-tls-certs\") pod \"nova-api-0\" (UID: \"75715e9f-e72b-4b1c-9b69-870f1766ba64\") " pod="openstack/nova-api-0" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.519493 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75715e9f-e72b-4b1c-9b69-870f1766ba64-logs\") pod \"nova-api-0\" (UID: \"75715e9f-e72b-4b1c-9b69-870f1766ba64\") " pod="openstack/nova-api-0" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.519578 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf7acbc2-7b1e-4156-b9d8-c4172585c2e1-config-data\") pod \"nova-cell1-cell-mapping-t8d4k\" (UID: \"cf7acbc2-7b1e-4156-b9d8-c4172585c2e1\") " pod="openstack/nova-cell1-cell-mapping-t8d4k" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.519642 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75715e9f-e72b-4b1c-9b69-870f1766ba64-config-data\") pod \"nova-api-0\" (UID: \"75715e9f-e72b-4b1c-9b69-870f1766ba64\") " pod="openstack/nova-api-0" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.519658 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75715e9f-e72b-4b1c-9b69-870f1766ba64-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"75715e9f-e72b-4b1c-9b69-870f1766ba64\") " pod="openstack/nova-api-0" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.519695 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/75715e9f-e72b-4b1c-9b69-870f1766ba64-internal-tls-certs\") pod \"nova-api-0\" (UID: \"75715e9f-e72b-4b1c-9b69-870f1766ba64\") " pod="openstack/nova-api-0" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.519715 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf7acbc2-7b1e-4156-b9d8-c4172585c2e1-scripts\") pod \"nova-cell1-cell-mapping-t8d4k\" (UID: \"cf7acbc2-7b1e-4156-b9d8-c4172585c2e1\") " pod="openstack/nova-cell1-cell-mapping-t8d4k" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.522255 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75715e9f-e72b-4b1c-9b69-870f1766ba64-logs\") pod \"nova-api-0\" (UID: \"75715e9f-e72b-4b1c-9b69-870f1766ba64\") " pod="openstack/nova-api-0" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.525598 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/75715e9f-e72b-4b1c-9b69-870f1766ba64-public-tls-certs\") pod \"nova-api-0\" (UID: \"75715e9f-e72b-4b1c-9b69-870f1766ba64\") " pod="openstack/nova-api-0" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.526184 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75715e9f-e72b-4b1c-9b69-870f1766ba64-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"75715e9f-e72b-4b1c-9b69-870f1766ba64\") " pod="openstack/nova-api-0" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.538464 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/75715e9f-e72b-4b1c-9b69-870f1766ba64-internal-tls-certs\") pod \"nova-api-0\" (UID: \"75715e9f-e72b-4b1c-9b69-870f1766ba64\") " pod="openstack/nova-api-0" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 
17:11:24.540921 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9pzj\" (UniqueName: \"kubernetes.io/projected/75715e9f-e72b-4b1c-9b69-870f1766ba64-kube-api-access-f9pzj\") pod \"nova-api-0\" (UID: \"75715e9f-e72b-4b1c-9b69-870f1766ba64\") " pod="openstack/nova-api-0" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.542076 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75715e9f-e72b-4b1c-9b69-870f1766ba64-config-data\") pod \"nova-api-0\" (UID: \"75715e9f-e72b-4b1c-9b69-870f1766ba64\") " pod="openstack/nova-api-0" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.622095 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf7acbc2-7b1e-4156-b9d8-c4172585c2e1-config-data\") pod \"nova-cell1-cell-mapping-t8d4k\" (UID: \"cf7acbc2-7b1e-4156-b9d8-c4172585c2e1\") " pod="openstack/nova-cell1-cell-mapping-t8d4k" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.622177 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf7acbc2-7b1e-4156-b9d8-c4172585c2e1-scripts\") pod \"nova-cell1-cell-mapping-t8d4k\" (UID: \"cf7acbc2-7b1e-4156-b9d8-c4172585c2e1\") " pod="openstack/nova-cell1-cell-mapping-t8d4k" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.622244 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf7acbc2-7b1e-4156-b9d8-c4172585c2e1-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-t8d4k\" (UID: \"cf7acbc2-7b1e-4156-b9d8-c4172585c2e1\") " pod="openstack/nova-cell1-cell-mapping-t8d4k" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.622644 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s95kh\" (UniqueName: 
\"kubernetes.io/projected/cf7acbc2-7b1e-4156-b9d8-c4172585c2e1-kube-api-access-s95kh\") pod \"nova-cell1-cell-mapping-t8d4k\" (UID: \"cf7acbc2-7b1e-4156-b9d8-c4172585c2e1\") " pod="openstack/nova-cell1-cell-mapping-t8d4k" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.626625 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf7acbc2-7b1e-4156-b9d8-c4172585c2e1-config-data\") pod \"nova-cell1-cell-mapping-t8d4k\" (UID: \"cf7acbc2-7b1e-4156-b9d8-c4172585c2e1\") " pod="openstack/nova-cell1-cell-mapping-t8d4k" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.627134 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf7acbc2-7b1e-4156-b9d8-c4172585c2e1-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-t8d4k\" (UID: \"cf7acbc2-7b1e-4156-b9d8-c4172585c2e1\") " pod="openstack/nova-cell1-cell-mapping-t8d4k" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.639221 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf7acbc2-7b1e-4156-b9d8-c4172585c2e1-scripts\") pod \"nova-cell1-cell-mapping-t8d4k\" (UID: \"cf7acbc2-7b1e-4156-b9d8-c4172585c2e1\") " pod="openstack/nova-cell1-cell-mapping-t8d4k" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.644202 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s95kh\" (UniqueName: \"kubernetes.io/projected/cf7acbc2-7b1e-4156-b9d8-c4172585c2e1-kube-api-access-s95kh\") pod \"nova-cell1-cell-mapping-t8d4k\" (UID: \"cf7acbc2-7b1e-4156-b9d8-c4172585c2e1\") " pod="openstack/nova-cell1-cell-mapping-t8d4k" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.647728 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-t8d4k" Mar 12 17:11:24 crc kubenswrapper[5184]: I0312 17:11:24.709108 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 12 17:11:25 crc kubenswrapper[5184]: I0312 17:11:25.179641 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ac21ef93-e11b-4523-bc45-ca6b734b3a8d","Type":"ContainerStarted","Data":"d9ee962140b075090cf87ba33352ecb439e941b141e12b61b8832801e7f1849c"} Mar 12 17:11:25 crc kubenswrapper[5184]: I0312 17:11:25.180727 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/ceilometer-0" Mar 12 17:11:25 crc kubenswrapper[5184]: I0312 17:11:25.179710 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ac21ef93-e11b-4523-bc45-ca6b734b3a8d" containerName="proxy-httpd" containerID="cri-o://d9ee962140b075090cf87ba33352ecb439e941b141e12b61b8832801e7f1849c" gracePeriod=30 Mar 12 17:11:25 crc kubenswrapper[5184]: I0312 17:11:25.179723 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ac21ef93-e11b-4523-bc45-ca6b734b3a8d" containerName="sg-core" containerID="cri-o://f2acdcc2c9a23af37b52dceeeb981a9988991051606ce8bada0dbc9f8a458c15" gracePeriod=30 Mar 12 17:11:25 crc kubenswrapper[5184]: I0312 17:11:25.179733 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ac21ef93-e11b-4523-bc45-ca6b734b3a8d" containerName="ceilometer-notification-agent" containerID="cri-o://496b71038c00ee771db0a7a91843075dc55c91e395a3eedd946e133d09b8b87e" gracePeriod=30 Mar 12 17:11:25 crc kubenswrapper[5184]: I0312 17:11:25.179664 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ac21ef93-e11b-4523-bc45-ca6b734b3a8d" 
containerName="ceilometer-central-agent" containerID="cri-o://ddaf4697111ce1cc4cb4051d8a0af0b01d1b66b76a5937bc3697bb5a0a7be36e" gracePeriod=30 Mar 12 17:11:25 crc kubenswrapper[5184]: I0312 17:11:25.208487 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.8543855630000001 podStartE2EDuration="6.208347646s" podCreationTimestamp="2026-03-12 17:11:19 +0000 UTC" firstStartedPulling="2026-03-12 17:11:20.064975545 +0000 UTC m=+1222.606286884" lastFinishedPulling="2026-03-12 17:11:24.418937628 +0000 UTC m=+1226.960248967" observedRunningTime="2026-03-12 17:11:25.203989651 +0000 UTC m=+1227.745301010" watchObservedRunningTime="2026-03-12 17:11:25.208347646 +0000 UTC m=+1227.749658995" Mar 12 17:11:25 crc kubenswrapper[5184]: I0312 17:11:25.294000 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-t8d4k"] Mar 12 17:11:25 crc kubenswrapper[5184]: W0312 17:11:25.299479 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf7acbc2_7b1e_4156_b9d8_c4172585c2e1.slice/crio-b5309aeeb66ff7db769893e7994bcae76a55a2882ac9183cbb7310085154d274 WatchSource:0}: Error finding container b5309aeeb66ff7db769893e7994bcae76a55a2882ac9183cbb7310085154d274: Status 404 returned error can't find the container with id b5309aeeb66ff7db769893e7994bcae76a55a2882ac9183cbb7310085154d274 Mar 12 17:11:25 crc kubenswrapper[5184]: I0312 17:11:25.417662 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 12 17:11:25 crc kubenswrapper[5184]: W0312 17:11:25.423271 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75715e9f_e72b_4b1c_9b69_870f1766ba64.slice/crio-66fdc047827a83332755c431e201c0dee0d6cf6ff590e79e41dedd4b2eda64e0 WatchSource:0}: Error finding container 
66fdc047827a83332755c431e201c0dee0d6cf6ff590e79e41dedd4b2eda64e0: Status 404 returned error can't find the container with id 66fdc047827a83332755c431e201c0dee0d6cf6ff590e79e41dedd4b2eda64e0 Mar 12 17:11:26 crc kubenswrapper[5184]: I0312 17:11:26.123731 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5598b9c58f-84z9t" Mar 12 17:11:26 crc kubenswrapper[5184]: I0312 17:11:26.187919 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8458c54c8c-8c75q"] Mar 12 17:11:26 crc kubenswrapper[5184]: I0312 17:11:26.189008 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-t8d4k" event={"ID":"cf7acbc2-7b1e-4156-b9d8-c4172585c2e1","Type":"ContainerStarted","Data":"3b86f8065281c734088f1750bdf4a5df82ac5321efe2e8b0e2f29801763c9c22"} Mar 12 17:11:26 crc kubenswrapper[5184]: I0312 17:11:26.189119 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-t8d4k" event={"ID":"cf7acbc2-7b1e-4156-b9d8-c4172585c2e1","Type":"ContainerStarted","Data":"b5309aeeb66ff7db769893e7994bcae76a55a2882ac9183cbb7310085154d274"} Mar 12 17:11:26 crc kubenswrapper[5184]: I0312 17:11:26.189504 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8458c54c8c-8c75q" podUID="051f761e-6e70-40a2-a1ac-55d668527483" containerName="dnsmasq-dns" containerID="cri-o://18cdfeb1c9b8f3c0ce5b465139edc004b0b0133d44239b5e3d1d0f0f57966b9b" gracePeriod=10 Mar 12 17:11:26 crc kubenswrapper[5184]: I0312 17:11:26.202550 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"75715e9f-e72b-4b1c-9b69-870f1766ba64","Type":"ContainerStarted","Data":"f01cc0ac68f9e0487524b117b0f9210c98435e61dcf1c57e3836cc55d831c5d4"} Mar 12 17:11:26 crc kubenswrapper[5184]: I0312 17:11:26.202608 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"75715e9f-e72b-4b1c-9b69-870f1766ba64","Type":"ContainerStarted","Data":"db13cca6877658091d981a44208d6715ba2e2e4644ccb3f33aaac650ccb32d39"} Mar 12 17:11:26 crc kubenswrapper[5184]: I0312 17:11:26.202627 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"75715e9f-e72b-4b1c-9b69-870f1766ba64","Type":"ContainerStarted","Data":"66fdc047827a83332755c431e201c0dee0d6cf6ff590e79e41dedd4b2eda64e0"} Mar 12 17:11:26 crc kubenswrapper[5184]: I0312 17:11:26.227285 5184 generic.go:358] "Generic (PLEG): container finished" podID="ac21ef93-e11b-4523-bc45-ca6b734b3a8d" containerID="d9ee962140b075090cf87ba33352ecb439e941b141e12b61b8832801e7f1849c" exitCode=0 Mar 12 17:11:26 crc kubenswrapper[5184]: I0312 17:11:26.227326 5184 generic.go:358] "Generic (PLEG): container finished" podID="ac21ef93-e11b-4523-bc45-ca6b734b3a8d" containerID="f2acdcc2c9a23af37b52dceeeb981a9988991051606ce8bada0dbc9f8a458c15" exitCode=2 Mar 12 17:11:26 crc kubenswrapper[5184]: I0312 17:11:26.227336 5184 generic.go:358] "Generic (PLEG): container finished" podID="ac21ef93-e11b-4523-bc45-ca6b734b3a8d" containerID="496b71038c00ee771db0a7a91843075dc55c91e395a3eedd946e133d09b8b87e" exitCode=0 Mar 12 17:11:26 crc kubenswrapper[5184]: I0312 17:11:26.227623 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ac21ef93-e11b-4523-bc45-ca6b734b3a8d","Type":"ContainerDied","Data":"d9ee962140b075090cf87ba33352ecb439e941b141e12b61b8832801e7f1849c"} Mar 12 17:11:26 crc kubenswrapper[5184]: I0312 17:11:26.227666 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ac21ef93-e11b-4523-bc45-ca6b734b3a8d","Type":"ContainerDied","Data":"f2acdcc2c9a23af37b52dceeeb981a9988991051606ce8bada0dbc9f8a458c15"} Mar 12 17:11:26 crc kubenswrapper[5184]: I0312 17:11:26.227680 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ac21ef93-e11b-4523-bc45-ca6b734b3a8d","Type":"ContainerDied","Data":"496b71038c00ee771db0a7a91843075dc55c91e395a3eedd946e133d09b8b87e"} Mar 12 17:11:26 crc kubenswrapper[5184]: I0312 17:11:26.228808 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-t8d4k" podStartSLOduration=2.228767828 podStartE2EDuration="2.228767828s" podCreationTimestamp="2026-03-12 17:11:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:11:26.215615559 +0000 UTC m=+1228.756926908" watchObservedRunningTime="2026-03-12 17:11:26.228767828 +0000 UTC m=+1228.770079167" Mar 12 17:11:26 crc kubenswrapper[5184]: I0312 17:11:26.260582 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.260565937 podStartE2EDuration="2.260565937s" podCreationTimestamp="2026-03-12 17:11:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:11:26.250778973 +0000 UTC m=+1228.792090312" watchObservedRunningTime="2026-03-12 17:11:26.260565937 +0000 UTC m=+1228.801877276" Mar 12 17:11:26 crc kubenswrapper[5184]: I0312 17:11:26.802885 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8458c54c8c-8c75q" Mar 12 17:11:26 crc kubenswrapper[5184]: I0312 17:11:26.887529 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntccv\" (UniqueName: \"kubernetes.io/projected/051f761e-6e70-40a2-a1ac-55d668527483-kube-api-access-ntccv\") pod \"051f761e-6e70-40a2-a1ac-55d668527483\" (UID: \"051f761e-6e70-40a2-a1ac-55d668527483\") " Mar 12 17:11:26 crc kubenswrapper[5184]: I0312 17:11:26.887576 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/051f761e-6e70-40a2-a1ac-55d668527483-ovsdbserver-nb\") pod \"051f761e-6e70-40a2-a1ac-55d668527483\" (UID: \"051f761e-6e70-40a2-a1ac-55d668527483\") " Mar 12 17:11:26 crc kubenswrapper[5184]: I0312 17:11:26.887655 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/051f761e-6e70-40a2-a1ac-55d668527483-dns-swift-storage-0\") pod \"051f761e-6e70-40a2-a1ac-55d668527483\" (UID: \"051f761e-6e70-40a2-a1ac-55d668527483\") " Mar 12 17:11:26 crc kubenswrapper[5184]: I0312 17:11:26.887728 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/051f761e-6e70-40a2-a1ac-55d668527483-ovsdbserver-sb\") pod \"051f761e-6e70-40a2-a1ac-55d668527483\" (UID: \"051f761e-6e70-40a2-a1ac-55d668527483\") " Mar 12 17:11:26 crc kubenswrapper[5184]: I0312 17:11:26.887785 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/051f761e-6e70-40a2-a1ac-55d668527483-config\") pod \"051f761e-6e70-40a2-a1ac-55d668527483\" (UID: \"051f761e-6e70-40a2-a1ac-55d668527483\") " Mar 12 17:11:26 crc kubenswrapper[5184]: I0312 17:11:26.887811 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/051f761e-6e70-40a2-a1ac-55d668527483-dns-svc\") pod \"051f761e-6e70-40a2-a1ac-55d668527483\" (UID: \"051f761e-6e70-40a2-a1ac-55d668527483\") " Mar 12 17:11:26 crc kubenswrapper[5184]: I0312 17:11:26.896407 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/051f761e-6e70-40a2-a1ac-55d668527483-kube-api-access-ntccv" (OuterVolumeSpecName: "kube-api-access-ntccv") pod "051f761e-6e70-40a2-a1ac-55d668527483" (UID: "051f761e-6e70-40a2-a1ac-55d668527483"). InnerVolumeSpecName "kube-api-access-ntccv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:11:26 crc kubenswrapper[5184]: I0312 17:11:26.939315 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/051f761e-6e70-40a2-a1ac-55d668527483-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "051f761e-6e70-40a2-a1ac-55d668527483" (UID: "051f761e-6e70-40a2-a1ac-55d668527483"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:11:26 crc kubenswrapper[5184]: I0312 17:11:26.940219 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/051f761e-6e70-40a2-a1ac-55d668527483-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "051f761e-6e70-40a2-a1ac-55d668527483" (UID: "051f761e-6e70-40a2-a1ac-55d668527483"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:11:26 crc kubenswrapper[5184]: I0312 17:11:26.941705 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/051f761e-6e70-40a2-a1ac-55d668527483-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "051f761e-6e70-40a2-a1ac-55d668527483" (UID: "051f761e-6e70-40a2-a1ac-55d668527483"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:11:26 crc kubenswrapper[5184]: I0312 17:11:26.943024 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/051f761e-6e70-40a2-a1ac-55d668527483-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "051f761e-6e70-40a2-a1ac-55d668527483" (UID: "051f761e-6e70-40a2-a1ac-55d668527483"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:11:26 crc kubenswrapper[5184]: I0312 17:11:26.944283 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/051f761e-6e70-40a2-a1ac-55d668527483-config" (OuterVolumeSpecName: "config") pod "051f761e-6e70-40a2-a1ac-55d668527483" (UID: "051f761e-6e70-40a2-a1ac-55d668527483"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:11:26 crc kubenswrapper[5184]: I0312 17:11:26.989727 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ntccv\" (UniqueName: \"kubernetes.io/projected/051f761e-6e70-40a2-a1ac-55d668527483-kube-api-access-ntccv\") on node \"crc\" DevicePath \"\"" Mar 12 17:11:26 crc kubenswrapper[5184]: I0312 17:11:26.989980 5184 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/051f761e-6e70-40a2-a1ac-55d668527483-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 17:11:26 crc kubenswrapper[5184]: I0312 17:11:26.989989 5184 reconciler_common.go:299] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/051f761e-6e70-40a2-a1ac-55d668527483-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 12 17:11:26 crc kubenswrapper[5184]: I0312 17:11:26.989997 5184 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/051f761e-6e70-40a2-a1ac-55d668527483-ovsdbserver-sb\") on 
node \"crc\" DevicePath \"\"" Mar 12 17:11:26 crc kubenswrapper[5184]: I0312 17:11:26.990006 5184 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/051f761e-6e70-40a2-a1ac-55d668527483-config\") on node \"crc\" DevicePath \"\"" Mar 12 17:11:26 crc kubenswrapper[5184]: I0312 17:11:26.990014 5184 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/051f761e-6e70-40a2-a1ac-55d668527483-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 17:11:27 crc kubenswrapper[5184]: I0312 17:11:27.242199 5184 generic.go:358] "Generic (PLEG): container finished" podID="051f761e-6e70-40a2-a1ac-55d668527483" containerID="18cdfeb1c9b8f3c0ce5b465139edc004b0b0133d44239b5e3d1d0f0f57966b9b" exitCode=0 Mar 12 17:11:27 crc kubenswrapper[5184]: I0312 17:11:27.242750 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8458c54c8c-8c75q" event={"ID":"051f761e-6e70-40a2-a1ac-55d668527483","Type":"ContainerDied","Data":"18cdfeb1c9b8f3c0ce5b465139edc004b0b0133d44239b5e3d1d0f0f57966b9b"} Mar 12 17:11:27 crc kubenswrapper[5184]: I0312 17:11:27.243467 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8458c54c8c-8c75q" event={"ID":"051f761e-6e70-40a2-a1ac-55d668527483","Type":"ContainerDied","Data":"ad7cdfa7a667d161e6487aac140d0b0bb662df15d7aa1b4d187d47a457b03608"} Mar 12 17:11:27 crc kubenswrapper[5184]: I0312 17:11:27.243551 5184 scope.go:117] "RemoveContainer" containerID="18cdfeb1c9b8f3c0ce5b465139edc004b0b0133d44239b5e3d1d0f0f57966b9b" Mar 12 17:11:27 crc kubenswrapper[5184]: I0312 17:11:27.242795 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8458c54c8c-8c75q" Mar 12 17:11:27 crc kubenswrapper[5184]: I0312 17:11:27.254349 5184 generic.go:358] "Generic (PLEG): container finished" podID="ac21ef93-e11b-4523-bc45-ca6b734b3a8d" containerID="ddaf4697111ce1cc4cb4051d8a0af0b01d1b66b76a5937bc3697bb5a0a7be36e" exitCode=0 Mar 12 17:11:27 crc kubenswrapper[5184]: I0312 17:11:27.254689 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ac21ef93-e11b-4523-bc45-ca6b734b3a8d","Type":"ContainerDied","Data":"ddaf4697111ce1cc4cb4051d8a0af0b01d1b66b76a5937bc3697bb5a0a7be36e"} Mar 12 17:11:27 crc kubenswrapper[5184]: I0312 17:11:27.280478 5184 scope.go:117] "RemoveContainer" containerID="5644f2b28d31d310f0d59b4034dd49b4fdf04eba4ee022cdcdaff5b06136f268" Mar 12 17:11:27 crc kubenswrapper[5184]: I0312 17:11:27.302583 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8458c54c8c-8c75q"] Mar 12 17:11:27 crc kubenswrapper[5184]: I0312 17:11:27.317337 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8458c54c8c-8c75q"] Mar 12 17:11:27 crc kubenswrapper[5184]: I0312 17:11:27.334237 5184 scope.go:117] "RemoveContainer" containerID="18cdfeb1c9b8f3c0ce5b465139edc004b0b0133d44239b5e3d1d0f0f57966b9b" Mar 12 17:11:27 crc kubenswrapper[5184]: E0312 17:11:27.334852 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18cdfeb1c9b8f3c0ce5b465139edc004b0b0133d44239b5e3d1d0f0f57966b9b\": container with ID starting with 18cdfeb1c9b8f3c0ce5b465139edc004b0b0133d44239b5e3d1d0f0f57966b9b not found: ID does not exist" containerID="18cdfeb1c9b8f3c0ce5b465139edc004b0b0133d44239b5e3d1d0f0f57966b9b" Mar 12 17:11:27 crc kubenswrapper[5184]: I0312 17:11:27.334891 5184 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"18cdfeb1c9b8f3c0ce5b465139edc004b0b0133d44239b5e3d1d0f0f57966b9b"} err="failed to get container status \"18cdfeb1c9b8f3c0ce5b465139edc004b0b0133d44239b5e3d1d0f0f57966b9b\": rpc error: code = NotFound desc = could not find container \"18cdfeb1c9b8f3c0ce5b465139edc004b0b0133d44239b5e3d1d0f0f57966b9b\": container with ID starting with 18cdfeb1c9b8f3c0ce5b465139edc004b0b0133d44239b5e3d1d0f0f57966b9b not found: ID does not exist" Mar 12 17:11:27 crc kubenswrapper[5184]: I0312 17:11:27.334908 5184 scope.go:117] "RemoveContainer" containerID="5644f2b28d31d310f0d59b4034dd49b4fdf04eba4ee022cdcdaff5b06136f268" Mar 12 17:11:27 crc kubenswrapper[5184]: E0312 17:11:27.335417 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5644f2b28d31d310f0d59b4034dd49b4fdf04eba4ee022cdcdaff5b06136f268\": container with ID starting with 5644f2b28d31d310f0d59b4034dd49b4fdf04eba4ee022cdcdaff5b06136f268 not found: ID does not exist" containerID="5644f2b28d31d310f0d59b4034dd49b4fdf04eba4ee022cdcdaff5b06136f268" Mar 12 17:11:27 crc kubenswrapper[5184]: I0312 17:11:27.335490 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5644f2b28d31d310f0d59b4034dd49b4fdf04eba4ee022cdcdaff5b06136f268"} err="failed to get container status \"5644f2b28d31d310f0d59b4034dd49b4fdf04eba4ee022cdcdaff5b06136f268\": rpc error: code = NotFound desc = could not find container \"5644f2b28d31d310f0d59b4034dd49b4fdf04eba4ee022cdcdaff5b06136f268\": container with ID starting with 5644f2b28d31d310f0d59b4034dd49b4fdf04eba4ee022cdcdaff5b06136f268 not found: ID does not exist" Mar 12 17:11:27 crc kubenswrapper[5184]: I0312 17:11:27.536242 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 12 17:11:27 crc kubenswrapper[5184]: I0312 17:11:27.702883 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-combined-ca-bundle\") pod \"ac21ef93-e11b-4523-bc45-ca6b734b3a8d\" (UID: \"ac21ef93-e11b-4523-bc45-ca6b734b3a8d\") " Mar 12 17:11:27 crc kubenswrapper[5184]: I0312 17:11:27.703229 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-scripts\") pod \"ac21ef93-e11b-4523-bc45-ca6b734b3a8d\" (UID: \"ac21ef93-e11b-4523-bc45-ca6b734b3a8d\") " Mar 12 17:11:27 crc kubenswrapper[5184]: I0312 17:11:27.703346 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-sg-core-conf-yaml\") pod \"ac21ef93-e11b-4523-bc45-ca6b734b3a8d\" (UID: \"ac21ef93-e11b-4523-bc45-ca6b734b3a8d\") " Mar 12 17:11:27 crc kubenswrapper[5184]: I0312 17:11:27.703568 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-config-data\") pod \"ac21ef93-e11b-4523-bc45-ca6b734b3a8d\" (UID: \"ac21ef93-e11b-4523-bc45-ca6b734b3a8d\") " Mar 12 17:11:27 crc kubenswrapper[5184]: I0312 17:11:27.703599 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5r55k\" (UniqueName: \"kubernetes.io/projected/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-kube-api-access-5r55k\") pod \"ac21ef93-e11b-4523-bc45-ca6b734b3a8d\" (UID: \"ac21ef93-e11b-4523-bc45-ca6b734b3a8d\") " Mar 12 17:11:27 crc kubenswrapper[5184]: I0312 17:11:27.703653 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-run-httpd\") pod \"ac21ef93-e11b-4523-bc45-ca6b734b3a8d\" (UID: \"ac21ef93-e11b-4523-bc45-ca6b734b3a8d\") " Mar 12 17:11:27 crc kubenswrapper[5184]: I0312 17:11:27.703697 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-ceilometer-tls-certs\") pod \"ac21ef93-e11b-4523-bc45-ca6b734b3a8d\" (UID: \"ac21ef93-e11b-4523-bc45-ca6b734b3a8d\") " Mar 12 17:11:27 crc kubenswrapper[5184]: I0312 17:11:27.703819 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-log-httpd\") pod \"ac21ef93-e11b-4523-bc45-ca6b734b3a8d\" (UID: \"ac21ef93-e11b-4523-bc45-ca6b734b3a8d\") " Mar 12 17:11:27 crc kubenswrapper[5184]: I0312 17:11:27.704461 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ac21ef93-e11b-4523-bc45-ca6b734b3a8d" (UID: "ac21ef93-e11b-4523-bc45-ca6b734b3a8d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:11:27 crc kubenswrapper[5184]: I0312 17:11:27.705098 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ac21ef93-e11b-4523-bc45-ca6b734b3a8d" (UID: "ac21ef93-e11b-4523-bc45-ca6b734b3a8d"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:11:27 crc kubenswrapper[5184]: I0312 17:11:27.709453 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-scripts" (OuterVolumeSpecName: "scripts") pod "ac21ef93-e11b-4523-bc45-ca6b734b3a8d" (UID: "ac21ef93-e11b-4523-bc45-ca6b734b3a8d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:11:27 crc kubenswrapper[5184]: I0312 17:11:27.709606 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-kube-api-access-5r55k" (OuterVolumeSpecName: "kube-api-access-5r55k") pod "ac21ef93-e11b-4523-bc45-ca6b734b3a8d" (UID: "ac21ef93-e11b-4523-bc45-ca6b734b3a8d"). InnerVolumeSpecName "kube-api-access-5r55k". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:11:27 crc kubenswrapper[5184]: I0312 17:11:27.756938 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ac21ef93-e11b-4523-bc45-ca6b734b3a8d" (UID: "ac21ef93-e11b-4523-bc45-ca6b734b3a8d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:11:27 crc kubenswrapper[5184]: I0312 17:11:27.766298 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "ac21ef93-e11b-4523-bc45-ca6b734b3a8d" (UID: "ac21ef93-e11b-4523-bc45-ca6b734b3a8d"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:11:27 crc kubenswrapper[5184]: I0312 17:11:27.796114 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ac21ef93-e11b-4523-bc45-ca6b734b3a8d" (UID: "ac21ef93-e11b-4523-bc45-ca6b734b3a8d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:11:27 crc kubenswrapper[5184]: I0312 17:11:27.806639 5184 reconciler_common.go:299] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 12 17:11:27 crc kubenswrapper[5184]: I0312 17:11:27.806677 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5r55k\" (UniqueName: \"kubernetes.io/projected/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-kube-api-access-5r55k\") on node \"crc\" DevicePath \"\"" Mar 12 17:11:27 crc kubenswrapper[5184]: I0312 17:11:27.806690 5184 reconciler_common.go:299] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 12 17:11:27 crc kubenswrapper[5184]: I0312 17:11:27.806698 5184 reconciler_common.go:299] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 17:11:27 crc kubenswrapper[5184]: I0312 17:11:27.806708 5184 reconciler_common.go:299] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 12 17:11:27 crc kubenswrapper[5184]: I0312 17:11:27.806717 5184 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 17:11:27 crc kubenswrapper[5184]: I0312 17:11:27.806725 5184 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 17:11:27 crc kubenswrapper[5184]: I0312 17:11:27.844863 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-config-data" (OuterVolumeSpecName: "config-data") pod "ac21ef93-e11b-4523-bc45-ca6b734b3a8d" (UID: "ac21ef93-e11b-4523-bc45-ca6b734b3a8d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:11:27 crc kubenswrapper[5184]: I0312 17:11:27.908965 5184 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac21ef93-e11b-4523-bc45-ca6b734b3a8d-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.095752 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.294141 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ac21ef93-e11b-4523-bc45-ca6b734b3a8d","Type":"ContainerDied","Data":"fb68dfe730a64af5736d28b624b5846ae3096e520e0b4e84d5a87c3bc7a96011"} Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.294199 5184 scope.go:117] "RemoveContainer" containerID="d9ee962140b075090cf87ba33352ecb439e941b141e12b61b8832801e7f1849c" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.294230 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.339296 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.341008 5184 scope.go:117] "RemoveContainer" containerID="f2acdcc2c9a23af37b52dceeeb981a9988991051606ce8bada0dbc9f8a458c15" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.347915 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.368254 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.369531 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ac21ef93-e11b-4523-bc45-ca6b734b3a8d" containerName="ceilometer-central-agent" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.369551 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac21ef93-e11b-4523-bc45-ca6b734b3a8d" containerName="ceilometer-central-agent" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.369568 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ac21ef93-e11b-4523-bc45-ca6b734b3a8d" containerName="sg-core" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.369576 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac21ef93-e11b-4523-bc45-ca6b734b3a8d" containerName="sg-core" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.369621 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="051f761e-6e70-40a2-a1ac-55d668527483" containerName="init" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.369630 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="051f761e-6e70-40a2-a1ac-55d668527483" containerName="init" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.369646 5184 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="ac21ef93-e11b-4523-bc45-ca6b734b3a8d" containerName="ceilometer-notification-agent" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.369654 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac21ef93-e11b-4523-bc45-ca6b734b3a8d" containerName="ceilometer-notification-agent" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.369667 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="051f761e-6e70-40a2-a1ac-55d668527483" containerName="dnsmasq-dns" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.369674 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="051f761e-6e70-40a2-a1ac-55d668527483" containerName="dnsmasq-dns" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.369705 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ac21ef93-e11b-4523-bc45-ca6b734b3a8d" containerName="proxy-httpd" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.369713 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac21ef93-e11b-4523-bc45-ca6b734b3a8d" containerName="proxy-httpd" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.369913 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="ac21ef93-e11b-4523-bc45-ca6b734b3a8d" containerName="sg-core" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.369940 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="051f761e-6e70-40a2-a1ac-55d668527483" containerName="dnsmasq-dns" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.369954 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="ac21ef93-e11b-4523-bc45-ca6b734b3a8d" containerName="ceilometer-central-agent" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.369964 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="ac21ef93-e11b-4523-bc45-ca6b734b3a8d" containerName="ceilometer-notification-agent" Mar 12 17:11:28 crc 
kubenswrapper[5184]: I0312 17:11:28.369975 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="ac21ef93-e11b-4523-bc45-ca6b734b3a8d" containerName="proxy-httpd" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.379283 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.382159 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ceilometer-scripts\"" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.382506 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"ceilometer-config-data\"" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.382825 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-ceilometer-internal-svc\"" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.386557 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.398417 5184 scope.go:117] "RemoveContainer" containerID="496b71038c00ee771db0a7a91843075dc55c91e395a3eedd946e133d09b8b87e" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.419000 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="051f761e-6e70-40a2-a1ac-55d668527483" path="/var/lib/kubelet/pods/051f761e-6e70-40a2-a1ac-55d668527483/volumes" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.421192 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac21ef93-e11b-4523-bc45-ca6b734b3a8d" path="/var/lib/kubelet/pods/ac21ef93-e11b-4523-bc45-ca6b734b3a8d/volumes" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.432370 5184 scope.go:117] "RemoveContainer" containerID="ddaf4697111ce1cc4cb4051d8a0af0b01d1b66b76a5937bc3697bb5a0a7be36e" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.524055 5184 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/45b25746-82f1-4bdf-8246-2f8ef1514dba-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"45b25746-82f1-4bdf-8246-2f8ef1514dba\") " pod="openstack/ceilometer-0" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.524111 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45b25746-82f1-4bdf-8246-2f8ef1514dba-scripts\") pod \"ceilometer-0\" (UID: \"45b25746-82f1-4bdf-8246-2f8ef1514dba\") " pod="openstack/ceilometer-0" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.524133 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45b25746-82f1-4bdf-8246-2f8ef1514dba-config-data\") pod \"ceilometer-0\" (UID: \"45b25746-82f1-4bdf-8246-2f8ef1514dba\") " pod="openstack/ceilometer-0" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.524153 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45b25746-82f1-4bdf-8246-2f8ef1514dba-run-httpd\") pod \"ceilometer-0\" (UID: \"45b25746-82f1-4bdf-8246-2f8ef1514dba\") " pod="openstack/ceilometer-0" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.524254 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45b25746-82f1-4bdf-8246-2f8ef1514dba-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"45b25746-82f1-4bdf-8246-2f8ef1514dba\") " pod="openstack/ceilometer-0" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.524427 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/45b25746-82f1-4bdf-8246-2f8ef1514dba-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"45b25746-82f1-4bdf-8246-2f8ef1514dba\") " pod="openstack/ceilometer-0" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.524455 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d52l6\" (UniqueName: \"kubernetes.io/projected/45b25746-82f1-4bdf-8246-2f8ef1514dba-kube-api-access-d52l6\") pod \"ceilometer-0\" (UID: \"45b25746-82f1-4bdf-8246-2f8ef1514dba\") " pod="openstack/ceilometer-0" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.524550 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45b25746-82f1-4bdf-8246-2f8ef1514dba-log-httpd\") pod \"ceilometer-0\" (UID: \"45b25746-82f1-4bdf-8246-2f8ef1514dba\") " pod="openstack/ceilometer-0" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.626356 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/45b25746-82f1-4bdf-8246-2f8ef1514dba-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"45b25746-82f1-4bdf-8246-2f8ef1514dba\") " pod="openstack/ceilometer-0" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.626432 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d52l6\" (UniqueName: \"kubernetes.io/projected/45b25746-82f1-4bdf-8246-2f8ef1514dba-kube-api-access-d52l6\") pod \"ceilometer-0\" (UID: \"45b25746-82f1-4bdf-8246-2f8ef1514dba\") " pod="openstack/ceilometer-0" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.626527 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45b25746-82f1-4bdf-8246-2f8ef1514dba-log-httpd\") pod \"ceilometer-0\" (UID: \"45b25746-82f1-4bdf-8246-2f8ef1514dba\") " 
pod="openstack/ceilometer-0" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.626592 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/45b25746-82f1-4bdf-8246-2f8ef1514dba-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"45b25746-82f1-4bdf-8246-2f8ef1514dba\") " pod="openstack/ceilometer-0" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.626626 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45b25746-82f1-4bdf-8246-2f8ef1514dba-scripts\") pod \"ceilometer-0\" (UID: \"45b25746-82f1-4bdf-8246-2f8ef1514dba\") " pod="openstack/ceilometer-0" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.626653 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45b25746-82f1-4bdf-8246-2f8ef1514dba-config-data\") pod \"ceilometer-0\" (UID: \"45b25746-82f1-4bdf-8246-2f8ef1514dba\") " pod="openstack/ceilometer-0" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.626827 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45b25746-82f1-4bdf-8246-2f8ef1514dba-run-httpd\") pod \"ceilometer-0\" (UID: \"45b25746-82f1-4bdf-8246-2f8ef1514dba\") " pod="openstack/ceilometer-0" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.626894 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45b25746-82f1-4bdf-8246-2f8ef1514dba-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"45b25746-82f1-4bdf-8246-2f8ef1514dba\") " pod="openstack/ceilometer-0" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.628123 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/45b25746-82f1-4bdf-8246-2f8ef1514dba-log-httpd\") pod \"ceilometer-0\" (UID: \"45b25746-82f1-4bdf-8246-2f8ef1514dba\") " pod="openstack/ceilometer-0" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.629836 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45b25746-82f1-4bdf-8246-2f8ef1514dba-run-httpd\") pod \"ceilometer-0\" (UID: \"45b25746-82f1-4bdf-8246-2f8ef1514dba\") " pod="openstack/ceilometer-0" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.633399 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/45b25746-82f1-4bdf-8246-2f8ef1514dba-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"45b25746-82f1-4bdf-8246-2f8ef1514dba\") " pod="openstack/ceilometer-0" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.633745 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/45b25746-82f1-4bdf-8246-2f8ef1514dba-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"45b25746-82f1-4bdf-8246-2f8ef1514dba\") " pod="openstack/ceilometer-0" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.633973 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45b25746-82f1-4bdf-8246-2f8ef1514dba-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"45b25746-82f1-4bdf-8246-2f8ef1514dba\") " pod="openstack/ceilometer-0" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.634241 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45b25746-82f1-4bdf-8246-2f8ef1514dba-config-data\") pod \"ceilometer-0\" (UID: \"45b25746-82f1-4bdf-8246-2f8ef1514dba\") " pod="openstack/ceilometer-0" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.636698 5184 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45b25746-82f1-4bdf-8246-2f8ef1514dba-scripts\") pod \"ceilometer-0\" (UID: \"45b25746-82f1-4bdf-8246-2f8ef1514dba\") " pod="openstack/ceilometer-0" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.650547 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d52l6\" (UniqueName: \"kubernetes.io/projected/45b25746-82f1-4bdf-8246-2f8ef1514dba-kube-api-access-d52l6\") pod \"ceilometer-0\" (UID: \"45b25746-82f1-4bdf-8246-2f8ef1514dba\") " pod="openstack/ceilometer-0" Mar 12 17:11:28 crc kubenswrapper[5184]: I0312 17:11:28.728293 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 17:11:29 crc kubenswrapper[5184]: W0312 17:11:29.225732 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45b25746_82f1_4bdf_8246_2f8ef1514dba.slice/crio-d680be5d4dca65456daa913b55bb1a79f7027bb94e109a40eba9d413ec3fc1c6 WatchSource:0}: Error finding container d680be5d4dca65456daa913b55bb1a79f7027bb94e109a40eba9d413ec3fc1c6: Status 404 returned error can't find the container with id d680be5d4dca65456daa913b55bb1a79f7027bb94e109a40eba9d413ec3fc1c6 Mar 12 17:11:29 crc kubenswrapper[5184]: I0312 17:11:29.226244 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 17:11:29 crc kubenswrapper[5184]: I0312 17:11:29.306837 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"45b25746-82f1-4bdf-8246-2f8ef1514dba","Type":"ContainerStarted","Data":"d680be5d4dca65456daa913b55bb1a79f7027bb94e109a40eba9d413ec3fc1c6"} Mar 12 17:11:30 crc kubenswrapper[5184]: I0312 17:11:30.319893 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"45b25746-82f1-4bdf-8246-2f8ef1514dba","Type":"ContainerStarted","Data":"a710797c68ffa69f27ed327cf7408abc9481764e183e3e5b9cadebad07344092"} Mar 12 17:11:31 crc kubenswrapper[5184]: I0312 17:11:31.330598 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"45b25746-82f1-4bdf-8246-2f8ef1514dba","Type":"ContainerStarted","Data":"5388e22de40c28661307ebcb7f9d2a65ee6c0d70971c56d842a600ae042f3453"} Mar 12 17:11:31 crc kubenswrapper[5184]: I0312 17:11:31.332184 5184 generic.go:358] "Generic (PLEG): container finished" podID="cf7acbc2-7b1e-4156-b9d8-c4172585c2e1" containerID="3b86f8065281c734088f1750bdf4a5df82ac5321efe2e8b0e2f29801763c9c22" exitCode=0 Mar 12 17:11:31 crc kubenswrapper[5184]: I0312 17:11:31.332313 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-t8d4k" event={"ID":"cf7acbc2-7b1e-4156-b9d8-c4172585c2e1","Type":"ContainerDied","Data":"3b86f8065281c734088f1750bdf4a5df82ac5321efe2e8b0e2f29801763c9c22"} Mar 12 17:11:32 crc kubenswrapper[5184]: I0312 17:11:32.343243 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"45b25746-82f1-4bdf-8246-2f8ef1514dba","Type":"ContainerStarted","Data":"342dc66d42d562422aa47f425d33ca2137f34c00d1333ff56d20bf0d6fbae716"} Mar 12 17:11:32 crc kubenswrapper[5184]: I0312 17:11:32.754864 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-t8d4k" Mar 12 17:11:32 crc kubenswrapper[5184]: I0312 17:11:32.809347 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf7acbc2-7b1e-4156-b9d8-c4172585c2e1-config-data\") pod \"cf7acbc2-7b1e-4156-b9d8-c4172585c2e1\" (UID: \"cf7acbc2-7b1e-4156-b9d8-c4172585c2e1\") " Mar 12 17:11:32 crc kubenswrapper[5184]: I0312 17:11:32.809939 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s95kh\" (UniqueName: \"kubernetes.io/projected/cf7acbc2-7b1e-4156-b9d8-c4172585c2e1-kube-api-access-s95kh\") pod \"cf7acbc2-7b1e-4156-b9d8-c4172585c2e1\" (UID: \"cf7acbc2-7b1e-4156-b9d8-c4172585c2e1\") " Mar 12 17:11:32 crc kubenswrapper[5184]: I0312 17:11:32.810153 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf7acbc2-7b1e-4156-b9d8-c4172585c2e1-scripts\") pod \"cf7acbc2-7b1e-4156-b9d8-c4172585c2e1\" (UID: \"cf7acbc2-7b1e-4156-b9d8-c4172585c2e1\") " Mar 12 17:11:32 crc kubenswrapper[5184]: I0312 17:11:32.810327 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf7acbc2-7b1e-4156-b9d8-c4172585c2e1-combined-ca-bundle\") pod \"cf7acbc2-7b1e-4156-b9d8-c4172585c2e1\" (UID: \"cf7acbc2-7b1e-4156-b9d8-c4172585c2e1\") " Mar 12 17:11:32 crc kubenswrapper[5184]: I0312 17:11:32.819193 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf7acbc2-7b1e-4156-b9d8-c4172585c2e1-kube-api-access-s95kh" (OuterVolumeSpecName: "kube-api-access-s95kh") pod "cf7acbc2-7b1e-4156-b9d8-c4172585c2e1" (UID: "cf7acbc2-7b1e-4156-b9d8-c4172585c2e1"). InnerVolumeSpecName "kube-api-access-s95kh". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:11:32 crc kubenswrapper[5184]: I0312 17:11:32.828766 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf7acbc2-7b1e-4156-b9d8-c4172585c2e1-scripts" (OuterVolumeSpecName: "scripts") pod "cf7acbc2-7b1e-4156-b9d8-c4172585c2e1" (UID: "cf7acbc2-7b1e-4156-b9d8-c4172585c2e1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:11:32 crc kubenswrapper[5184]: I0312 17:11:32.848688 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf7acbc2-7b1e-4156-b9d8-c4172585c2e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf7acbc2-7b1e-4156-b9d8-c4172585c2e1" (UID: "cf7acbc2-7b1e-4156-b9d8-c4172585c2e1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:11:32 crc kubenswrapper[5184]: I0312 17:11:32.860771 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf7acbc2-7b1e-4156-b9d8-c4172585c2e1-config-data" (OuterVolumeSpecName: "config-data") pod "cf7acbc2-7b1e-4156-b9d8-c4172585c2e1" (UID: "cf7acbc2-7b1e-4156-b9d8-c4172585c2e1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:11:32 crc kubenswrapper[5184]: I0312 17:11:32.913391 5184 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf7acbc2-7b1e-4156-b9d8-c4172585c2e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 17:11:32 crc kubenswrapper[5184]: I0312 17:11:32.913434 5184 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf7acbc2-7b1e-4156-b9d8-c4172585c2e1-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 17:11:32 crc kubenswrapper[5184]: I0312 17:11:32.913448 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s95kh\" (UniqueName: \"kubernetes.io/projected/cf7acbc2-7b1e-4156-b9d8-c4172585c2e1-kube-api-access-s95kh\") on node \"crc\" DevicePath \"\"" Mar 12 17:11:32 crc kubenswrapper[5184]: I0312 17:11:32.913462 5184 reconciler_common.go:299] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf7acbc2-7b1e-4156-b9d8-c4172585c2e1-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 17:11:33 crc kubenswrapper[5184]: I0312 17:11:33.361190 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-t8d4k"
Mar 12 17:11:33 crc kubenswrapper[5184]: I0312 17:11:33.361991 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-t8d4k" event={"ID":"cf7acbc2-7b1e-4156-b9d8-c4172585c2e1","Type":"ContainerDied","Data":"b5309aeeb66ff7db769893e7994bcae76a55a2882ac9183cbb7310085154d274"}
Mar 12 17:11:33 crc kubenswrapper[5184]: I0312 17:11:33.365813 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5309aeeb66ff7db769893e7994bcae76a55a2882ac9183cbb7310085154d274"
Mar 12 17:11:33 crc kubenswrapper[5184]: I0312 17:11:33.548520 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 12 17:11:33 crc kubenswrapper[5184]: I0312 17:11:33.549542 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="75715e9f-e72b-4b1c-9b69-870f1766ba64" containerName="nova-api-log" containerID="cri-o://db13cca6877658091d981a44208d6715ba2e2e4644ccb3f33aaac650ccb32d39" gracePeriod=30
Mar 12 17:11:33 crc kubenswrapper[5184]: I0312 17:11:33.550087 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="75715e9f-e72b-4b1c-9b69-870f1766ba64" containerName="nova-api-api" containerID="cri-o://f01cc0ac68f9e0487524b117b0f9210c98435e61dcf1c57e3836cc55d831c5d4" gracePeriod=30
Mar 12 17:11:33 crc kubenswrapper[5184]: I0312 17:11:33.583712 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 12 17:11:33 crc kubenswrapper[5184]: I0312 17:11:33.583923 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="ff1db6e5-9786-448c-bb54-82eb3ea089a6" containerName="nova-scheduler-scheduler" containerID="cri-o://25a26e94050e8d02eb0980ea642e3c7bb251529c36792798ffa5eff13d5843b6" gracePeriod=30
Mar 12 17:11:33 crc kubenswrapper[5184]: I0312 17:11:33.643462 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 12 17:11:33 crc kubenswrapper[5184]: I0312 17:11:33.643718 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3a0286e4-4295-4526-bcf3-f003b4766dec" containerName="nova-metadata-log" containerID="cri-o://1c6a9151a5b688ce8b550810af6ac9fc7b0c5c4db4296075e66e5b1d2d8e708e" gracePeriod=30
Mar 12 17:11:33 crc kubenswrapper[5184]: I0312 17:11:33.643857 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3a0286e4-4295-4526-bcf3-f003b4766dec" containerName="nova-metadata-metadata" containerID="cri-o://137f63f99f6dd3dcd9f68b5f5f59922c09a041048adb62de5102f87b91d6c680" gracePeriod=30
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.192665 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.241609 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75715e9f-e72b-4b1c-9b69-870f1766ba64-logs\") pod \"75715e9f-e72b-4b1c-9b69-870f1766ba64\" (UID: \"75715e9f-e72b-4b1c-9b69-870f1766ba64\") "
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.241649 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75715e9f-e72b-4b1c-9b69-870f1766ba64-config-data\") pod \"75715e9f-e72b-4b1c-9b69-870f1766ba64\" (UID: \"75715e9f-e72b-4b1c-9b69-870f1766ba64\") "
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.241735 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9pzj\" (UniqueName: \"kubernetes.io/projected/75715e9f-e72b-4b1c-9b69-870f1766ba64-kube-api-access-f9pzj\") pod \"75715e9f-e72b-4b1c-9b69-870f1766ba64\" (UID: \"75715e9f-e72b-4b1c-9b69-870f1766ba64\") "
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.241806 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/75715e9f-e72b-4b1c-9b69-870f1766ba64-public-tls-certs\") pod \"75715e9f-e72b-4b1c-9b69-870f1766ba64\" (UID: \"75715e9f-e72b-4b1c-9b69-870f1766ba64\") "
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.242221 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75715e9f-e72b-4b1c-9b69-870f1766ba64-logs" (OuterVolumeSpecName: "logs") pod "75715e9f-e72b-4b1c-9b69-870f1766ba64" (UID: "75715e9f-e72b-4b1c-9b69-870f1766ba64"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.242634 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75715e9f-e72b-4b1c-9b69-870f1766ba64-combined-ca-bundle\") pod \"75715e9f-e72b-4b1c-9b69-870f1766ba64\" (UID: \"75715e9f-e72b-4b1c-9b69-870f1766ba64\") "
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.242781 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/75715e9f-e72b-4b1c-9b69-870f1766ba64-internal-tls-certs\") pod \"75715e9f-e72b-4b1c-9b69-870f1766ba64\" (UID: \"75715e9f-e72b-4b1c-9b69-870f1766ba64\") "
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.243852 5184 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75715e9f-e72b-4b1c-9b69-870f1766ba64-logs\") on node \"crc\" DevicePath \"\""
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.257612 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75715e9f-e72b-4b1c-9b69-870f1766ba64-kube-api-access-f9pzj" (OuterVolumeSpecName: "kube-api-access-f9pzj") pod "75715e9f-e72b-4b1c-9b69-870f1766ba64" (UID: "75715e9f-e72b-4b1c-9b69-870f1766ba64"). InnerVolumeSpecName "kube-api-access-f9pzj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.287278 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75715e9f-e72b-4b1c-9b69-870f1766ba64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "75715e9f-e72b-4b1c-9b69-870f1766ba64" (UID: "75715e9f-e72b-4b1c-9b69-870f1766ba64"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.290529 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75715e9f-e72b-4b1c-9b69-870f1766ba64-config-data" (OuterVolumeSpecName: "config-data") pod "75715e9f-e72b-4b1c-9b69-870f1766ba64" (UID: "75715e9f-e72b-4b1c-9b69-870f1766ba64"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.311592 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75715e9f-e72b-4b1c-9b69-870f1766ba64-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "75715e9f-e72b-4b1c-9b69-870f1766ba64" (UID: "75715e9f-e72b-4b1c-9b69-870f1766ba64"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.345033 5184 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75715e9f-e72b-4b1c-9b69-870f1766ba64-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.345065 5184 reconciler_common.go:299] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/75715e9f-e72b-4b1c-9b69-870f1766ba64-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.345075 5184 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75715e9f-e72b-4b1c-9b69-870f1766ba64-config-data\") on node \"crc\" DevicePath \"\""
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.345283 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f9pzj\" (UniqueName: \"kubernetes.io/projected/75715e9f-e72b-4b1c-9b69-870f1766ba64-kube-api-access-f9pzj\") on node \"crc\" DevicePath \"\""
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.359773 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75715e9f-e72b-4b1c-9b69-870f1766ba64-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "75715e9f-e72b-4b1c-9b69-870f1766ba64" (UID: "75715e9f-e72b-4b1c-9b69-870f1766ba64"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.377103 5184 generic.go:358] "Generic (PLEG): container finished" podID="3a0286e4-4295-4526-bcf3-f003b4766dec" containerID="1c6a9151a5b688ce8b550810af6ac9fc7b0c5c4db4296075e66e5b1d2d8e708e" exitCode=143
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.377274 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3a0286e4-4295-4526-bcf3-f003b4766dec","Type":"ContainerDied","Data":"1c6a9151a5b688ce8b550810af6ac9fc7b0c5c4db4296075e66e5b1d2d8e708e"}
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.379878 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"75715e9f-e72b-4b1c-9b69-870f1766ba64","Type":"ContainerDied","Data":"f01cc0ac68f9e0487524b117b0f9210c98435e61dcf1c57e3836cc55d831c5d4"}
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.379899 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.379938 5184 scope.go:117] "RemoveContainer" containerID="f01cc0ac68f9e0487524b117b0f9210c98435e61dcf1c57e3836cc55d831c5d4"
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.383837 5184 generic.go:358] "Generic (PLEG): container finished" podID="75715e9f-e72b-4b1c-9b69-870f1766ba64" containerID="f01cc0ac68f9e0487524b117b0f9210c98435e61dcf1c57e3836cc55d831c5d4" exitCode=0
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.383871 5184 generic.go:358] "Generic (PLEG): container finished" podID="75715e9f-e72b-4b1c-9b69-870f1766ba64" containerID="db13cca6877658091d981a44208d6715ba2e2e4644ccb3f33aaac650ccb32d39" exitCode=143
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.384844 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"75715e9f-e72b-4b1c-9b69-870f1766ba64","Type":"ContainerDied","Data":"db13cca6877658091d981a44208d6715ba2e2e4644ccb3f33aaac650ccb32d39"}
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.384987 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"75715e9f-e72b-4b1c-9b69-870f1766ba64","Type":"ContainerDied","Data":"66fdc047827a83332755c431e201c0dee0d6cf6ff590e79e41dedd4b2eda64e0"}
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.394180 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"45b25746-82f1-4bdf-8246-2f8ef1514dba","Type":"ContainerStarted","Data":"862a54026a89804018f17f5936230716f22d805ce4eb310ae291cc77eb000ce8"}
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.394262 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/ceilometer-0"
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.409008 5184 scope.go:117] "RemoveContainer" containerID="db13cca6877658091d981a44208d6715ba2e2e4644ccb3f33aaac650ccb32d39"
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.421259 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.175001641 podStartE2EDuration="6.421241825s" podCreationTimestamp="2026-03-12 17:11:28 +0000 UTC" firstStartedPulling="2026-03-12 17:11:29.227834218 +0000 UTC m=+1231.769145557" lastFinishedPulling="2026-03-12 17:11:33.474074402 +0000 UTC m=+1236.015385741" observedRunningTime="2026-03-12 17:11:34.415474446 +0000 UTC m=+1236.956785785" watchObservedRunningTime="2026-03-12 17:11:34.421241825 +0000 UTC m=+1236.962553164"
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.450019 5184 reconciler_common.go:299] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/75715e9f-e72b-4b1c-9b69-870f1766ba64-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.450692 5184 scope.go:117] "RemoveContainer" containerID="f01cc0ac68f9e0487524b117b0f9210c98435e61dcf1c57e3836cc55d831c5d4"
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.454097 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 12 17:11:34 crc kubenswrapper[5184]: E0312 17:11:34.454788 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f01cc0ac68f9e0487524b117b0f9210c98435e61dcf1c57e3836cc55d831c5d4\": container with ID starting with f01cc0ac68f9e0487524b117b0f9210c98435e61dcf1c57e3836cc55d831c5d4 not found: ID does not exist" containerID="f01cc0ac68f9e0487524b117b0f9210c98435e61dcf1c57e3836cc55d831c5d4"
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.454831 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f01cc0ac68f9e0487524b117b0f9210c98435e61dcf1c57e3836cc55d831c5d4"} err="failed to get container status \"f01cc0ac68f9e0487524b117b0f9210c98435e61dcf1c57e3836cc55d831c5d4\": rpc error: code = NotFound desc = could not find container \"f01cc0ac68f9e0487524b117b0f9210c98435e61dcf1c57e3836cc55d831c5d4\": container with ID starting with f01cc0ac68f9e0487524b117b0f9210c98435e61dcf1c57e3836cc55d831c5d4 not found: ID does not exist"
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.454857 5184 scope.go:117] "RemoveContainer" containerID="db13cca6877658091d981a44208d6715ba2e2e4644ccb3f33aaac650ccb32d39"
Mar 12 17:11:34 crc kubenswrapper[5184]: E0312 17:11:34.457680 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db13cca6877658091d981a44208d6715ba2e2e4644ccb3f33aaac650ccb32d39\": container with ID starting with db13cca6877658091d981a44208d6715ba2e2e4644ccb3f33aaac650ccb32d39 not found: ID does not exist" containerID="db13cca6877658091d981a44208d6715ba2e2e4644ccb3f33aaac650ccb32d39"
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.457723 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db13cca6877658091d981a44208d6715ba2e2e4644ccb3f33aaac650ccb32d39"} err="failed to get container status \"db13cca6877658091d981a44208d6715ba2e2e4644ccb3f33aaac650ccb32d39\": rpc error: code = NotFound desc = could not find container \"db13cca6877658091d981a44208d6715ba2e2e4644ccb3f33aaac650ccb32d39\": container with ID starting with db13cca6877658091d981a44208d6715ba2e2e4644ccb3f33aaac650ccb32d39 not found: ID does not exist"
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.457749 5184 scope.go:117] "RemoveContainer" containerID="f01cc0ac68f9e0487524b117b0f9210c98435e61dcf1c57e3836cc55d831c5d4"
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.458130 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f01cc0ac68f9e0487524b117b0f9210c98435e61dcf1c57e3836cc55d831c5d4"} err="failed to get container status \"f01cc0ac68f9e0487524b117b0f9210c98435e61dcf1c57e3836cc55d831c5d4\": rpc error: code = NotFound desc = could not find container \"f01cc0ac68f9e0487524b117b0f9210c98435e61dcf1c57e3836cc55d831c5d4\": container with ID starting with f01cc0ac68f9e0487524b117b0f9210c98435e61dcf1c57e3836cc55d831c5d4 not found: ID does not exist"
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.458154 5184 scope.go:117] "RemoveContainer" containerID="db13cca6877658091d981a44208d6715ba2e2e4644ccb3f33aaac650ccb32d39"
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.458397 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db13cca6877658091d981a44208d6715ba2e2e4644ccb3f33aaac650ccb32d39"} err="failed to get container status \"db13cca6877658091d981a44208d6715ba2e2e4644ccb3f33aaac650ccb32d39\": rpc error: code = NotFound desc = could not find container \"db13cca6877658091d981a44208d6715ba2e2e4644ccb3f33aaac650ccb32d39\": container with ID starting with db13cca6877658091d981a44208d6715ba2e2e4644ccb3f33aaac650ccb32d39 not found: ID does not exist"
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.462337 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.482645 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.483811 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cf7acbc2-7b1e-4156-b9d8-c4172585c2e1" containerName="nova-manage"
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.483831 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf7acbc2-7b1e-4156-b9d8-c4172585c2e1" containerName="nova-manage"
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.483841 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75715e9f-e72b-4b1c-9b69-870f1766ba64" containerName="nova-api-api"
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.483846 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="75715e9f-e72b-4b1c-9b69-870f1766ba64" containerName="nova-api-api"
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.483857 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75715e9f-e72b-4b1c-9b69-870f1766ba64" containerName="nova-api-log"
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.483863 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="75715e9f-e72b-4b1c-9b69-870f1766ba64" containerName="nova-api-log"
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.484079 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="75715e9f-e72b-4b1c-9b69-870f1766ba64" containerName="nova-api-api"
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.484091 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="cf7acbc2-7b1e-4156-b9d8-c4172585c2e1" containerName="nova-manage"
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.484098 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="75715e9f-e72b-4b1c-9b69-870f1766ba64" containerName="nova-api-log"
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.491905 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.492164 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.494524 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-nova-internal-svc\""
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.495181 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-nova-public-svc\""
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.495235 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-api-config-data\""
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.551809 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/56d8fe13-8a18-4a96-bf19-21711cd1d931-internal-tls-certs\") pod \"nova-api-0\" (UID: \"56d8fe13-8a18-4a96-bf19-21711cd1d931\") " pod="openstack/nova-api-0"
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.551885 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56d8fe13-8a18-4a96-bf19-21711cd1d931-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"56d8fe13-8a18-4a96-bf19-21711cd1d931\") " pod="openstack/nova-api-0"
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.552002 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56d8fe13-8a18-4a96-bf19-21711cd1d931-logs\") pod \"nova-api-0\" (UID: \"56d8fe13-8a18-4a96-bf19-21711cd1d931\") " pod="openstack/nova-api-0"
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.552030 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/56d8fe13-8a18-4a96-bf19-21711cd1d931-public-tls-certs\") pod \"nova-api-0\" (UID: \"56d8fe13-8a18-4a96-bf19-21711cd1d931\") " pod="openstack/nova-api-0"
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.552188 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56d8fe13-8a18-4a96-bf19-21711cd1d931-config-data\") pod \"nova-api-0\" (UID: \"56d8fe13-8a18-4a96-bf19-21711cd1d931\") " pod="openstack/nova-api-0"
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.552260 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9jzs\" (UniqueName: \"kubernetes.io/projected/56d8fe13-8a18-4a96-bf19-21711cd1d931-kube-api-access-z9jzs\") pod \"nova-api-0\" (UID: \"56d8fe13-8a18-4a96-bf19-21711cd1d931\") " pod="openstack/nova-api-0"
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.654689 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56d8fe13-8a18-4a96-bf19-21711cd1d931-config-data\") pod \"nova-api-0\" (UID: \"56d8fe13-8a18-4a96-bf19-21711cd1d931\") " pod="openstack/nova-api-0"
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.656054 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z9jzs\" (UniqueName: \"kubernetes.io/projected/56d8fe13-8a18-4a96-bf19-21711cd1d931-kube-api-access-z9jzs\") pod \"nova-api-0\" (UID: \"56d8fe13-8a18-4a96-bf19-21711cd1d931\") " pod="openstack/nova-api-0"
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.656249 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/56d8fe13-8a18-4a96-bf19-21711cd1d931-internal-tls-certs\") pod \"nova-api-0\" (UID: \"56d8fe13-8a18-4a96-bf19-21711cd1d931\") " pod="openstack/nova-api-0"
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.656334 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56d8fe13-8a18-4a96-bf19-21711cd1d931-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"56d8fe13-8a18-4a96-bf19-21711cd1d931\") " pod="openstack/nova-api-0"
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.656557 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56d8fe13-8a18-4a96-bf19-21711cd1d931-logs\") pod \"nova-api-0\" (UID: \"56d8fe13-8a18-4a96-bf19-21711cd1d931\") " pod="openstack/nova-api-0"
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.656646 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/56d8fe13-8a18-4a96-bf19-21711cd1d931-public-tls-certs\") pod \"nova-api-0\" (UID: \"56d8fe13-8a18-4a96-bf19-21711cd1d931\") " pod="openstack/nova-api-0"
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.657134 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56d8fe13-8a18-4a96-bf19-21711cd1d931-logs\") pod \"nova-api-0\" (UID: \"56d8fe13-8a18-4a96-bf19-21711cd1d931\") " pod="openstack/nova-api-0"
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.660658 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56d8fe13-8a18-4a96-bf19-21711cd1d931-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"56d8fe13-8a18-4a96-bf19-21711cd1d931\") " pod="openstack/nova-api-0"
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.660976 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56d8fe13-8a18-4a96-bf19-21711cd1d931-config-data\") pod \"nova-api-0\" (UID: \"56d8fe13-8a18-4a96-bf19-21711cd1d931\") " pod="openstack/nova-api-0"
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.662615 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/56d8fe13-8a18-4a96-bf19-21711cd1d931-internal-tls-certs\") pod \"nova-api-0\" (UID: \"56d8fe13-8a18-4a96-bf19-21711cd1d931\") " pod="openstack/nova-api-0"
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.664016 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/56d8fe13-8a18-4a96-bf19-21711cd1d931-public-tls-certs\") pod \"nova-api-0\" (UID: \"56d8fe13-8a18-4a96-bf19-21711cd1d931\") " pod="openstack/nova-api-0"
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.678236 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9jzs\" (UniqueName: \"kubernetes.io/projected/56d8fe13-8a18-4a96-bf19-21711cd1d931-kube-api-access-z9jzs\") pod \"nova-api-0\" (UID: \"56d8fe13-8a18-4a96-bf19-21711cd1d931\") " pod="openstack/nova-api-0"
Mar 12 17:11:34 crc kubenswrapper[5184]: I0312 17:11:34.809276 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 12 17:11:35 crc kubenswrapper[5184]: I0312 17:11:35.363296 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 12 17:11:35 crc kubenswrapper[5184]: I0312 17:11:35.405779 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"56d8fe13-8a18-4a96-bf19-21711cd1d931","Type":"ContainerStarted","Data":"b852bbcfde6ae381b72cd23841af7caf820050a5aaf6df629d09f466a61ad340"}
Mar 12 17:11:35 crc kubenswrapper[5184]: E0312 17:11:35.889755 5184 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="25a26e94050e8d02eb0980ea642e3c7bb251529c36792798ffa5eff13d5843b6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 12 17:11:35 crc kubenswrapper[5184]: E0312 17:11:35.892545 5184 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="25a26e94050e8d02eb0980ea642e3c7bb251529c36792798ffa5eff13d5843b6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 12 17:11:35 crc kubenswrapper[5184]: E0312 17:11:35.894148 5184 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="25a26e94050e8d02eb0980ea642e3c7bb251529c36792798ffa5eff13d5843b6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 12 17:11:35 crc kubenswrapper[5184]: E0312 17:11:35.894184 5184 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="ff1db6e5-9786-448c-bb54-82eb3ea089a6" containerName="nova-scheduler-scheduler" probeResult="unknown"
Mar 12 17:11:36 crc kubenswrapper[5184]: I0312 17:11:36.424507 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75715e9f-e72b-4b1c-9b69-870f1766ba64" path="/var/lib/kubelet/pods/75715e9f-e72b-4b1c-9b69-870f1766ba64/volumes"
Mar 12 17:11:36 crc kubenswrapper[5184]: I0312 17:11:36.426481 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"56d8fe13-8a18-4a96-bf19-21711cd1d931","Type":"ContainerStarted","Data":"cf98d117210f4aa8c96ee0aca5180110c01ea47f0f68c0f453ba313b94976fbf"}
Mar 12 17:11:36 crc kubenswrapper[5184]: I0312 17:11:36.426530 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"56d8fe13-8a18-4a96-bf19-21711cd1d931","Type":"ContainerStarted","Data":"5866e801230293737c401ec67bf113e4cb67c4a61244cf2b5fe2c857e4d88ea1"}
Mar 12 17:11:36 crc kubenswrapper[5184]: I0312 17:11:36.458873 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.458847939 podStartE2EDuration="2.458847939s" podCreationTimestamp="2026-03-12 17:11:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:11:36.449130686 +0000 UTC m=+1238.990442065" watchObservedRunningTime="2026-03-12 17:11:36.458847939 +0000 UTC m=+1239.000159288"
Mar 12 17:11:36 crc kubenswrapper[5184]: I0312 17:11:36.835022 5184 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="3a0286e4-4295-4526-bcf3-f003b4766dec" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": read tcp 10.217.0.2:56808->10.217.0.199:8775: read: connection reset by peer"
Mar 12 17:11:36 crc kubenswrapper[5184]: I0312 17:11:36.959767 5184 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="3a0286e4-4295-4526-bcf3-f003b4766dec" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": dial tcp 10.217.0.199:8775: connect: connection refused"
Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.347691 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.421059 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a0286e4-4295-4526-bcf3-f003b4766dec-combined-ca-bundle\") pod \"3a0286e4-4295-4526-bcf3-f003b4766dec\" (UID: \"3a0286e4-4295-4526-bcf3-f003b4766dec\") "
Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.421136 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a0286e4-4295-4526-bcf3-f003b4766dec-config-data\") pod \"3a0286e4-4295-4526-bcf3-f003b4766dec\" (UID: \"3a0286e4-4295-4526-bcf3-f003b4766dec\") "
Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.421214 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a0286e4-4295-4526-bcf3-f003b4766dec-logs\") pod \"3a0286e4-4295-4526-bcf3-f003b4766dec\" (UID: \"3a0286e4-4295-4526-bcf3-f003b4766dec\") "
Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.421413 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a0286e4-4295-4526-bcf3-f003b4766dec-nova-metadata-tls-certs\") pod \"3a0286e4-4295-4526-bcf3-f003b4766dec\" (UID: \"3a0286e4-4295-4526-bcf3-f003b4766dec\") "
Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.421605 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqrlw\" (UniqueName: \"kubernetes.io/projected/3a0286e4-4295-4526-bcf3-f003b4766dec-kube-api-access-wqrlw\") pod \"3a0286e4-4295-4526-bcf3-f003b4766dec\" (UID: \"3a0286e4-4295-4526-bcf3-f003b4766dec\") "
Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.422002 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a0286e4-4295-4526-bcf3-f003b4766dec-logs" (OuterVolumeSpecName: "logs") pod "3a0286e4-4295-4526-bcf3-f003b4766dec" (UID: "3a0286e4-4295-4526-bcf3-f003b4766dec"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.430851 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a0286e4-4295-4526-bcf3-f003b4766dec-kube-api-access-wqrlw" (OuterVolumeSpecName: "kube-api-access-wqrlw") pod "3a0286e4-4295-4526-bcf3-f003b4766dec" (UID: "3a0286e4-4295-4526-bcf3-f003b4766dec"). InnerVolumeSpecName "kube-api-access-wqrlw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.446051 5184 generic.go:358] "Generic (PLEG): container finished" podID="3a0286e4-4295-4526-bcf3-f003b4766dec" containerID="137f63f99f6dd3dcd9f68b5f5f59922c09a041048adb62de5102f87b91d6c680" exitCode=0
Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.446548 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.446547 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3a0286e4-4295-4526-bcf3-f003b4766dec","Type":"ContainerDied","Data":"137f63f99f6dd3dcd9f68b5f5f59922c09a041048adb62de5102f87b91d6c680"}
Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.446603 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3a0286e4-4295-4526-bcf3-f003b4766dec","Type":"ContainerDied","Data":"35033e3783a6e00d5059ac920d67e51d4d35d43d624714707d26bd3287fc0254"}
Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.446644 5184 scope.go:117] "RemoveContainer" containerID="137f63f99f6dd3dcd9f68b5f5f59922c09a041048adb62de5102f87b91d6c680"
Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.453854 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a0286e4-4295-4526-bcf3-f003b4766dec-config-data" (OuterVolumeSpecName: "config-data") pod "3a0286e4-4295-4526-bcf3-f003b4766dec" (UID: "3a0286e4-4295-4526-bcf3-f003b4766dec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.459176 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a0286e4-4295-4526-bcf3-f003b4766dec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3a0286e4-4295-4526-bcf3-f003b4766dec" (UID: "3a0286e4-4295-4526-bcf3-f003b4766dec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.511116 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a0286e4-4295-4526-bcf3-f003b4766dec-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "3a0286e4-4295-4526-bcf3-f003b4766dec" (UID: "3a0286e4-4295-4526-bcf3-f003b4766dec"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.525242 5184 reconciler_common.go:299] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a0286e4-4295-4526-bcf3-f003b4766dec-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.525272 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wqrlw\" (UniqueName: \"kubernetes.io/projected/3a0286e4-4295-4526-bcf3-f003b4766dec-kube-api-access-wqrlw\") on node \"crc\" DevicePath \"\""
Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.525281 5184 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a0286e4-4295-4526-bcf3-f003b4766dec-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.525290 5184 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a0286e4-4295-4526-bcf3-f003b4766dec-config-data\") on node \"crc\" DevicePath \"\""
Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.525298 5184 reconciler_common.go:299] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a0286e4-4295-4526-bcf3-f003b4766dec-logs\") on node \"crc\" DevicePath \"\""
Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.541723 5184 scope.go:117] "RemoveContainer" containerID="1c6a9151a5b688ce8b550810af6ac9fc7b0c5c4db4296075e66e5b1d2d8e708e"
Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.573489 5184 scope.go:117] "RemoveContainer" containerID="137f63f99f6dd3dcd9f68b5f5f59922c09a041048adb62de5102f87b91d6c680"
Mar 12 17:11:37 crc kubenswrapper[5184]: E0312 17:11:37.573897 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"137f63f99f6dd3dcd9f68b5f5f59922c09a041048adb62de5102f87b91d6c680\": container with ID starting with 137f63f99f6dd3dcd9f68b5f5f59922c09a041048adb62de5102f87b91d6c680 not found: ID does not exist" containerID="137f63f99f6dd3dcd9f68b5f5f59922c09a041048adb62de5102f87b91d6c680"
Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.573959 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"137f63f99f6dd3dcd9f68b5f5f59922c09a041048adb62de5102f87b91d6c680"} err="failed to get container status \"137f63f99f6dd3dcd9f68b5f5f59922c09a041048adb62de5102f87b91d6c680\": rpc error: code = NotFound desc = could not find container \"137f63f99f6dd3dcd9f68b5f5f59922c09a041048adb62de5102f87b91d6c680\": container with ID starting with 137f63f99f6dd3dcd9f68b5f5f59922c09a041048adb62de5102f87b91d6c680 not found: ID does not exist"
Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.573991 5184 scope.go:117] "RemoveContainer" containerID="1c6a9151a5b688ce8b550810af6ac9fc7b0c5c4db4296075e66e5b1d2d8e708e"
Mar 12 17:11:37 crc kubenswrapper[5184]: E0312 17:11:37.574357 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c6a9151a5b688ce8b550810af6ac9fc7b0c5c4db4296075e66e5b1d2d8e708e\": container with ID starting with 1c6a9151a5b688ce8b550810af6ac9fc7b0c5c4db4296075e66e5b1d2d8e708e not found: ID does not exist" containerID="1c6a9151a5b688ce8b550810af6ac9fc7b0c5c4db4296075e66e5b1d2d8e708e"
Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.574428 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c6a9151a5b688ce8b550810af6ac9fc7b0c5c4db4296075e66e5b1d2d8e708e"} err="failed to get container status \"1c6a9151a5b688ce8b550810af6ac9fc7b0c5c4db4296075e66e5b1d2d8e708e\": rpc error: code = NotFound desc = could not find container \"1c6a9151a5b688ce8b550810af6ac9fc7b0c5c4db4296075e66e5b1d2d8e708e\": container with ID starting with 1c6a9151a5b688ce8b550810af6ac9fc7b0c5c4db4296075e66e5b1d2d8e708e not found: ID does not exist"
Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.795829 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.825004 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.855655 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.857083 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a0286e4-4295-4526-bcf3-f003b4766dec" containerName="nova-metadata-log"
Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.857103 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a0286e4-4295-4526-bcf3-f003b4766dec" containerName="nova-metadata-log"
Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.857142 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a0286e4-4295-4526-bcf3-f003b4766dec" containerName="nova-metadata-metadata"
Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.857148 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a0286e4-4295-4526-bcf3-f003b4766dec" containerName="nova-metadata-metadata"
Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.857322 5184 memory_manager.go:356] "RemoveStaleState removing state"
podUID="3a0286e4-4295-4526-bcf3-f003b4766dec" containerName="nova-metadata-log" Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.857347 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a0286e4-4295-4526-bcf3-f003b4766dec" containerName="nova-metadata-metadata" Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.867651 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.878395 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-nova-metadata-internal-svc\"" Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.878525 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.885259 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-metadata-config-data\"" Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.932950 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8950bf1-dd92-4d86-be29-29f807a65ee1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e8950bf1-dd92-4d86-be29-29f807a65ee1\") " pod="openstack/nova-metadata-0" Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.933098 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8950bf1-dd92-4d86-be29-29f807a65ee1-config-data\") pod \"nova-metadata-0\" (UID: \"e8950bf1-dd92-4d86-be29-29f807a65ee1\") " pod="openstack/nova-metadata-0" Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.933313 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e8950bf1-dd92-4d86-be29-29f807a65ee1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e8950bf1-dd92-4d86-be29-29f807a65ee1\") " pod="openstack/nova-metadata-0" Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.933568 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5txbw\" (UniqueName: \"kubernetes.io/projected/e8950bf1-dd92-4d86-be29-29f807a65ee1-kube-api-access-5txbw\") pod \"nova-metadata-0\" (UID: \"e8950bf1-dd92-4d86-be29-29f807a65ee1\") " pod="openstack/nova-metadata-0" Mar 12 17:11:37 crc kubenswrapper[5184]: I0312 17:11:37.933661 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8950bf1-dd92-4d86-be29-29f807a65ee1-logs\") pod \"nova-metadata-0\" (UID: \"e8950bf1-dd92-4d86-be29-29f807a65ee1\") " pod="openstack/nova-metadata-0" Mar 12 17:11:38 crc kubenswrapper[5184]: E0312 17:11:38.031871 5184 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a0286e4_4295_4526_bcf3_f003b4766dec.slice/crio-35033e3783a6e00d5059ac920d67e51d4d35d43d624714707d26bd3287fc0254\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a0286e4_4295_4526_bcf3_f003b4766dec.slice\": RecentStats: unable to find data in memory cache]" Mar 12 17:11:38 crc kubenswrapper[5184]: I0312 17:11:38.037413 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5txbw\" (UniqueName: \"kubernetes.io/projected/e8950bf1-dd92-4d86-be29-29f807a65ee1-kube-api-access-5txbw\") pod \"nova-metadata-0\" (UID: \"e8950bf1-dd92-4d86-be29-29f807a65ee1\") " pod="openstack/nova-metadata-0" Mar 12 17:11:38 crc kubenswrapper[5184]: I0312 17:11:38.037466 5184 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8950bf1-dd92-4d86-be29-29f807a65ee1-logs\") pod \"nova-metadata-0\" (UID: \"e8950bf1-dd92-4d86-be29-29f807a65ee1\") " pod="openstack/nova-metadata-0" Mar 12 17:11:38 crc kubenswrapper[5184]: I0312 17:11:38.037519 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8950bf1-dd92-4d86-be29-29f807a65ee1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e8950bf1-dd92-4d86-be29-29f807a65ee1\") " pod="openstack/nova-metadata-0" Mar 12 17:11:38 crc kubenswrapper[5184]: I0312 17:11:38.037549 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8950bf1-dd92-4d86-be29-29f807a65ee1-config-data\") pod \"nova-metadata-0\" (UID: \"e8950bf1-dd92-4d86-be29-29f807a65ee1\") " pod="openstack/nova-metadata-0" Mar 12 17:11:38 crc kubenswrapper[5184]: I0312 17:11:38.037757 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8950bf1-dd92-4d86-be29-29f807a65ee1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e8950bf1-dd92-4d86-be29-29f807a65ee1\") " pod="openstack/nova-metadata-0" Mar 12 17:11:38 crc kubenswrapper[5184]: I0312 17:11:38.038009 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8950bf1-dd92-4d86-be29-29f807a65ee1-logs\") pod \"nova-metadata-0\" (UID: \"e8950bf1-dd92-4d86-be29-29f807a65ee1\") " pod="openstack/nova-metadata-0" Mar 12 17:11:38 crc kubenswrapper[5184]: I0312 17:11:38.051289 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8950bf1-dd92-4d86-be29-29f807a65ee1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"e8950bf1-dd92-4d86-be29-29f807a65ee1\") " pod="openstack/nova-metadata-0" Mar 12 17:11:38 crc kubenswrapper[5184]: I0312 17:11:38.051661 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8950bf1-dd92-4d86-be29-29f807a65ee1-config-data\") pod \"nova-metadata-0\" (UID: \"e8950bf1-dd92-4d86-be29-29f807a65ee1\") " pod="openstack/nova-metadata-0" Mar 12 17:11:38 crc kubenswrapper[5184]: I0312 17:11:38.051824 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8950bf1-dd92-4d86-be29-29f807a65ee1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e8950bf1-dd92-4d86-be29-29f807a65ee1\") " pod="openstack/nova-metadata-0" Mar 12 17:11:38 crc kubenswrapper[5184]: I0312 17:11:38.057514 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5txbw\" (UniqueName: \"kubernetes.io/projected/e8950bf1-dd92-4d86-be29-29f807a65ee1-kube-api-access-5txbw\") pod \"nova-metadata-0\" (UID: \"e8950bf1-dd92-4d86-be29-29f807a65ee1\") " pod="openstack/nova-metadata-0" Mar 12 17:11:38 crc kubenswrapper[5184]: I0312 17:11:38.190878 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 17:11:38 crc kubenswrapper[5184]: I0312 17:11:38.412969 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a0286e4-4295-4526-bcf3-f003b4766dec" path="/var/lib/kubelet/pods/3a0286e4-4295-4526-bcf3-f003b4766dec/volumes" Mar 12 17:11:38 crc kubenswrapper[5184]: I0312 17:11:38.678554 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 17:11:39 crc kubenswrapper[5184]: I0312 17:11:39.094497 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 12 17:11:39 crc kubenswrapper[5184]: I0312 17:11:39.160420 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lf57\" (UniqueName: \"kubernetes.io/projected/ff1db6e5-9786-448c-bb54-82eb3ea089a6-kube-api-access-5lf57\") pod \"ff1db6e5-9786-448c-bb54-82eb3ea089a6\" (UID: \"ff1db6e5-9786-448c-bb54-82eb3ea089a6\") " Mar 12 17:11:39 crc kubenswrapper[5184]: I0312 17:11:39.160852 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff1db6e5-9786-448c-bb54-82eb3ea089a6-combined-ca-bundle\") pod \"ff1db6e5-9786-448c-bb54-82eb3ea089a6\" (UID: \"ff1db6e5-9786-448c-bb54-82eb3ea089a6\") " Mar 12 17:11:39 crc kubenswrapper[5184]: I0312 17:11:39.161109 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff1db6e5-9786-448c-bb54-82eb3ea089a6-config-data\") pod \"ff1db6e5-9786-448c-bb54-82eb3ea089a6\" (UID: \"ff1db6e5-9786-448c-bb54-82eb3ea089a6\") " Mar 12 17:11:39 crc kubenswrapper[5184]: I0312 17:11:39.167733 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff1db6e5-9786-448c-bb54-82eb3ea089a6-kube-api-access-5lf57" (OuterVolumeSpecName: "kube-api-access-5lf57") pod "ff1db6e5-9786-448c-bb54-82eb3ea089a6" (UID: "ff1db6e5-9786-448c-bb54-82eb3ea089a6"). InnerVolumeSpecName "kube-api-access-5lf57". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:11:39 crc kubenswrapper[5184]: I0312 17:11:39.216052 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff1db6e5-9786-448c-bb54-82eb3ea089a6-config-data" (OuterVolumeSpecName: "config-data") pod "ff1db6e5-9786-448c-bb54-82eb3ea089a6" (UID: "ff1db6e5-9786-448c-bb54-82eb3ea089a6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:11:39 crc kubenswrapper[5184]: I0312 17:11:39.216840 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff1db6e5-9786-448c-bb54-82eb3ea089a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff1db6e5-9786-448c-bb54-82eb3ea089a6" (UID: "ff1db6e5-9786-448c-bb54-82eb3ea089a6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:11:39 crc kubenswrapper[5184]: I0312 17:11:39.262929 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5lf57\" (UniqueName: \"kubernetes.io/projected/ff1db6e5-9786-448c-bb54-82eb3ea089a6-kube-api-access-5lf57\") on node \"crc\" DevicePath \"\"" Mar 12 17:11:39 crc kubenswrapper[5184]: I0312 17:11:39.262962 5184 reconciler_common.go:299] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff1db6e5-9786-448c-bb54-82eb3ea089a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 17:11:39 crc kubenswrapper[5184]: I0312 17:11:39.262971 5184 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff1db6e5-9786-448c-bb54-82eb3ea089a6-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 17:11:39 crc kubenswrapper[5184]: I0312 17:11:39.467065 5184 generic.go:358] "Generic (PLEG): container finished" podID="ff1db6e5-9786-448c-bb54-82eb3ea089a6" containerID="25a26e94050e8d02eb0980ea642e3c7bb251529c36792798ffa5eff13d5843b6" exitCode=0 Mar 12 17:11:39 crc kubenswrapper[5184]: I0312 17:11:39.467106 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ff1db6e5-9786-448c-bb54-82eb3ea089a6","Type":"ContainerDied","Data":"25a26e94050e8d02eb0980ea642e3c7bb251529c36792798ffa5eff13d5843b6"} Mar 12 17:11:39 crc kubenswrapper[5184]: I0312 17:11:39.467153 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-scheduler-0" event={"ID":"ff1db6e5-9786-448c-bb54-82eb3ea089a6","Type":"ContainerDied","Data":"2297395eca0112918c9a4fc740dc0fd23ef62316ed60d953e911185a76858943"} Mar 12 17:11:39 crc kubenswrapper[5184]: I0312 17:11:39.467170 5184 scope.go:117] "RemoveContainer" containerID="25a26e94050e8d02eb0980ea642e3c7bb251529c36792798ffa5eff13d5843b6" Mar 12 17:11:39 crc kubenswrapper[5184]: I0312 17:11:39.467132 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 12 17:11:39 crc kubenswrapper[5184]: I0312 17:11:39.469441 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e8950bf1-dd92-4d86-be29-29f807a65ee1","Type":"ContainerStarted","Data":"f71ccb6c90820b90c7b39f773b7b74684d2de848a74581b7b638e3339ef15c96"} Mar 12 17:11:39 crc kubenswrapper[5184]: I0312 17:11:39.469476 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e8950bf1-dd92-4d86-be29-29f807a65ee1","Type":"ContainerStarted","Data":"25dc33fbbb4e573af66bba612d783cf05468849ebc00f324f38fab39ce9f0fb2"} Mar 12 17:11:39 crc kubenswrapper[5184]: I0312 17:11:39.469488 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e8950bf1-dd92-4d86-be29-29f807a65ee1","Type":"ContainerStarted","Data":"693bee18c05e305dde5b4b2094accea250e5167a6429e44adc369f432e139112"} Mar 12 17:11:39 crc kubenswrapper[5184]: I0312 17:11:39.493161 5184 scope.go:117] "RemoveContainer" containerID="25a26e94050e8d02eb0980ea642e3c7bb251529c36792798ffa5eff13d5843b6" Mar 12 17:11:39 crc kubenswrapper[5184]: E0312 17:11:39.493692 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25a26e94050e8d02eb0980ea642e3c7bb251529c36792798ffa5eff13d5843b6\": container with ID starting with 25a26e94050e8d02eb0980ea642e3c7bb251529c36792798ffa5eff13d5843b6 not found: ID does 
not exist" containerID="25a26e94050e8d02eb0980ea642e3c7bb251529c36792798ffa5eff13d5843b6" Mar 12 17:11:39 crc kubenswrapper[5184]: I0312 17:11:39.493733 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25a26e94050e8d02eb0980ea642e3c7bb251529c36792798ffa5eff13d5843b6"} err="failed to get container status \"25a26e94050e8d02eb0980ea642e3c7bb251529c36792798ffa5eff13d5843b6\": rpc error: code = NotFound desc = could not find container \"25a26e94050e8d02eb0980ea642e3c7bb251529c36792798ffa5eff13d5843b6\": container with ID starting with 25a26e94050e8d02eb0980ea642e3c7bb251529c36792798ffa5eff13d5843b6 not found: ID does not exist" Mar 12 17:11:39 crc kubenswrapper[5184]: I0312 17:11:39.497766 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.497741447 podStartE2EDuration="2.497741447s" podCreationTimestamp="2026-03-12 17:11:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:11:39.490709999 +0000 UTC m=+1242.032021338" watchObservedRunningTime="2026-03-12 17:11:39.497741447 +0000 UTC m=+1242.039052786" Mar 12 17:11:39 crc kubenswrapper[5184]: I0312 17:11:39.511696 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 17:11:39 crc kubenswrapper[5184]: I0312 17:11:39.534445 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 17:11:39 crc kubenswrapper[5184]: I0312 17:11:39.553975 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 17:11:39 crc kubenswrapper[5184]: I0312 17:11:39.555248 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ff1db6e5-9786-448c-bb54-82eb3ea089a6" containerName="nova-scheduler-scheduler" Mar 12 17:11:39 crc kubenswrapper[5184]: I0312 17:11:39.555272 5184 
state_mem.go:107] "Deleted CPUSet assignment" podUID="ff1db6e5-9786-448c-bb54-82eb3ea089a6" containerName="nova-scheduler-scheduler" Mar 12 17:11:39 crc kubenswrapper[5184]: I0312 17:11:39.555500 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="ff1db6e5-9786-448c-bb54-82eb3ea089a6" containerName="nova-scheduler-scheduler" Mar 12 17:11:39 crc kubenswrapper[5184]: I0312 17:11:39.568986 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 17:11:39 crc kubenswrapper[5184]: I0312 17:11:39.569124 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 12 17:11:39 crc kubenswrapper[5184]: I0312 17:11:39.570994 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"nova-scheduler-config-data\"" Mar 12 17:11:39 crc kubenswrapper[5184]: I0312 17:11:39.714591 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dbb540e-524b-4e3c-b2b0-b7019085f4ae-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7dbb540e-524b-4e3c-b2b0-b7019085f4ae\") " pod="openstack/nova-scheduler-0" Mar 12 17:11:39 crc kubenswrapper[5184]: I0312 17:11:39.714803 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fwsh\" (UniqueName: \"kubernetes.io/projected/7dbb540e-524b-4e3c-b2b0-b7019085f4ae-kube-api-access-2fwsh\") pod \"nova-scheduler-0\" (UID: \"7dbb540e-524b-4e3c-b2b0-b7019085f4ae\") " pod="openstack/nova-scheduler-0" Mar 12 17:11:39 crc kubenswrapper[5184]: I0312 17:11:39.714873 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dbb540e-524b-4e3c-b2b0-b7019085f4ae-config-data\") pod \"nova-scheduler-0\" (UID: \"7dbb540e-524b-4e3c-b2b0-b7019085f4ae\") " 
pod="openstack/nova-scheduler-0" Mar 12 17:11:39 crc kubenswrapper[5184]: I0312 17:11:39.817400 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dbb540e-524b-4e3c-b2b0-b7019085f4ae-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7dbb540e-524b-4e3c-b2b0-b7019085f4ae\") " pod="openstack/nova-scheduler-0" Mar 12 17:11:39 crc kubenswrapper[5184]: I0312 17:11:39.817487 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2fwsh\" (UniqueName: \"kubernetes.io/projected/7dbb540e-524b-4e3c-b2b0-b7019085f4ae-kube-api-access-2fwsh\") pod \"nova-scheduler-0\" (UID: \"7dbb540e-524b-4e3c-b2b0-b7019085f4ae\") " pod="openstack/nova-scheduler-0" Mar 12 17:11:39 crc kubenswrapper[5184]: I0312 17:11:39.817647 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dbb540e-524b-4e3c-b2b0-b7019085f4ae-config-data\") pod \"nova-scheduler-0\" (UID: \"7dbb540e-524b-4e3c-b2b0-b7019085f4ae\") " pod="openstack/nova-scheduler-0" Mar 12 17:11:39 crc kubenswrapper[5184]: I0312 17:11:39.821853 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dbb540e-524b-4e3c-b2b0-b7019085f4ae-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7dbb540e-524b-4e3c-b2b0-b7019085f4ae\") " pod="openstack/nova-scheduler-0" Mar 12 17:11:39 crc kubenswrapper[5184]: I0312 17:11:39.823521 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dbb540e-524b-4e3c-b2b0-b7019085f4ae-config-data\") pod \"nova-scheduler-0\" (UID: \"7dbb540e-524b-4e3c-b2b0-b7019085f4ae\") " pod="openstack/nova-scheduler-0" Mar 12 17:11:39 crc kubenswrapper[5184]: I0312 17:11:39.833847 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-2fwsh\" (UniqueName: \"kubernetes.io/projected/7dbb540e-524b-4e3c-b2b0-b7019085f4ae-kube-api-access-2fwsh\") pod \"nova-scheduler-0\" (UID: \"7dbb540e-524b-4e3c-b2b0-b7019085f4ae\") " pod="openstack/nova-scheduler-0" Mar 12 17:11:39 crc kubenswrapper[5184]: I0312 17:11:39.924326 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 12 17:11:40 crc kubenswrapper[5184]: I0312 17:11:40.212161 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 17:11:40 crc kubenswrapper[5184]: W0312 17:11:40.217316 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7dbb540e_524b_4e3c_b2b0_b7019085f4ae.slice/crio-c228fac5dab50567e9807fddca8ec435c300702c7fee60401815aa65f721006d WatchSource:0}: Error finding container c228fac5dab50567e9807fddca8ec435c300702c7fee60401815aa65f721006d: Status 404 returned error can't find the container with id c228fac5dab50567e9807fddca8ec435c300702c7fee60401815aa65f721006d Mar 12 17:11:40 crc kubenswrapper[5184]: I0312 17:11:40.414799 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff1db6e5-9786-448c-bb54-82eb3ea089a6" path="/var/lib/kubelet/pods/ff1db6e5-9786-448c-bb54-82eb3ea089a6/volumes" Mar 12 17:11:40 crc kubenswrapper[5184]: I0312 17:11:40.484989 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7dbb540e-524b-4e3c-b2b0-b7019085f4ae","Type":"ContainerStarted","Data":"c228fac5dab50567e9807fddca8ec435c300702c7fee60401815aa65f721006d"} Mar 12 17:11:41 crc kubenswrapper[5184]: I0312 17:11:41.500641 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7dbb540e-524b-4e3c-b2b0-b7019085f4ae","Type":"ContainerStarted","Data":"8124b8172cd914054a7e80a804a62d054e9d0c470b39b81750ec30b2e2939ac2"} Mar 12 17:11:41 crc kubenswrapper[5184]: 
I0312 17:11:41.525588 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.525462212 podStartE2EDuration="2.525462212s" podCreationTimestamp="2026-03-12 17:11:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:11:41.514757479 +0000 UTC m=+1244.056068848" watchObservedRunningTime="2026-03-12 17:11:41.525462212 +0000 UTC m=+1244.066773591" Mar 12 17:11:43 crc kubenswrapper[5184]: I0312 17:11:43.191071 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-metadata-0" Mar 12 17:11:43 crc kubenswrapper[5184]: I0312 17:11:43.191573 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-metadata-0" Mar 12 17:11:44 crc kubenswrapper[5184]: I0312 17:11:44.810268 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 12 17:11:44 crc kubenswrapper[5184]: I0312 17:11:44.810353 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 12 17:11:44 crc kubenswrapper[5184]: I0312 17:11:44.925170 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-scheduler-0" Mar 12 17:11:45 crc kubenswrapper[5184]: I0312 17:11:45.819538 5184 prober.go:120] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="56d8fe13-8a18-4a96-bf19-21711cd1d931" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.209:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 12 17:11:45 crc kubenswrapper[5184]: I0312 17:11:45.820654 5184 prober.go:120] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="56d8fe13-8a18-4a96-bf19-21711cd1d931" containerName="nova-api-api" probeResult="failure" output="Get 
\"https://10.217.0.209:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 12 17:11:48 crc kubenswrapper[5184]: I0312 17:11:48.191011 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 12 17:11:48 crc kubenswrapper[5184]: I0312 17:11:48.191801 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 12 17:11:49 crc kubenswrapper[5184]: E0312 17:11:49.115370 5184 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1784569 actualBytes=10240 Mar 12 17:11:49 crc kubenswrapper[5184]: I0312 17:11:49.196594 5184 prober.go:120] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e8950bf1-dd92-4d86-be29-29f807a65ee1" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 12 17:11:49 crc kubenswrapper[5184]: I0312 17:11:49.197579 5184 prober.go:120] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e8950bf1-dd92-4d86-be29-29f807a65ee1" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 12 17:11:49 crc kubenswrapper[5184]: I0312 17:11:49.924804 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 12 17:11:49 crc kubenswrapper[5184]: I0312 17:11:49.964832 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 12 17:11:50 crc kubenswrapper[5184]: I0312 17:11:50.664615 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 12 17:11:54 crc kubenswrapper[5184]: I0312 17:11:54.822511 5184 kubelet.go:2658] "SyncLoop 
(probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 12 17:11:54 crc kubenswrapper[5184]: I0312 17:11:54.823172 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 12 17:11:54 crc kubenswrapper[5184]: I0312 17:11:54.824544 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-api-0"
Mar 12 17:11:54 crc kubenswrapper[5184]: I0312 17:11:54.824778 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/nova-api-0"
Mar 12 17:11:54 crc kubenswrapper[5184]: I0312 17:11:54.838845 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 12 17:11:54 crc kubenswrapper[5184]: I0312 17:11:54.840096 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 12 17:11:58 crc kubenswrapper[5184]: I0312 17:11:58.198072 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 12 17:11:58 crc kubenswrapper[5184]: I0312 17:11:58.200932 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 12 17:11:58 crc kubenswrapper[5184]: I0312 17:11:58.205977 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 12 17:11:58 crc kubenswrapper[5184]: I0312 17:11:58.710573 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 12 17:12:00 crc kubenswrapper[5184]: I0312 17:12:00.142163 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555592-s9qm5"]
Mar 12 17:12:00 crc kubenswrapper[5184]: I0312 17:12:00.148980 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555592-s9qm5"
Mar 12 17:12:00 crc kubenswrapper[5184]: I0312 17:12:00.150596 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555592-s9qm5"]
Mar 12 17:12:00 crc kubenswrapper[5184]: I0312 17:12:00.151578 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\""
Mar 12 17:12:00 crc kubenswrapper[5184]: I0312 17:12:00.151823 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-f4gpz\""
Mar 12 17:12:00 crc kubenswrapper[5184]: I0312 17:12:00.151832 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\""
Mar 12 17:12:00 crc kubenswrapper[5184]: I0312 17:12:00.300289 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dmqp\" (UniqueName: \"kubernetes.io/projected/464abcc7-9e49-4fff-8dbd-c4ce18f54bb8-kube-api-access-2dmqp\") pod \"auto-csr-approver-29555592-s9qm5\" (UID: \"464abcc7-9e49-4fff-8dbd-c4ce18f54bb8\") " pod="openshift-infra/auto-csr-approver-29555592-s9qm5"
Mar 12 17:12:00 crc kubenswrapper[5184]: I0312 17:12:00.402153 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2dmqp\" (UniqueName: \"kubernetes.io/projected/464abcc7-9e49-4fff-8dbd-c4ce18f54bb8-kube-api-access-2dmqp\") pod \"auto-csr-approver-29555592-s9qm5\" (UID: \"464abcc7-9e49-4fff-8dbd-c4ce18f54bb8\") " pod="openshift-infra/auto-csr-approver-29555592-s9qm5"
Mar 12 17:12:00 crc kubenswrapper[5184]: I0312 17:12:00.426219 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dmqp\" (UniqueName: \"kubernetes.io/projected/464abcc7-9e49-4fff-8dbd-c4ce18f54bb8-kube-api-access-2dmqp\") pod \"auto-csr-approver-29555592-s9qm5\" (UID: \"464abcc7-9e49-4fff-8dbd-c4ce18f54bb8\") " pod="openshift-infra/auto-csr-approver-29555592-s9qm5"
Mar 12 17:12:00 crc kubenswrapper[5184]: I0312 17:12:00.472959 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555592-s9qm5"
Mar 12 17:12:01 crc kubenswrapper[5184]: I0312 17:12:01.009731 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555592-s9qm5"]
Mar 12 17:12:01 crc kubenswrapper[5184]: I0312 17:12:01.010046 5184 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 12 17:12:01 crc kubenswrapper[5184]: I0312 17:12:01.747301 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555592-s9qm5" event={"ID":"464abcc7-9e49-4fff-8dbd-c4ce18f54bb8","Type":"ContainerStarted","Data":"60b4c1e11c1fbb10b452397fe56209d1e818ea4c29d05af4ab2660248db12c43"}
Mar 12 17:12:02 crc kubenswrapper[5184]: I0312 17:12:02.762095 5184 generic.go:358] "Generic (PLEG): container finished" podID="464abcc7-9e49-4fff-8dbd-c4ce18f54bb8" containerID="3ee33983972038dd5e00575a6b54a7c0d08ba50e2c5488fb40a85a71c2bb8087" exitCode=0
Mar 12 17:12:02 crc kubenswrapper[5184]: I0312 17:12:02.762210 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555592-s9qm5" event={"ID":"464abcc7-9e49-4fff-8dbd-c4ce18f54bb8","Type":"ContainerDied","Data":"3ee33983972038dd5e00575a6b54a7c0d08ba50e2c5488fb40a85a71c2bb8087"}
Mar 12 17:12:04 crc kubenswrapper[5184]: I0312 17:12:04.163867 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555592-s9qm5"
Mar 12 17:12:04 crc kubenswrapper[5184]: I0312 17:12:04.188593 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dmqp\" (UniqueName: \"kubernetes.io/projected/464abcc7-9e49-4fff-8dbd-c4ce18f54bb8-kube-api-access-2dmqp\") pod \"464abcc7-9e49-4fff-8dbd-c4ce18f54bb8\" (UID: \"464abcc7-9e49-4fff-8dbd-c4ce18f54bb8\") "
Mar 12 17:12:04 crc kubenswrapper[5184]: I0312 17:12:04.195688 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/464abcc7-9e49-4fff-8dbd-c4ce18f54bb8-kube-api-access-2dmqp" (OuterVolumeSpecName: "kube-api-access-2dmqp") pod "464abcc7-9e49-4fff-8dbd-c4ce18f54bb8" (UID: "464abcc7-9e49-4fff-8dbd-c4ce18f54bb8"). InnerVolumeSpecName "kube-api-access-2dmqp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:12:04 crc kubenswrapper[5184]: I0312 17:12:04.291045 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2dmqp\" (UniqueName: \"kubernetes.io/projected/464abcc7-9e49-4fff-8dbd-c4ce18f54bb8-kube-api-access-2dmqp\") on node \"crc\" DevicePath \"\""
Mar 12 17:12:04 crc kubenswrapper[5184]: I0312 17:12:04.796571 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555592-s9qm5"
Mar 12 17:12:04 crc kubenswrapper[5184]: I0312 17:12:04.796657 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555592-s9qm5" event={"ID":"464abcc7-9e49-4fff-8dbd-c4ce18f54bb8","Type":"ContainerDied","Data":"60b4c1e11c1fbb10b452397fe56209d1e818ea4c29d05af4ab2660248db12c43"}
Mar 12 17:12:04 crc kubenswrapper[5184]: I0312 17:12:04.796713 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60b4c1e11c1fbb10b452397fe56209d1e818ea4c29d05af4ab2660248db12c43"
Mar 12 17:12:05 crc kubenswrapper[5184]: I0312 17:12:05.249881 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555586-7mzjp"]
Mar 12 17:12:05 crc kubenswrapper[5184]: I0312 17:12:05.264913 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555586-7mzjp"]
Mar 12 17:12:05 crc kubenswrapper[5184]: I0312 17:12:05.420334 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 12 17:12:06 crc kubenswrapper[5184]: I0312 17:12:06.414172 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b98b237-d23d-424a-a15c-ec59149aba58" path="/var/lib/kubelet/pods/3b98b237-d23d-424a-a15c-ec59149aba58/volumes"
Mar 12 17:12:15 crc kubenswrapper[5184]: I0312 17:12:15.047569 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 12 17:12:16 crc kubenswrapper[5184]: I0312 17:12:16.552258 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 12 17:12:19 crc kubenswrapper[5184]: I0312 17:12:19.087348 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="53e57ab8-13e6-4505-a905-412d3ef88083" containerName="rabbitmq" containerID="cri-o://de01d0d469b98b9178ec21d18644b15d0164e473f615c62b06003d6aa2162097" gracePeriod=604796
Mar 12 17:12:20 crc kubenswrapper[5184]: I0312 17:12:20.131184 5184 prober.go:120] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="53e57ab8-13e6-4505-a905-412d3ef88083" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused"
Mar 12 17:12:20 crc kubenswrapper[5184]: I0312 17:12:20.182811 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="56b9c26f-b490-4262-9c35-63ee5734c634" containerName="rabbitmq" containerID="cri-o://1322a3a257cb0f3349d10a1e09e96651c84653799e5afe1e417245be1b848bcf" gracePeriod=604797
Mar 12 17:12:20 crc kubenswrapper[5184]: I0312 17:12:20.742330 5184 patch_prober.go:28] interesting pod/machine-config-daemon-cp7pt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 12 17:12:20 crc kubenswrapper[5184]: I0312 17:12:20.742849 5184 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 12 17:12:25 crc kubenswrapper[5184]: I0312 17:12:25.718210 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 12 17:12:25 crc kubenswrapper[5184]: I0312 17:12:25.852692 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/53e57ab8-13e6-4505-a905-412d3ef88083-rabbitmq-tls\") pod \"53e57ab8-13e6-4505-a905-412d3ef88083\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") "
Mar 12 17:12:25 crc kubenswrapper[5184]: I0312 17:12:25.852745 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53e57ab8-13e6-4505-a905-412d3ef88083-config-data\") pod \"53e57ab8-13e6-4505-a905-412d3ef88083\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") "
Mar 12 17:12:25 crc kubenswrapper[5184]: I0312 17:12:25.852797 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"53e57ab8-13e6-4505-a905-412d3ef88083\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") "
Mar 12 17:12:25 crc kubenswrapper[5184]: I0312 17:12:25.852868 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/53e57ab8-13e6-4505-a905-412d3ef88083-rabbitmq-erlang-cookie\") pod \"53e57ab8-13e6-4505-a905-412d3ef88083\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") "
Mar 12 17:12:25 crc kubenswrapper[5184]: I0312 17:12:25.852902 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/53e57ab8-13e6-4505-a905-412d3ef88083-rabbitmq-confd\") pod \"53e57ab8-13e6-4505-a905-412d3ef88083\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") "
Mar 12 17:12:25 crc kubenswrapper[5184]: I0312 17:12:25.852975 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/53e57ab8-13e6-4505-a905-412d3ef88083-erlang-cookie-secret\") pod \"53e57ab8-13e6-4505-a905-412d3ef88083\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") "
Mar 12 17:12:25 crc kubenswrapper[5184]: I0312 17:12:25.853000 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/53e57ab8-13e6-4505-a905-412d3ef88083-rabbitmq-plugins\") pod \"53e57ab8-13e6-4505-a905-412d3ef88083\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") "
Mar 12 17:12:25 crc kubenswrapper[5184]: I0312 17:12:25.853184 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/53e57ab8-13e6-4505-a905-412d3ef88083-server-conf\") pod \"53e57ab8-13e6-4505-a905-412d3ef88083\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") "
Mar 12 17:12:25 crc kubenswrapper[5184]: I0312 17:12:25.853215 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/53e57ab8-13e6-4505-a905-412d3ef88083-pod-info\") pod \"53e57ab8-13e6-4505-a905-412d3ef88083\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") "
Mar 12 17:12:25 crc kubenswrapper[5184]: I0312 17:12:25.853241 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/53e57ab8-13e6-4505-a905-412d3ef88083-plugins-conf\") pod \"53e57ab8-13e6-4505-a905-412d3ef88083\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") "
Mar 12 17:12:25 crc kubenswrapper[5184]: I0312 17:12:25.853256 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r45ps\" (UniqueName: \"kubernetes.io/projected/53e57ab8-13e6-4505-a905-412d3ef88083-kube-api-access-r45ps\") pod \"53e57ab8-13e6-4505-a905-412d3ef88083\" (UID: \"53e57ab8-13e6-4505-a905-412d3ef88083\") "
Mar 12 17:12:25 crc kubenswrapper[5184]: I0312 17:12:25.853277 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53e57ab8-13e6-4505-a905-412d3ef88083-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "53e57ab8-13e6-4505-a905-412d3ef88083" (UID: "53e57ab8-13e6-4505-a905-412d3ef88083"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 12 17:12:25 crc kubenswrapper[5184]: I0312 17:12:25.854032 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53e57ab8-13e6-4505-a905-412d3ef88083-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "53e57ab8-13e6-4505-a905-412d3ef88083" (UID: "53e57ab8-13e6-4505-a905-412d3ef88083"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 12 17:12:25 crc kubenswrapper[5184]: I0312 17:12:25.854229 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53e57ab8-13e6-4505-a905-412d3ef88083-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "53e57ab8-13e6-4505-a905-412d3ef88083" (UID: "53e57ab8-13e6-4505-a905-412d3ef88083"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 17:12:25 crc kubenswrapper[5184]: I0312 17:12:25.855758 5184 reconciler_common.go:299] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/53e57ab8-13e6-4505-a905-412d3ef88083-plugins-conf\") on node \"crc\" DevicePath \"\""
Mar 12 17:12:25 crc kubenswrapper[5184]: I0312 17:12:25.855939 5184 reconciler_common.go:299] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/53e57ab8-13e6-4505-a905-412d3ef88083-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Mar 12 17:12:25 crc kubenswrapper[5184]: I0312 17:12:25.855949 5184 reconciler_common.go:299] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/53e57ab8-13e6-4505-a905-412d3ef88083-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Mar 12 17:12:25 crc kubenswrapper[5184]: I0312 17:12:25.859132 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53e57ab8-13e6-4505-a905-412d3ef88083-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "53e57ab8-13e6-4505-a905-412d3ef88083" (UID: "53e57ab8-13e6-4505-a905-412d3ef88083"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 17:12:25 crc kubenswrapper[5184]: I0312 17:12:25.860297 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53e57ab8-13e6-4505-a905-412d3ef88083-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "53e57ab8-13e6-4505-a905-412d3ef88083" (UID: "53e57ab8-13e6-4505-a905-412d3ef88083"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:12:25 crc kubenswrapper[5184]: I0312 17:12:25.862255 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/53e57ab8-13e6-4505-a905-412d3ef88083-pod-info" (OuterVolumeSpecName: "pod-info") pod "53e57ab8-13e6-4505-a905-412d3ef88083" (UID: "53e57ab8-13e6-4505-a905-412d3ef88083"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGIDValue ""
Mar 12 17:12:25 crc kubenswrapper[5184]: I0312 17:12:25.862361 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "53e57ab8-13e6-4505-a905-412d3ef88083" (UID: "53e57ab8-13e6-4505-a905-412d3ef88083"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGIDValue ""
Mar 12 17:12:25 crc kubenswrapper[5184]: I0312 17:12:25.871475 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53e57ab8-13e6-4505-a905-412d3ef88083-kube-api-access-r45ps" (OuterVolumeSpecName: "kube-api-access-r45ps") pod "53e57ab8-13e6-4505-a905-412d3ef88083" (UID: "53e57ab8-13e6-4505-a905-412d3ef88083"). InnerVolumeSpecName "kube-api-access-r45ps". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:12:25 crc kubenswrapper[5184]: I0312 17:12:25.902822 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53e57ab8-13e6-4505-a905-412d3ef88083-config-data" (OuterVolumeSpecName: "config-data") pod "53e57ab8-13e6-4505-a905-412d3ef88083" (UID: "53e57ab8-13e6-4505-a905-412d3ef88083"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 17:12:25 crc kubenswrapper[5184]: I0312 17:12:25.933067 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53e57ab8-13e6-4505-a905-412d3ef88083-server-conf" (OuterVolumeSpecName: "server-conf") pod "53e57ab8-13e6-4505-a905-412d3ef88083" (UID: "53e57ab8-13e6-4505-a905-412d3ef88083"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 17:12:25 crc kubenswrapper[5184]: I0312 17:12:25.957634 5184 reconciler_common.go:299] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/53e57ab8-13e6-4505-a905-412d3ef88083-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Mar 12 17:12:25 crc kubenswrapper[5184]: I0312 17:12:25.957878 5184 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53e57ab8-13e6-4505-a905-412d3ef88083-config-data\") on node \"crc\" DevicePath \"\""
Mar 12 17:12:25 crc kubenswrapper[5184]: I0312 17:12:25.957998 5184 reconciler_common.go:292] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" "
Mar 12 17:12:25 crc kubenswrapper[5184]: I0312 17:12:25.958135 5184 reconciler_common.go:299] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/53e57ab8-13e6-4505-a905-412d3ef88083-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Mar 12 17:12:25 crc kubenswrapper[5184]: I0312 17:12:25.958243 5184 reconciler_common.go:299] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/53e57ab8-13e6-4505-a905-412d3ef88083-server-conf\") on node \"crc\" DevicePath \"\""
Mar 12 17:12:25 crc kubenswrapper[5184]: I0312 17:12:25.958327 5184 reconciler_common.go:299] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/53e57ab8-13e6-4505-a905-412d3ef88083-pod-info\") on node \"crc\" DevicePath \"\""
Mar 12 17:12:25 crc kubenswrapper[5184]: I0312 17:12:25.958441 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r45ps\" (UniqueName: \"kubernetes.io/projected/53e57ab8-13e6-4505-a905-412d3ef88083-kube-api-access-r45ps\") on node \"crc\" DevicePath \"\""
Mar 12 17:12:25 crc kubenswrapper[5184]: I0312 17:12:25.979006 5184 operation_generator.go:895] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc"
Mar 12 17:12:25 crc kubenswrapper[5184]: I0312 17:12:25.995251 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53e57ab8-13e6-4505-a905-412d3ef88083-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "53e57ab8-13e6-4505-a905-412d3ef88083" (UID: "53e57ab8-13e6-4505-a905-412d3ef88083"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.031793 5184 generic.go:358] "Generic (PLEG): container finished" podID="53e57ab8-13e6-4505-a905-412d3ef88083" containerID="de01d0d469b98b9178ec21d18644b15d0164e473f615c62b06003d6aa2162097" exitCode=0
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.032092 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"53e57ab8-13e6-4505-a905-412d3ef88083","Type":"ContainerDied","Data":"de01d0d469b98b9178ec21d18644b15d0164e473f615c62b06003d6aa2162097"}
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.032129 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"53e57ab8-13e6-4505-a905-412d3ef88083","Type":"ContainerDied","Data":"9bc34545e49aea34055d9d8a2ea9a93f9d8639bf6be30ec3df37f1c3666980f9"}
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.032146 5184 scope.go:117] "RemoveContainer" containerID="de01d0d469b98b9178ec21d18644b15d0164e473f615c62b06003d6aa2162097"
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.032288 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.061014 5184 reconciler_common.go:299] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\""
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.061046 5184 reconciler_common.go:299] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/53e57ab8-13e6-4505-a905-412d3ef88083-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.068713 5184 scope.go:117] "RemoveContainer" containerID="1c500261ee65047d0f5f54d5420bdfc27224456b856c0006d4d1e69acb2ed464"
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.078246 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.089113 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.108882 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.110137 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="464abcc7-9e49-4fff-8dbd-c4ce18f54bb8" containerName="oc"
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.110257 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="464abcc7-9e49-4fff-8dbd-c4ce18f54bb8" containerName="oc"
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.110338 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="53e57ab8-13e6-4505-a905-412d3ef88083" containerName="setup-container"
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.110423 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="53e57ab8-13e6-4505-a905-412d3ef88083" containerName="setup-container"
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.111415 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="53e57ab8-13e6-4505-a905-412d3ef88083" containerName="rabbitmq"
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.111497 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="53e57ab8-13e6-4505-a905-412d3ef88083" containerName="rabbitmq"
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.111767 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="53e57ab8-13e6-4505-a905-412d3ef88083" containerName="rabbitmq"
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.111842 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="464abcc7-9e49-4fff-8dbd-c4ce18f54bb8" containerName="oc"
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.116721 5184 scope.go:117] "RemoveContainer" containerID="de01d0d469b98b9178ec21d18644b15d0164e473f615c62b06003d6aa2162097"
Mar 12 17:12:26 crc kubenswrapper[5184]: E0312 17:12:26.117255 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de01d0d469b98b9178ec21d18644b15d0164e473f615c62b06003d6aa2162097\": container with ID starting with de01d0d469b98b9178ec21d18644b15d0164e473f615c62b06003d6aa2162097 not found: ID does not exist" containerID="de01d0d469b98b9178ec21d18644b15d0164e473f615c62b06003d6aa2162097"
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.117310 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de01d0d469b98b9178ec21d18644b15d0164e473f615c62b06003d6aa2162097"} err="failed to get container status \"de01d0d469b98b9178ec21d18644b15d0164e473f615c62b06003d6aa2162097\": rpc error: code = NotFound desc = could not find container \"de01d0d469b98b9178ec21d18644b15d0164e473f615c62b06003d6aa2162097\": container with ID starting with de01d0d469b98b9178ec21d18644b15d0164e473f615c62b06003d6aa2162097 not found: ID does not exist"
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.117344 5184 scope.go:117] "RemoveContainer" containerID="1c500261ee65047d0f5f54d5420bdfc27224456b856c0006d4d1e69acb2ed464"
Mar 12 17:12:26 crc kubenswrapper[5184]: E0312 17:12:26.117825 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c500261ee65047d0f5f54d5420bdfc27224456b856c0006d4d1e69acb2ed464\": container with ID starting with 1c500261ee65047d0f5f54d5420bdfc27224456b856c0006d4d1e69acb2ed464 not found: ID does not exist" containerID="1c500261ee65047d0f5f54d5420bdfc27224456b856c0006d4d1e69acb2ed464"
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.117854 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c500261ee65047d0f5f54d5420bdfc27224456b856c0006d4d1e69acb2ed464"} err="failed to get container status \"1c500261ee65047d0f5f54d5420bdfc27224456b856c0006d4d1e69acb2ed464\": rpc error: code = NotFound desc = could not find container \"1c500261ee65047d0f5f54d5420bdfc27224456b856c0006d4d1e69acb2ed464\": container with ID starting with 1c500261ee65047d0f5f54d5420bdfc27224456b856c0006d4d1e69acb2ed464 not found: ID does not exist"
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.124641 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.127497 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"rabbitmq-erlang-cookie\""
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.127619 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"rabbitmq-server-conf\""
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.127719 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"rabbitmq-config-data\""
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.127804 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"rabbitmq-plugins-conf\""
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.127959 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"rabbitmq-server-dockercfg-7fzrs\""
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.128215 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"rabbitmq-default-user\""
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.129409 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-rabbitmq-svc\""
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.131106 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.264964 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/474ecd3e-3438-4cf1-953e-115dcbc40119-config-data\") pod \"rabbitmq-server-0\" (UID: \"474ecd3e-3438-4cf1-953e-115dcbc40119\") " pod="openstack/rabbitmq-server-0"
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.265041 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/474ecd3e-3438-4cf1-953e-115dcbc40119-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"474ecd3e-3438-4cf1-953e-115dcbc40119\") " pod="openstack/rabbitmq-server-0"
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.265105 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"474ecd3e-3438-4cf1-953e-115dcbc40119\") " pod="openstack/rabbitmq-server-0"
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.265134 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/474ecd3e-3438-4cf1-953e-115dcbc40119-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"474ecd3e-3438-4cf1-953e-115dcbc40119\") " pod="openstack/rabbitmq-server-0"
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.265399 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/474ecd3e-3438-4cf1-953e-115dcbc40119-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"474ecd3e-3438-4cf1-953e-115dcbc40119\") " pod="openstack/rabbitmq-server-0"
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.265441 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/474ecd3e-3438-4cf1-953e-115dcbc40119-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"474ecd3e-3438-4cf1-953e-115dcbc40119\") " pod="openstack/rabbitmq-server-0"
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.265458 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/474ecd3e-3438-4cf1-953e-115dcbc40119-server-conf\") pod \"rabbitmq-server-0\" (UID: \"474ecd3e-3438-4cf1-953e-115dcbc40119\") " pod="openstack/rabbitmq-server-0"
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.265508 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/474ecd3e-3438-4cf1-953e-115dcbc40119-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"474ecd3e-3438-4cf1-953e-115dcbc40119\") " pod="openstack/rabbitmq-server-0"
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.265627 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/474ecd3e-3438-4cf1-953e-115dcbc40119-pod-info\") pod \"rabbitmq-server-0\" (UID: \"474ecd3e-3438-4cf1-953e-115dcbc40119\") " pod="openstack/rabbitmq-server-0"
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.265672 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlljh\" (UniqueName: \"kubernetes.io/projected/474ecd3e-3438-4cf1-953e-115dcbc40119-kube-api-access-wlljh\") pod \"rabbitmq-server-0\" (UID: \"474ecd3e-3438-4cf1-953e-115dcbc40119\") " pod="openstack/rabbitmq-server-0"
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.265735 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/474ecd3e-3438-4cf1-953e-115dcbc40119-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"474ecd3e-3438-4cf1-953e-115dcbc40119\") " pod="openstack/rabbitmq-server-0"
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.367332 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/474ecd3e-3438-4cf1-953e-115dcbc40119-config-data\") pod \"rabbitmq-server-0\" (UID: \"474ecd3e-3438-4cf1-953e-115dcbc40119\") " pod="openstack/rabbitmq-server-0"
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.367437 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/474ecd3e-3438-4cf1-953e-115dcbc40119-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"474ecd3e-3438-4cf1-953e-115dcbc40119\") " pod="openstack/rabbitmq-server-0"
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.367468 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"474ecd3e-3438-4cf1-953e-115dcbc40119\") " pod="openstack/rabbitmq-server-0"
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.367486 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/474ecd3e-3438-4cf1-953e-115dcbc40119-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"474ecd3e-3438-4cf1-953e-115dcbc40119\") " pod="openstack/rabbitmq-server-0"
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.367585 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/474ecd3e-3438-4cf1-953e-115dcbc40119-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"474ecd3e-3438-4cf1-953e-115dcbc40119\") " pod="openstack/rabbitmq-server-0"
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.367613 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/474ecd3e-3438-4cf1-953e-115dcbc40119-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"474ecd3e-3438-4cf1-953e-115dcbc40119\") " pod="openstack/rabbitmq-server-0"
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.367629 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/474ecd3e-3438-4cf1-953e-115dcbc40119-server-conf\") pod \"rabbitmq-server-0\" (UID: \"474ecd3e-3438-4cf1-953e-115dcbc40119\") " pod="openstack/rabbitmq-server-0"
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.367652 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/474ecd3e-3438-4cf1-953e-115dcbc40119-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"474ecd3e-3438-4cf1-953e-115dcbc40119\") " pod="openstack/rabbitmq-server-0"
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.367689 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/474ecd3e-3438-4cf1-953e-115dcbc40119-pod-info\") pod \"rabbitmq-server-0\" (UID: \"474ecd3e-3438-4cf1-953e-115dcbc40119\") " pod="openstack/rabbitmq-server-0"
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.367715 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wlljh\" (UniqueName: \"kubernetes.io/projected/474ecd3e-3438-4cf1-953e-115dcbc40119-kube-api-access-wlljh\") pod \"rabbitmq-server-0\" (UID: \"474ecd3e-3438-4cf1-953e-115dcbc40119\") " pod="openstack/rabbitmq-server-0"
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.367750 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/474ecd3e-3438-4cf1-953e-115dcbc40119-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"474ecd3e-3438-4cf1-953e-115dcbc40119\") " pod="openstack/rabbitmq-server-0"
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.368225 5184 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"474ecd3e-3438-4cf1-953e-115dcbc40119\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0"
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.368333 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/474ecd3e-3438-4cf1-953e-115dcbc40119-config-data\") pod \"rabbitmq-server-0\" (UID: \"474ecd3e-3438-4cf1-953e-115dcbc40119\") " pod="openstack/rabbitmq-server-0"
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.368798 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/474ecd3e-3438-4cf1-953e-115dcbc40119-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"474ecd3e-3438-4cf1-953e-115dcbc40119\") " pod="openstack/rabbitmq-server-0"
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.369140 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/474ecd3e-3438-4cf1-953e-115dcbc40119-server-conf\") pod \"rabbitmq-server-0\" (UID: \"474ecd3e-3438-4cf1-953e-115dcbc40119\") " pod="openstack/rabbitmq-server-0"
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.370038 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/474ecd3e-3438-4cf1-953e-115dcbc40119-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"474ecd3e-3438-4cf1-953e-115dcbc40119\") " pod="openstack/rabbitmq-server-0"
Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.370462 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/474ecd3e-3438-4cf1-953e-115dcbc40119-rabbitmq-erlang-cookie\") pod
\"rabbitmq-server-0\" (UID: \"474ecd3e-3438-4cf1-953e-115dcbc40119\") " pod="openstack/rabbitmq-server-0" Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.372324 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/474ecd3e-3438-4cf1-953e-115dcbc40119-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"474ecd3e-3438-4cf1-953e-115dcbc40119\") " pod="openstack/rabbitmq-server-0" Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.373869 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/474ecd3e-3438-4cf1-953e-115dcbc40119-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"474ecd3e-3438-4cf1-953e-115dcbc40119\") " pod="openstack/rabbitmq-server-0" Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.374120 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/474ecd3e-3438-4cf1-953e-115dcbc40119-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"474ecd3e-3438-4cf1-953e-115dcbc40119\") " pod="openstack/rabbitmq-server-0" Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.374139 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/474ecd3e-3438-4cf1-953e-115dcbc40119-pod-info\") pod \"rabbitmq-server-0\" (UID: \"474ecd3e-3438-4cf1-953e-115dcbc40119\") " pod="openstack/rabbitmq-server-0" Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.385952 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlljh\" (UniqueName: \"kubernetes.io/projected/474ecd3e-3438-4cf1-953e-115dcbc40119-kube-api-access-wlljh\") pod \"rabbitmq-server-0\" (UID: \"474ecd3e-3438-4cf1-953e-115dcbc40119\") " pod="openstack/rabbitmq-server-0" Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.408603 5184 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"474ecd3e-3438-4cf1-953e-115dcbc40119\") " pod="openstack/rabbitmq-server-0" Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.418064 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53e57ab8-13e6-4505-a905-412d3ef88083" path="/var/lib/kubelet/pods/53e57ab8-13e6-4505-a905-412d3ef88083/volumes" Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.455041 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.832117 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.980945 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwghs\" (UniqueName: \"kubernetes.io/projected/56b9c26f-b490-4262-9c35-63ee5734c634-kube-api-access-vwghs\") pod \"56b9c26f-b490-4262-9c35-63ee5734c634\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.980997 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/56b9c26f-b490-4262-9c35-63ee5734c634-server-conf\") pod \"56b9c26f-b490-4262-9c35-63ee5734c634\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.981055 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/56b9c26f-b490-4262-9c35-63ee5734c634-erlang-cookie-secret\") pod \"56b9c26f-b490-4262-9c35-63ee5734c634\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " Mar 12 17:12:26 crc kubenswrapper[5184]: 
I0312 17:12:26.981184 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/56b9c26f-b490-4262-9c35-63ee5734c634-rabbitmq-confd\") pod \"56b9c26f-b490-4262-9c35-63ee5734c634\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.981216 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/56b9c26f-b490-4262-9c35-63ee5734c634-pod-info\") pod \"56b9c26f-b490-4262-9c35-63ee5734c634\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.981406 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56b9c26f-b490-4262-9c35-63ee5734c634-config-data\") pod \"56b9c26f-b490-4262-9c35-63ee5734c634\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.981536 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"56b9c26f-b490-4262-9c35-63ee5734c634\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.981582 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/56b9c26f-b490-4262-9c35-63ee5734c634-rabbitmq-erlang-cookie\") pod \"56b9c26f-b490-4262-9c35-63ee5734c634\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.981628 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/56b9c26f-b490-4262-9c35-63ee5734c634-plugins-conf\") pod \"56b9c26f-b490-4262-9c35-63ee5734c634\" 
(UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.981697 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/56b9c26f-b490-4262-9c35-63ee5734c634-rabbitmq-tls\") pod \"56b9c26f-b490-4262-9c35-63ee5734c634\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.981753 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/56b9c26f-b490-4262-9c35-63ee5734c634-rabbitmq-plugins\") pod \"56b9c26f-b490-4262-9c35-63ee5734c634\" (UID: \"56b9c26f-b490-4262-9c35-63ee5734c634\") " Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.982092 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56b9c26f-b490-4262-9c35-63ee5734c634-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "56b9c26f-b490-4262-9c35-63ee5734c634" (UID: "56b9c26f-b490-4262-9c35-63ee5734c634"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.982345 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56b9c26f-b490-4262-9c35-63ee5734c634-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "56b9c26f-b490-4262-9c35-63ee5734c634" (UID: "56b9c26f-b490-4262-9c35-63ee5734c634"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.982472 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56b9c26f-b490-4262-9c35-63ee5734c634-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "56b9c26f-b490-4262-9c35-63ee5734c634" (UID: "56b9c26f-b490-4262-9c35-63ee5734c634"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.982735 5184 reconciler_common.go:299] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/56b9c26f-b490-4262-9c35-63ee5734c634-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.982757 5184 reconciler_common.go:299] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/56b9c26f-b490-4262-9c35-63ee5734c634-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.982770 5184 reconciler_common.go:299] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/56b9c26f-b490-4262-9c35-63ee5734c634-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.987045 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "56b9c26f-b490-4262-9c35-63ee5734c634" (UID: "56b9c26f-b490-4262-9c35-63ee5734c634"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGIDValue "" Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.987297 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56b9c26f-b490-4262-9c35-63ee5734c634-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "56b9c26f-b490-4262-9c35-63ee5734c634" (UID: "56b9c26f-b490-4262-9c35-63ee5734c634"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.987315 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56b9c26f-b490-4262-9c35-63ee5734c634-kube-api-access-vwghs" (OuterVolumeSpecName: "kube-api-access-vwghs") pod "56b9c26f-b490-4262-9c35-63ee5734c634" (UID: "56b9c26f-b490-4262-9c35-63ee5734c634"). InnerVolumeSpecName "kube-api-access-vwghs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.990234 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/56b9c26f-b490-4262-9c35-63ee5734c634-pod-info" (OuterVolumeSpecName: "pod-info") pod "56b9c26f-b490-4262-9c35-63ee5734c634" (UID: "56b9c26f-b490-4262-9c35-63ee5734c634"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGIDValue "" Mar 12 17:12:26 crc kubenswrapper[5184]: I0312 17:12:26.995189 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.008923 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56b9c26f-b490-4262-9c35-63ee5734c634-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "56b9c26f-b490-4262-9c35-63ee5734c634" (UID: "56b9c26f-b490-4262-9c35-63ee5734c634"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.031391 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56b9c26f-b490-4262-9c35-63ee5734c634-config-data" (OuterVolumeSpecName: "config-data") pod "56b9c26f-b490-4262-9c35-63ee5734c634" (UID: "56b9c26f-b490-4262-9c35-63ee5734c634"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.059813 5184 generic.go:358] "Generic (PLEG): container finished" podID="56b9c26f-b490-4262-9c35-63ee5734c634" containerID="1322a3a257cb0f3349d10a1e09e96651c84653799e5afe1e417245be1b848bcf" exitCode=0 Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.060028 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.060183 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"56b9c26f-b490-4262-9c35-63ee5734c634","Type":"ContainerDied","Data":"1322a3a257cb0f3349d10a1e09e96651c84653799e5afe1e417245be1b848bcf"} Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.060279 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"56b9c26f-b490-4262-9c35-63ee5734c634","Type":"ContainerDied","Data":"2bac33f30ee22f3d50f182c453325bdb186d39faaffa8a93c4fe7f5e2cc8873c"} Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.060349 5184 scope.go:117] "RemoveContainer" containerID="1322a3a257cb0f3349d10a1e09e96651c84653799e5afe1e417245be1b848bcf" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.063180 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56b9c26f-b490-4262-9c35-63ee5734c634-server-conf" (OuterVolumeSpecName: "server-conf") pod 
"56b9c26f-b490-4262-9c35-63ee5734c634" (UID: "56b9c26f-b490-4262-9c35-63ee5734c634"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.076012 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"474ecd3e-3438-4cf1-953e-115dcbc40119","Type":"ContainerStarted","Data":"013e60fbaab0ea96f214d0509e5874f43e5add6a23ec8972e6885f80b4fae9c8"} Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.085547 5184 reconciler_common.go:299] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56b9c26f-b490-4262-9c35-63ee5734c634-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.085590 5184 reconciler_common.go:292] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.085600 5184 reconciler_common.go:299] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/56b9c26f-b490-4262-9c35-63ee5734c634-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.085612 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vwghs\" (UniqueName: \"kubernetes.io/projected/56b9c26f-b490-4262-9c35-63ee5734c634-kube-api-access-vwghs\") on node \"crc\" DevicePath \"\"" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.085620 5184 reconciler_common.go:299] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/56b9c26f-b490-4262-9c35-63ee5734c634-server-conf\") on node \"crc\" DevicePath \"\"" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.085628 5184 reconciler_common.go:299] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/56b9c26f-b490-4262-9c35-63ee5734c634-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.085636 5184 reconciler_common.go:299] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/56b9c26f-b490-4262-9c35-63ee5734c634-pod-info\") on node \"crc\" DevicePath \"\"" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.103223 5184 scope.go:117] "RemoveContainer" containerID="5946e7c77319bc4303becfa27068470c096cf79bfacc2db1f23178951578ef22" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.114131 5184 operation_generator.go:895] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.129067 5184 scope.go:117] "RemoveContainer" containerID="1322a3a257cb0f3349d10a1e09e96651c84653799e5afe1e417245be1b848bcf" Mar 12 17:12:27 crc kubenswrapper[5184]: E0312 17:12:27.129639 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1322a3a257cb0f3349d10a1e09e96651c84653799e5afe1e417245be1b848bcf\": container with ID starting with 1322a3a257cb0f3349d10a1e09e96651c84653799e5afe1e417245be1b848bcf not found: ID does not exist" containerID="1322a3a257cb0f3349d10a1e09e96651c84653799e5afe1e417245be1b848bcf" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.129680 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1322a3a257cb0f3349d10a1e09e96651c84653799e5afe1e417245be1b848bcf"} err="failed to get container status \"1322a3a257cb0f3349d10a1e09e96651c84653799e5afe1e417245be1b848bcf\": rpc error: code = NotFound desc = could not find container \"1322a3a257cb0f3349d10a1e09e96651c84653799e5afe1e417245be1b848bcf\": container with ID starting with 1322a3a257cb0f3349d10a1e09e96651c84653799e5afe1e417245be1b848bcf not 
found: ID does not exist" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.129707 5184 scope.go:117] "RemoveContainer" containerID="5946e7c77319bc4303becfa27068470c096cf79bfacc2db1f23178951578ef22" Mar 12 17:12:27 crc kubenswrapper[5184]: E0312 17:12:27.130007 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5946e7c77319bc4303becfa27068470c096cf79bfacc2db1f23178951578ef22\": container with ID starting with 5946e7c77319bc4303becfa27068470c096cf79bfacc2db1f23178951578ef22 not found: ID does not exist" containerID="5946e7c77319bc4303becfa27068470c096cf79bfacc2db1f23178951578ef22" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.130102 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5946e7c77319bc4303becfa27068470c096cf79bfacc2db1f23178951578ef22"} err="failed to get container status \"5946e7c77319bc4303becfa27068470c096cf79bfacc2db1f23178951578ef22\": rpc error: code = NotFound desc = could not find container \"5946e7c77319bc4303becfa27068470c096cf79bfacc2db1f23178951578ef22\": container with ID starting with 5946e7c77319bc4303becfa27068470c096cf79bfacc2db1f23178951578ef22 not found: ID does not exist" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.168138 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56b9c26f-b490-4262-9c35-63ee5734c634-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "56b9c26f-b490-4262-9c35-63ee5734c634" (UID: "56b9c26f-b490-4262-9c35-63ee5734c634"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.188122 5184 reconciler_common.go:299] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/56b9c26f-b490-4262-9c35-63ee5734c634-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.188512 5184 reconciler_common.go:299] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.462411 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.470543 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.482069 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.483426 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="56b9c26f-b490-4262-9c35-63ee5734c634" containerName="rabbitmq" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.483451 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="56b9c26f-b490-4262-9c35-63ee5734c634" containerName="rabbitmq" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.483478 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="56b9c26f-b490-4262-9c35-63ee5734c634" containerName="setup-container" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.483484 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="56b9c26f-b490-4262-9c35-63ee5734c634" containerName="setup-container" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.483691 5184 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="56b9c26f-b490-4262-9c35-63ee5734c634" containerName="rabbitmq" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.492130 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.494331 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"rabbitmq-cell1-erlang-cookie\"" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.494760 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"rabbitmq-cell1-server-conf\"" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.494831 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"rabbitmq-cell1-plugins-conf\"" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.496924 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"rabbitmq-cell1-config-data\"" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.497140 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"rabbitmq-cell1-default-user\"" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.502339 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"rabbitmq-cell1-server-dockercfg-9pbd8\"" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.503127 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"cert-rabbitmq-cell1-svc\"" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.510017 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.596635 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/4f29a0e3-8a0d-42dd-b7f8-d9123e29035b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f29a0e3-8a0d-42dd-b7f8-d9123e29035b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.596918 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxwgd\" (UniqueName: \"kubernetes.io/projected/4f29a0e3-8a0d-42dd-b7f8-d9123e29035b-kube-api-access-wxwgd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f29a0e3-8a0d-42dd-b7f8-d9123e29035b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.597033 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4f29a0e3-8a0d-42dd-b7f8-d9123e29035b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f29a0e3-8a0d-42dd-b7f8-d9123e29035b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.597126 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4f29a0e3-8a0d-42dd-b7f8-d9123e29035b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f29a0e3-8a0d-42dd-b7f8-d9123e29035b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.597216 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4f29a0e3-8a0d-42dd-b7f8-d9123e29035b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f29a0e3-8a0d-42dd-b7f8-d9123e29035b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.597307 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f29a0e3-8a0d-42dd-b7f8-d9123e29035b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.597415 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4f29a0e3-8a0d-42dd-b7f8-d9123e29035b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f29a0e3-8a0d-42dd-b7f8-d9123e29035b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.597566 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4f29a0e3-8a0d-42dd-b7f8-d9123e29035b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f29a0e3-8a0d-42dd-b7f8-d9123e29035b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.597619 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4f29a0e3-8a0d-42dd-b7f8-d9123e29035b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f29a0e3-8a0d-42dd-b7f8-d9123e29035b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.597661 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4f29a0e3-8a0d-42dd-b7f8-d9123e29035b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f29a0e3-8a0d-42dd-b7f8-d9123e29035b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.597696 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/4f29a0e3-8a0d-42dd-b7f8-d9123e29035b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f29a0e3-8a0d-42dd-b7f8-d9123e29035b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.699490 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f29a0e3-8a0d-42dd-b7f8-d9123e29035b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.699958 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4f29a0e3-8a0d-42dd-b7f8-d9123e29035b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f29a0e3-8a0d-42dd-b7f8-d9123e29035b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.700098 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4f29a0e3-8a0d-42dd-b7f8-d9123e29035b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f29a0e3-8a0d-42dd-b7f8-d9123e29035b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.700181 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4f29a0e3-8a0d-42dd-b7f8-d9123e29035b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f29a0e3-8a0d-42dd-b7f8-d9123e29035b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.700276 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4f29a0e3-8a0d-42dd-b7f8-d9123e29035b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"4f29a0e3-8a0d-42dd-b7f8-d9123e29035b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.700361 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4f29a0e3-8a0d-42dd-b7f8-d9123e29035b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f29a0e3-8a0d-42dd-b7f8-d9123e29035b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.700487 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4f29a0e3-8a0d-42dd-b7f8-d9123e29035b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f29a0e3-8a0d-42dd-b7f8-d9123e29035b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.699705 5184 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f29a0e3-8a0d-42dd-b7f8-d9123e29035b\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.700750 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4f29a0e3-8a0d-42dd-b7f8-d9123e29035b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f29a0e3-8a0d-42dd-b7f8-d9123e29035b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.700978 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4f29a0e3-8a0d-42dd-b7f8-d9123e29035b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f29a0e3-8a0d-42dd-b7f8-d9123e29035b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 
12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.701097 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wxwgd\" (UniqueName: \"kubernetes.io/projected/4f29a0e3-8a0d-42dd-b7f8-d9123e29035b-kube-api-access-wxwgd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f29a0e3-8a0d-42dd-b7f8-d9123e29035b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.701501 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4f29a0e3-8a0d-42dd-b7f8-d9123e29035b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f29a0e3-8a0d-42dd-b7f8-d9123e29035b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.701609 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4f29a0e3-8a0d-42dd-b7f8-d9123e29035b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f29a0e3-8a0d-42dd-b7f8-d9123e29035b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.702020 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4f29a0e3-8a0d-42dd-b7f8-d9123e29035b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f29a0e3-8a0d-42dd-b7f8-d9123e29035b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.701168 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4f29a0e3-8a0d-42dd-b7f8-d9123e29035b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f29a0e3-8a0d-42dd-b7f8-d9123e29035b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.701246 5184 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4f29a0e3-8a0d-42dd-b7f8-d9123e29035b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f29a0e3-8a0d-42dd-b7f8-d9123e29035b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.703202 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4f29a0e3-8a0d-42dd-b7f8-d9123e29035b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f29a0e3-8a0d-42dd-b7f8-d9123e29035b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.705658 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4f29a0e3-8a0d-42dd-b7f8-d9123e29035b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f29a0e3-8a0d-42dd-b7f8-d9123e29035b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.706332 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4f29a0e3-8a0d-42dd-b7f8-d9123e29035b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f29a0e3-8a0d-42dd-b7f8-d9123e29035b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.723217 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4f29a0e3-8a0d-42dd-b7f8-d9123e29035b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f29a0e3-8a0d-42dd-b7f8-d9123e29035b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.731089 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxwgd\" (UniqueName: 
\"kubernetes.io/projected/4f29a0e3-8a0d-42dd-b7f8-d9123e29035b-kube-api-access-wxwgd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f29a0e3-8a0d-42dd-b7f8-d9123e29035b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.739013 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4f29a0e3-8a0d-42dd-b7f8-d9123e29035b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f29a0e3-8a0d-42dd-b7f8-d9123e29035b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.753605 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4f29a0e3-8a0d-42dd-b7f8-d9123e29035b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:12:27 crc kubenswrapper[5184]: I0312 17:12:27.871535 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:12:28 crc kubenswrapper[5184]: I0312 17:12:28.345024 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 12 17:12:28 crc kubenswrapper[5184]: W0312 17:12:28.394152 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f29a0e3_8a0d_42dd_b7f8_d9123e29035b.slice/crio-aba0dd51d15104f7c45bbdf14a6a89de3db8ad587d19fa4a91528bda22de17ec WatchSource:0}: Error finding container aba0dd51d15104f7c45bbdf14a6a89de3db8ad587d19fa4a91528bda22de17ec: Status 404 returned error can't find the container with id aba0dd51d15104f7c45bbdf14a6a89de3db8ad587d19fa4a91528bda22de17ec Mar 12 17:12:28 crc kubenswrapper[5184]: I0312 17:12:28.423304 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56b9c26f-b490-4262-9c35-63ee5734c634" path="/var/lib/kubelet/pods/56b9c26f-b490-4262-9c35-63ee5734c634/volumes" Mar 12 17:12:29 crc kubenswrapper[5184]: I0312 17:12:29.106578 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"474ecd3e-3438-4cf1-953e-115dcbc40119","Type":"ContainerStarted","Data":"c452ca415bf6be7fe6bddd9cc6818bce5a0db052d53ec75d000850a63d76fcea"} Mar 12 17:12:29 crc kubenswrapper[5184]: I0312 17:12:29.109976 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4f29a0e3-8a0d-42dd-b7f8-d9123e29035b","Type":"ContainerStarted","Data":"aba0dd51d15104f7c45bbdf14a6a89de3db8ad587d19fa4a91528bda22de17ec"} Mar 12 17:12:30 crc kubenswrapper[5184]: I0312 17:12:30.073257 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7ccf6665d7-m78ds"] Mar 12 17:12:30 crc kubenswrapper[5184]: I0312 17:12:30.079657 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7ccf6665d7-m78ds" Mar 12 17:12:30 crc kubenswrapper[5184]: I0312 17:12:30.081673 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-edpm-ipam\"" Mar 12 17:12:30 crc kubenswrapper[5184]: I0312 17:12:30.087961 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ccf6665d7-m78ds"] Mar 12 17:12:30 crc kubenswrapper[5184]: I0312 17:12:30.248940 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8a5e6590-fa06-4b8f-8a40-58bb60300a93-dns-swift-storage-0\") pod \"dnsmasq-dns-7ccf6665d7-m78ds\" (UID: \"8a5e6590-fa06-4b8f-8a40-58bb60300a93\") " pod="openstack/dnsmasq-dns-7ccf6665d7-m78ds" Mar 12 17:12:30 crc kubenswrapper[5184]: I0312 17:12:30.248998 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a5e6590-fa06-4b8f-8a40-58bb60300a93-ovsdbserver-sb\") pod \"dnsmasq-dns-7ccf6665d7-m78ds\" (UID: \"8a5e6590-fa06-4b8f-8a40-58bb60300a93\") " pod="openstack/dnsmasq-dns-7ccf6665d7-m78ds" Mar 12 17:12:30 crc kubenswrapper[5184]: I0312 17:12:30.249041 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dxwz\" (UniqueName: \"kubernetes.io/projected/8a5e6590-fa06-4b8f-8a40-58bb60300a93-kube-api-access-6dxwz\") pod \"dnsmasq-dns-7ccf6665d7-m78ds\" (UID: \"8a5e6590-fa06-4b8f-8a40-58bb60300a93\") " pod="openstack/dnsmasq-dns-7ccf6665d7-m78ds" Mar 12 17:12:30 crc kubenswrapper[5184]: I0312 17:12:30.249276 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a5e6590-fa06-4b8f-8a40-58bb60300a93-ovsdbserver-nb\") pod \"dnsmasq-dns-7ccf6665d7-m78ds\" (UID: 
\"8a5e6590-fa06-4b8f-8a40-58bb60300a93\") " pod="openstack/dnsmasq-dns-7ccf6665d7-m78ds" Mar 12 17:12:30 crc kubenswrapper[5184]: I0312 17:12:30.249302 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a5e6590-fa06-4b8f-8a40-58bb60300a93-config\") pod \"dnsmasq-dns-7ccf6665d7-m78ds\" (UID: \"8a5e6590-fa06-4b8f-8a40-58bb60300a93\") " pod="openstack/dnsmasq-dns-7ccf6665d7-m78ds" Mar 12 17:12:30 crc kubenswrapper[5184]: I0312 17:12:30.249417 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8a5e6590-fa06-4b8f-8a40-58bb60300a93-openstack-edpm-ipam\") pod \"dnsmasq-dns-7ccf6665d7-m78ds\" (UID: \"8a5e6590-fa06-4b8f-8a40-58bb60300a93\") " pod="openstack/dnsmasq-dns-7ccf6665d7-m78ds" Mar 12 17:12:30 crc kubenswrapper[5184]: I0312 17:12:30.249445 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a5e6590-fa06-4b8f-8a40-58bb60300a93-dns-svc\") pod \"dnsmasq-dns-7ccf6665d7-m78ds\" (UID: \"8a5e6590-fa06-4b8f-8a40-58bb60300a93\") " pod="openstack/dnsmasq-dns-7ccf6665d7-m78ds" Mar 12 17:12:30 crc kubenswrapper[5184]: I0312 17:12:30.351251 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a5e6590-fa06-4b8f-8a40-58bb60300a93-ovsdbserver-nb\") pod \"dnsmasq-dns-7ccf6665d7-m78ds\" (UID: \"8a5e6590-fa06-4b8f-8a40-58bb60300a93\") " pod="openstack/dnsmasq-dns-7ccf6665d7-m78ds" Mar 12 17:12:30 crc kubenswrapper[5184]: I0312 17:12:30.351295 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a5e6590-fa06-4b8f-8a40-58bb60300a93-config\") pod \"dnsmasq-dns-7ccf6665d7-m78ds\" (UID: 
\"8a5e6590-fa06-4b8f-8a40-58bb60300a93\") " pod="openstack/dnsmasq-dns-7ccf6665d7-m78ds" Mar 12 17:12:30 crc kubenswrapper[5184]: I0312 17:12:30.351349 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8a5e6590-fa06-4b8f-8a40-58bb60300a93-openstack-edpm-ipam\") pod \"dnsmasq-dns-7ccf6665d7-m78ds\" (UID: \"8a5e6590-fa06-4b8f-8a40-58bb60300a93\") " pod="openstack/dnsmasq-dns-7ccf6665d7-m78ds" Mar 12 17:12:30 crc kubenswrapper[5184]: I0312 17:12:30.351367 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a5e6590-fa06-4b8f-8a40-58bb60300a93-dns-svc\") pod \"dnsmasq-dns-7ccf6665d7-m78ds\" (UID: \"8a5e6590-fa06-4b8f-8a40-58bb60300a93\") " pod="openstack/dnsmasq-dns-7ccf6665d7-m78ds" Mar 12 17:12:30 crc kubenswrapper[5184]: I0312 17:12:30.351404 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8a5e6590-fa06-4b8f-8a40-58bb60300a93-dns-swift-storage-0\") pod \"dnsmasq-dns-7ccf6665d7-m78ds\" (UID: \"8a5e6590-fa06-4b8f-8a40-58bb60300a93\") " pod="openstack/dnsmasq-dns-7ccf6665d7-m78ds" Mar 12 17:12:30 crc kubenswrapper[5184]: I0312 17:12:30.351433 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a5e6590-fa06-4b8f-8a40-58bb60300a93-ovsdbserver-sb\") pod \"dnsmasq-dns-7ccf6665d7-m78ds\" (UID: \"8a5e6590-fa06-4b8f-8a40-58bb60300a93\") " pod="openstack/dnsmasq-dns-7ccf6665d7-m78ds" Mar 12 17:12:30 crc kubenswrapper[5184]: I0312 17:12:30.351451 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6dxwz\" (UniqueName: \"kubernetes.io/projected/8a5e6590-fa06-4b8f-8a40-58bb60300a93-kube-api-access-6dxwz\") pod \"dnsmasq-dns-7ccf6665d7-m78ds\" (UID: 
\"8a5e6590-fa06-4b8f-8a40-58bb60300a93\") " pod="openstack/dnsmasq-dns-7ccf6665d7-m78ds" Mar 12 17:12:30 crc kubenswrapper[5184]: I0312 17:12:30.352322 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a5e6590-fa06-4b8f-8a40-58bb60300a93-ovsdbserver-nb\") pod \"dnsmasq-dns-7ccf6665d7-m78ds\" (UID: \"8a5e6590-fa06-4b8f-8a40-58bb60300a93\") " pod="openstack/dnsmasq-dns-7ccf6665d7-m78ds" Mar 12 17:12:30 crc kubenswrapper[5184]: I0312 17:12:30.352535 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8a5e6590-fa06-4b8f-8a40-58bb60300a93-dns-swift-storage-0\") pod \"dnsmasq-dns-7ccf6665d7-m78ds\" (UID: \"8a5e6590-fa06-4b8f-8a40-58bb60300a93\") " pod="openstack/dnsmasq-dns-7ccf6665d7-m78ds" Mar 12 17:12:30 crc kubenswrapper[5184]: I0312 17:12:30.352850 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a5e6590-fa06-4b8f-8a40-58bb60300a93-dns-svc\") pod \"dnsmasq-dns-7ccf6665d7-m78ds\" (UID: \"8a5e6590-fa06-4b8f-8a40-58bb60300a93\") " pod="openstack/dnsmasq-dns-7ccf6665d7-m78ds" Mar 12 17:12:30 crc kubenswrapper[5184]: I0312 17:12:30.353053 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a5e6590-fa06-4b8f-8a40-58bb60300a93-ovsdbserver-sb\") pod \"dnsmasq-dns-7ccf6665d7-m78ds\" (UID: \"8a5e6590-fa06-4b8f-8a40-58bb60300a93\") " pod="openstack/dnsmasq-dns-7ccf6665d7-m78ds" Mar 12 17:12:30 crc kubenswrapper[5184]: I0312 17:12:30.353059 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a5e6590-fa06-4b8f-8a40-58bb60300a93-config\") pod \"dnsmasq-dns-7ccf6665d7-m78ds\" (UID: \"8a5e6590-fa06-4b8f-8a40-58bb60300a93\") " pod="openstack/dnsmasq-dns-7ccf6665d7-m78ds" Mar 12 17:12:30 crc 
kubenswrapper[5184]: I0312 17:12:30.353394 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8a5e6590-fa06-4b8f-8a40-58bb60300a93-openstack-edpm-ipam\") pod \"dnsmasq-dns-7ccf6665d7-m78ds\" (UID: \"8a5e6590-fa06-4b8f-8a40-58bb60300a93\") " pod="openstack/dnsmasq-dns-7ccf6665d7-m78ds" Mar 12 17:12:30 crc kubenswrapper[5184]: I0312 17:12:30.374448 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dxwz\" (UniqueName: \"kubernetes.io/projected/8a5e6590-fa06-4b8f-8a40-58bb60300a93-kube-api-access-6dxwz\") pod \"dnsmasq-dns-7ccf6665d7-m78ds\" (UID: \"8a5e6590-fa06-4b8f-8a40-58bb60300a93\") " pod="openstack/dnsmasq-dns-7ccf6665d7-m78ds" Mar 12 17:12:30 crc kubenswrapper[5184]: I0312 17:12:30.398768 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ccf6665d7-m78ds" Mar 12 17:12:30 crc kubenswrapper[5184]: I0312 17:12:30.943334 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ccf6665d7-m78ds"] Mar 12 17:12:31 crc kubenswrapper[5184]: I0312 17:12:31.135556 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ccf6665d7-m78ds" event={"ID":"8a5e6590-fa06-4b8f-8a40-58bb60300a93","Type":"ContainerStarted","Data":"7c5ca817cece149615ea5c96bd1a724364729610da86333b22a0b15c00634f9a"} Mar 12 17:12:31 crc kubenswrapper[5184]: I0312 17:12:31.137893 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4f29a0e3-8a0d-42dd-b7f8-d9123e29035b","Type":"ContainerStarted","Data":"a34ba0790b9cdec96c668be1218b444a950fab85c1a0c2a712125dccb0da520f"} Mar 12 17:12:32 crc kubenswrapper[5184]: I0312 17:12:32.148998 5184 generic.go:358] "Generic (PLEG): container finished" podID="8a5e6590-fa06-4b8f-8a40-58bb60300a93" containerID="d63b327ea635f0ae20bf6cd9dffc085f106771a51ed11b17b59c469762110959" exitCode=0 Mar 
12 17:12:32 crc kubenswrapper[5184]: I0312 17:12:32.149053 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ccf6665d7-m78ds" event={"ID":"8a5e6590-fa06-4b8f-8a40-58bb60300a93","Type":"ContainerDied","Data":"d63b327ea635f0ae20bf6cd9dffc085f106771a51ed11b17b59c469762110959"} Mar 12 17:12:33 crc kubenswrapper[5184]: I0312 17:12:33.164618 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ccf6665d7-m78ds" event={"ID":"8a5e6590-fa06-4b8f-8a40-58bb60300a93","Type":"ContainerStarted","Data":"4d98dcc0f431fc0784278338a97f1e0ab1a185ebb2686f6ab3755e3aeae70758"} Mar 12 17:12:33 crc kubenswrapper[5184]: I0312 17:12:33.166347 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/dnsmasq-dns-7ccf6665d7-m78ds" Mar 12 17:12:33 crc kubenswrapper[5184]: I0312 17:12:33.199218 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7ccf6665d7-m78ds" podStartSLOduration=3.199190153 podStartE2EDuration="3.199190153s" podCreationTimestamp="2026-03-12 17:12:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:12:33.184178376 +0000 UTC m=+1295.725489745" watchObservedRunningTime="2026-03-12 17:12:33.199190153 +0000 UTC m=+1295.740501532" Mar 12 17:12:39 crc kubenswrapper[5184]: I0312 17:12:39.248324 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7ccf6665d7-m78ds" Mar 12 17:12:39 crc kubenswrapper[5184]: I0312 17:12:39.353041 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5598b9c58f-84z9t"] Mar 12 17:12:39 crc kubenswrapper[5184]: I0312 17:12:39.353395 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5598b9c58f-84z9t" podUID="ea315f0f-053b-44e2-b1ea-1058f7a51635" containerName="dnsmasq-dns" 
containerID="cri-o://1e112018ab9300039cf4b79e01b3a014ba6715494ee4cff6548649db87d7c596" gracePeriod=10 Mar 12 17:12:39 crc kubenswrapper[5184]: I0312 17:12:39.471678 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-579578d6d7-xfplt"] Mar 12 17:12:39 crc kubenswrapper[5184]: I0312 17:12:39.528296 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-579578d6d7-xfplt"] Mar 12 17:12:39 crc kubenswrapper[5184]: I0312 17:12:39.528860 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-579578d6d7-xfplt" Mar 12 17:12:39 crc kubenswrapper[5184]: E0312 17:12:39.655211 5184 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea315f0f_053b_44e2_b1ea_1058f7a51635.slice/crio-conmon-1e112018ab9300039cf4b79e01b3a014ba6715494ee4cff6548649db87d7c596.scope\": RecentStats: unable to find data in memory cache]" Mar 12 17:12:39 crc kubenswrapper[5184]: I0312 17:12:39.674631 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/772747ff-b4c7-4fda-a596-6aa41c6c46cd-openstack-edpm-ipam\") pod \"dnsmasq-dns-579578d6d7-xfplt\" (UID: \"772747ff-b4c7-4fda-a596-6aa41c6c46cd\") " pod="openstack/dnsmasq-dns-579578d6d7-xfplt" Mar 12 17:12:39 crc kubenswrapper[5184]: I0312 17:12:39.674945 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/772747ff-b4c7-4fda-a596-6aa41c6c46cd-dns-svc\") pod \"dnsmasq-dns-579578d6d7-xfplt\" (UID: \"772747ff-b4c7-4fda-a596-6aa41c6c46cd\") " pod="openstack/dnsmasq-dns-579578d6d7-xfplt" Mar 12 17:12:39 crc kubenswrapper[5184]: I0312 17:12:39.674993 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/772747ff-b4c7-4fda-a596-6aa41c6c46cd-config\") pod \"dnsmasq-dns-579578d6d7-xfplt\" (UID: \"772747ff-b4c7-4fda-a596-6aa41c6c46cd\") " pod="openstack/dnsmasq-dns-579578d6d7-xfplt" Mar 12 17:12:39 crc kubenswrapper[5184]: I0312 17:12:39.675008 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/772747ff-b4c7-4fda-a596-6aa41c6c46cd-dns-swift-storage-0\") pod \"dnsmasq-dns-579578d6d7-xfplt\" (UID: \"772747ff-b4c7-4fda-a596-6aa41c6c46cd\") " pod="openstack/dnsmasq-dns-579578d6d7-xfplt" Mar 12 17:12:39 crc kubenswrapper[5184]: I0312 17:12:39.675022 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jft2w\" (UniqueName: \"kubernetes.io/projected/772747ff-b4c7-4fda-a596-6aa41c6c46cd-kube-api-access-jft2w\") pod \"dnsmasq-dns-579578d6d7-xfplt\" (UID: \"772747ff-b4c7-4fda-a596-6aa41c6c46cd\") " pod="openstack/dnsmasq-dns-579578d6d7-xfplt" Mar 12 17:12:39 crc kubenswrapper[5184]: I0312 17:12:39.675230 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/772747ff-b4c7-4fda-a596-6aa41c6c46cd-ovsdbserver-sb\") pod \"dnsmasq-dns-579578d6d7-xfplt\" (UID: \"772747ff-b4c7-4fda-a596-6aa41c6c46cd\") " pod="openstack/dnsmasq-dns-579578d6d7-xfplt" Mar 12 17:12:39 crc kubenswrapper[5184]: I0312 17:12:39.675528 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/772747ff-b4c7-4fda-a596-6aa41c6c46cd-ovsdbserver-nb\") pod \"dnsmasq-dns-579578d6d7-xfplt\" (UID: \"772747ff-b4c7-4fda-a596-6aa41c6c46cd\") " pod="openstack/dnsmasq-dns-579578d6d7-xfplt" Mar 12 17:12:39 crc kubenswrapper[5184]: I0312 17:12:39.777078 5184 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/772747ff-b4c7-4fda-a596-6aa41c6c46cd-config\") pod \"dnsmasq-dns-579578d6d7-xfplt\" (UID: \"772747ff-b4c7-4fda-a596-6aa41c6c46cd\") " pod="openstack/dnsmasq-dns-579578d6d7-xfplt" Mar 12 17:12:39 crc kubenswrapper[5184]: I0312 17:12:39.777136 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/772747ff-b4c7-4fda-a596-6aa41c6c46cd-dns-swift-storage-0\") pod \"dnsmasq-dns-579578d6d7-xfplt\" (UID: \"772747ff-b4c7-4fda-a596-6aa41c6c46cd\") " pod="openstack/dnsmasq-dns-579578d6d7-xfplt" Mar 12 17:12:39 crc kubenswrapper[5184]: I0312 17:12:39.777156 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jft2w\" (UniqueName: \"kubernetes.io/projected/772747ff-b4c7-4fda-a596-6aa41c6c46cd-kube-api-access-jft2w\") pod \"dnsmasq-dns-579578d6d7-xfplt\" (UID: \"772747ff-b4c7-4fda-a596-6aa41c6c46cd\") " pod="openstack/dnsmasq-dns-579578d6d7-xfplt" Mar 12 17:12:39 crc kubenswrapper[5184]: I0312 17:12:39.777191 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/772747ff-b4c7-4fda-a596-6aa41c6c46cd-ovsdbserver-sb\") pod \"dnsmasq-dns-579578d6d7-xfplt\" (UID: \"772747ff-b4c7-4fda-a596-6aa41c6c46cd\") " pod="openstack/dnsmasq-dns-579578d6d7-xfplt" Mar 12 17:12:39 crc kubenswrapper[5184]: I0312 17:12:39.777286 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/772747ff-b4c7-4fda-a596-6aa41c6c46cd-ovsdbserver-nb\") pod \"dnsmasq-dns-579578d6d7-xfplt\" (UID: \"772747ff-b4c7-4fda-a596-6aa41c6c46cd\") " pod="openstack/dnsmasq-dns-579578d6d7-xfplt" Mar 12 17:12:39 crc kubenswrapper[5184]: I0312 17:12:39.777367 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/772747ff-b4c7-4fda-a596-6aa41c6c46cd-openstack-edpm-ipam\") pod \"dnsmasq-dns-579578d6d7-xfplt\" (UID: \"772747ff-b4c7-4fda-a596-6aa41c6c46cd\") " pod="openstack/dnsmasq-dns-579578d6d7-xfplt" Mar 12 17:12:39 crc kubenswrapper[5184]: I0312 17:12:39.777401 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/772747ff-b4c7-4fda-a596-6aa41c6c46cd-dns-svc\") pod \"dnsmasq-dns-579578d6d7-xfplt\" (UID: \"772747ff-b4c7-4fda-a596-6aa41c6c46cd\") " pod="openstack/dnsmasq-dns-579578d6d7-xfplt" Mar 12 17:12:39 crc kubenswrapper[5184]: I0312 17:12:39.778238 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/772747ff-b4c7-4fda-a596-6aa41c6c46cd-dns-svc\") pod \"dnsmasq-dns-579578d6d7-xfplt\" (UID: \"772747ff-b4c7-4fda-a596-6aa41c6c46cd\") " pod="openstack/dnsmasq-dns-579578d6d7-xfplt" Mar 12 17:12:39 crc kubenswrapper[5184]: I0312 17:12:39.778947 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/772747ff-b4c7-4fda-a596-6aa41c6c46cd-config\") pod \"dnsmasq-dns-579578d6d7-xfplt\" (UID: \"772747ff-b4c7-4fda-a596-6aa41c6c46cd\") " pod="openstack/dnsmasq-dns-579578d6d7-xfplt" Mar 12 17:12:39 crc kubenswrapper[5184]: I0312 17:12:39.779492 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/772747ff-b4c7-4fda-a596-6aa41c6c46cd-dns-swift-storage-0\") pod \"dnsmasq-dns-579578d6d7-xfplt\" (UID: \"772747ff-b4c7-4fda-a596-6aa41c6c46cd\") " pod="openstack/dnsmasq-dns-579578d6d7-xfplt" Mar 12 17:12:39 crc kubenswrapper[5184]: I0312 17:12:39.780572 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/772747ff-b4c7-4fda-a596-6aa41c6c46cd-ovsdbserver-sb\") pod \"dnsmasq-dns-579578d6d7-xfplt\" (UID: \"772747ff-b4c7-4fda-a596-6aa41c6c46cd\") " pod="openstack/dnsmasq-dns-579578d6d7-xfplt" Mar 12 17:12:39 crc kubenswrapper[5184]: I0312 17:12:39.781095 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/772747ff-b4c7-4fda-a596-6aa41c6c46cd-ovsdbserver-nb\") pod \"dnsmasq-dns-579578d6d7-xfplt\" (UID: \"772747ff-b4c7-4fda-a596-6aa41c6c46cd\") " pod="openstack/dnsmasq-dns-579578d6d7-xfplt" Mar 12 17:12:39 crc kubenswrapper[5184]: I0312 17:12:39.781421 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/772747ff-b4c7-4fda-a596-6aa41c6c46cd-openstack-edpm-ipam\") pod \"dnsmasq-dns-579578d6d7-xfplt\" (UID: \"772747ff-b4c7-4fda-a596-6aa41c6c46cd\") " pod="openstack/dnsmasq-dns-579578d6d7-xfplt" Mar 12 17:12:39 crc kubenswrapper[5184]: I0312 17:12:39.804161 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jft2w\" (UniqueName: \"kubernetes.io/projected/772747ff-b4c7-4fda-a596-6aa41c6c46cd-kube-api-access-jft2w\") pod \"dnsmasq-dns-579578d6d7-xfplt\" (UID: \"772747ff-b4c7-4fda-a596-6aa41c6c46cd\") " pod="openstack/dnsmasq-dns-579578d6d7-xfplt" Mar 12 17:12:39 crc kubenswrapper[5184]: I0312 17:12:39.860353 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-579578d6d7-xfplt" Mar 12 17:12:39 crc kubenswrapper[5184]: I0312 17:12:39.976711 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5598b9c58f-84z9t" Mar 12 17:12:40 crc kubenswrapper[5184]: I0312 17:12:40.083177 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmvrr\" (UniqueName: \"kubernetes.io/projected/ea315f0f-053b-44e2-b1ea-1058f7a51635-kube-api-access-fmvrr\") pod \"ea315f0f-053b-44e2-b1ea-1058f7a51635\" (UID: \"ea315f0f-053b-44e2-b1ea-1058f7a51635\") " Mar 12 17:12:40 crc kubenswrapper[5184]: I0312 17:12:40.083302 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea315f0f-053b-44e2-b1ea-1058f7a51635-config\") pod \"ea315f0f-053b-44e2-b1ea-1058f7a51635\" (UID: \"ea315f0f-053b-44e2-b1ea-1058f7a51635\") " Mar 12 17:12:40 crc kubenswrapper[5184]: I0312 17:12:40.083340 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea315f0f-053b-44e2-b1ea-1058f7a51635-dns-svc\") pod \"ea315f0f-053b-44e2-b1ea-1058f7a51635\" (UID: \"ea315f0f-053b-44e2-b1ea-1058f7a51635\") " Mar 12 17:12:40 crc kubenswrapper[5184]: I0312 17:12:40.083586 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ea315f0f-053b-44e2-b1ea-1058f7a51635-dns-swift-storage-0\") pod \"ea315f0f-053b-44e2-b1ea-1058f7a51635\" (UID: \"ea315f0f-053b-44e2-b1ea-1058f7a51635\") " Mar 12 17:12:40 crc kubenswrapper[5184]: I0312 17:12:40.083693 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea315f0f-053b-44e2-b1ea-1058f7a51635-ovsdbserver-sb\") pod \"ea315f0f-053b-44e2-b1ea-1058f7a51635\" (UID: \"ea315f0f-053b-44e2-b1ea-1058f7a51635\") " Mar 12 17:12:40 crc kubenswrapper[5184]: I0312 17:12:40.083792 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/ea315f0f-053b-44e2-b1ea-1058f7a51635-ovsdbserver-nb\") pod \"ea315f0f-053b-44e2-b1ea-1058f7a51635\" (UID: \"ea315f0f-053b-44e2-b1ea-1058f7a51635\") " Mar 12 17:12:40 crc kubenswrapper[5184]: I0312 17:12:40.088701 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea315f0f-053b-44e2-b1ea-1058f7a51635-kube-api-access-fmvrr" (OuterVolumeSpecName: "kube-api-access-fmvrr") pod "ea315f0f-053b-44e2-b1ea-1058f7a51635" (UID: "ea315f0f-053b-44e2-b1ea-1058f7a51635"). InnerVolumeSpecName "kube-api-access-fmvrr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:12:40 crc kubenswrapper[5184]: I0312 17:12:40.138861 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea315f0f-053b-44e2-b1ea-1058f7a51635-config" (OuterVolumeSpecName: "config") pod "ea315f0f-053b-44e2-b1ea-1058f7a51635" (UID: "ea315f0f-053b-44e2-b1ea-1058f7a51635"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:12:40 crc kubenswrapper[5184]: I0312 17:12:40.139847 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea315f0f-053b-44e2-b1ea-1058f7a51635-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ea315f0f-053b-44e2-b1ea-1058f7a51635" (UID: "ea315f0f-053b-44e2-b1ea-1058f7a51635"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:12:40 crc kubenswrapper[5184]: I0312 17:12:40.150022 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea315f0f-053b-44e2-b1ea-1058f7a51635-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ea315f0f-053b-44e2-b1ea-1058f7a51635" (UID: "ea315f0f-053b-44e2-b1ea-1058f7a51635"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:12:40 crc kubenswrapper[5184]: I0312 17:12:40.152908 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea315f0f-053b-44e2-b1ea-1058f7a51635-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ea315f0f-053b-44e2-b1ea-1058f7a51635" (UID: "ea315f0f-053b-44e2-b1ea-1058f7a51635"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:12:40 crc kubenswrapper[5184]: I0312 17:12:40.155608 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea315f0f-053b-44e2-b1ea-1058f7a51635-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ea315f0f-053b-44e2-b1ea-1058f7a51635" (UID: "ea315f0f-053b-44e2-b1ea-1058f7a51635"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:12:40 crc kubenswrapper[5184]: I0312 17:12:40.186112 5184 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea315f0f-053b-44e2-b1ea-1058f7a51635-config\") on node \"crc\" DevicePath \"\"" Mar 12 17:12:40 crc kubenswrapper[5184]: I0312 17:12:40.186146 5184 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea315f0f-053b-44e2-b1ea-1058f7a51635-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 17:12:40 crc kubenswrapper[5184]: I0312 17:12:40.186155 5184 reconciler_common.go:299] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ea315f0f-053b-44e2-b1ea-1058f7a51635-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 12 17:12:40 crc kubenswrapper[5184]: I0312 17:12:40.186167 5184 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea315f0f-053b-44e2-b1ea-1058f7a51635-ovsdbserver-sb\") on node \"crc\" 
DevicePath \"\"" Mar 12 17:12:40 crc kubenswrapper[5184]: I0312 17:12:40.186195 5184 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea315f0f-053b-44e2-b1ea-1058f7a51635-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 17:12:40 crc kubenswrapper[5184]: I0312 17:12:40.186203 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fmvrr\" (UniqueName: \"kubernetes.io/projected/ea315f0f-053b-44e2-b1ea-1058f7a51635-kube-api-access-fmvrr\") on node \"crc\" DevicePath \"\"" Mar 12 17:12:40 crc kubenswrapper[5184]: I0312 17:12:40.314702 5184 generic.go:358] "Generic (PLEG): container finished" podID="ea315f0f-053b-44e2-b1ea-1058f7a51635" containerID="1e112018ab9300039cf4b79e01b3a014ba6715494ee4cff6548649db87d7c596" exitCode=0 Mar 12 17:12:40 crc kubenswrapper[5184]: I0312 17:12:40.314753 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5598b9c58f-84z9t" event={"ID":"ea315f0f-053b-44e2-b1ea-1058f7a51635","Type":"ContainerDied","Data":"1e112018ab9300039cf4b79e01b3a014ba6715494ee4cff6548649db87d7c596"} Mar 12 17:12:40 crc kubenswrapper[5184]: I0312 17:12:40.315138 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5598b9c58f-84z9t" event={"ID":"ea315f0f-053b-44e2-b1ea-1058f7a51635","Type":"ContainerDied","Data":"f61836aff649b61a28e56a4e4d8e4f38866d5fc630f597a81beaeb9f078f9f81"} Mar 12 17:12:40 crc kubenswrapper[5184]: I0312 17:12:40.315177 5184 scope.go:117] "RemoveContainer" containerID="1e112018ab9300039cf4b79e01b3a014ba6715494ee4cff6548649db87d7c596" Mar 12 17:12:40 crc kubenswrapper[5184]: I0312 17:12:40.314783 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5598b9c58f-84z9t" Mar 12 17:12:40 crc kubenswrapper[5184]: I0312 17:12:40.347979 5184 scope.go:117] "RemoveContainer" containerID="6851dae20c684086aaf272cf59a9d4fb940f217552dadcac0cd29575e2e6a5da" Mar 12 17:12:40 crc kubenswrapper[5184]: W0312 17:12:40.353075 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod772747ff_b4c7_4fda_a596_6aa41c6c46cd.slice/crio-b303756c6160d0670886a78769c528ead55d35df0f21b545410f3394fd78d09d WatchSource:0}: Error finding container b303756c6160d0670886a78769c528ead55d35df0f21b545410f3394fd78d09d: Status 404 returned error can't find the container with id b303756c6160d0670886a78769c528ead55d35df0f21b545410f3394fd78d09d Mar 12 17:12:40 crc kubenswrapper[5184]: I0312 17:12:40.356242 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-579578d6d7-xfplt"] Mar 12 17:12:40 crc kubenswrapper[5184]: I0312 17:12:40.367417 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5598b9c58f-84z9t"] Mar 12 17:12:40 crc kubenswrapper[5184]: I0312 17:12:40.375608 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5598b9c58f-84z9t"] Mar 12 17:12:40 crc kubenswrapper[5184]: I0312 17:12:40.379681 5184 scope.go:117] "RemoveContainer" containerID="1e112018ab9300039cf4b79e01b3a014ba6715494ee4cff6548649db87d7c596" Mar 12 17:12:40 crc kubenswrapper[5184]: E0312 17:12:40.380081 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e112018ab9300039cf4b79e01b3a014ba6715494ee4cff6548649db87d7c596\": container with ID starting with 1e112018ab9300039cf4b79e01b3a014ba6715494ee4cff6548649db87d7c596 not found: ID does not exist" containerID="1e112018ab9300039cf4b79e01b3a014ba6715494ee4cff6548649db87d7c596" Mar 12 17:12:40 crc kubenswrapper[5184]: I0312 17:12:40.380123 5184 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e112018ab9300039cf4b79e01b3a014ba6715494ee4cff6548649db87d7c596"} err="failed to get container status \"1e112018ab9300039cf4b79e01b3a014ba6715494ee4cff6548649db87d7c596\": rpc error: code = NotFound desc = could not find container \"1e112018ab9300039cf4b79e01b3a014ba6715494ee4cff6548649db87d7c596\": container with ID starting with 1e112018ab9300039cf4b79e01b3a014ba6715494ee4cff6548649db87d7c596 not found: ID does not exist" Mar 12 17:12:40 crc kubenswrapper[5184]: I0312 17:12:40.380149 5184 scope.go:117] "RemoveContainer" containerID="6851dae20c684086aaf272cf59a9d4fb940f217552dadcac0cd29575e2e6a5da" Mar 12 17:12:40 crc kubenswrapper[5184]: E0312 17:12:40.380472 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6851dae20c684086aaf272cf59a9d4fb940f217552dadcac0cd29575e2e6a5da\": container with ID starting with 6851dae20c684086aaf272cf59a9d4fb940f217552dadcac0cd29575e2e6a5da not found: ID does not exist" containerID="6851dae20c684086aaf272cf59a9d4fb940f217552dadcac0cd29575e2e6a5da" Mar 12 17:12:40 crc kubenswrapper[5184]: I0312 17:12:40.380510 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6851dae20c684086aaf272cf59a9d4fb940f217552dadcac0cd29575e2e6a5da"} err="failed to get container status \"6851dae20c684086aaf272cf59a9d4fb940f217552dadcac0cd29575e2e6a5da\": rpc error: code = NotFound desc = could not find container \"6851dae20c684086aaf272cf59a9d4fb940f217552dadcac0cd29575e2e6a5da\": container with ID starting with 6851dae20c684086aaf272cf59a9d4fb940f217552dadcac0cd29575e2e6a5da not found: ID does not exist" Mar 12 17:12:40 crc kubenswrapper[5184]: I0312 17:12:40.417005 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea315f0f-053b-44e2-b1ea-1058f7a51635" 
path="/var/lib/kubelet/pods/ea315f0f-053b-44e2-b1ea-1058f7a51635/volumes" Mar 12 17:12:41 crc kubenswrapper[5184]: I0312 17:12:41.326679 5184 generic.go:358] "Generic (PLEG): container finished" podID="772747ff-b4c7-4fda-a596-6aa41c6c46cd" containerID="74ce582cf222511729cc94e9e881ac2f66603712c39022af08472b8965704fdd" exitCode=0 Mar 12 17:12:41 crc kubenswrapper[5184]: I0312 17:12:41.326761 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-579578d6d7-xfplt" event={"ID":"772747ff-b4c7-4fda-a596-6aa41c6c46cd","Type":"ContainerDied","Data":"74ce582cf222511729cc94e9e881ac2f66603712c39022af08472b8965704fdd"} Mar 12 17:12:41 crc kubenswrapper[5184]: I0312 17:12:41.327141 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-579578d6d7-xfplt" event={"ID":"772747ff-b4c7-4fda-a596-6aa41c6c46cd","Type":"ContainerStarted","Data":"b303756c6160d0670886a78769c528ead55d35df0f21b545410f3394fd78d09d"} Mar 12 17:12:42 crc kubenswrapper[5184]: I0312 17:12:42.342497 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-579578d6d7-xfplt" event={"ID":"772747ff-b4c7-4fda-a596-6aa41c6c46cd","Type":"ContainerStarted","Data":"80ceb57a5b62bef7da1f076b75849605ffcff6087f9324eecece00547cafb642"} Mar 12 17:12:42 crc kubenswrapper[5184]: I0312 17:12:42.343225 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/dnsmasq-dns-579578d6d7-xfplt" Mar 12 17:12:42 crc kubenswrapper[5184]: I0312 17:12:42.377796 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-579578d6d7-xfplt" podStartSLOduration=3.377772506 podStartE2EDuration="3.377772506s" podCreationTimestamp="2026-03-12 17:12:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:12:42.367247338 +0000 UTC m=+1304.908558697" watchObservedRunningTime="2026-03-12 17:12:42.377772506 
+0000 UTC m=+1304.919083855" Mar 12 17:12:48 crc kubenswrapper[5184]: I0312 17:12:48.357408 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-579578d6d7-xfplt" Mar 12 17:12:48 crc kubenswrapper[5184]: I0312 17:12:48.510363 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ccf6665d7-m78ds"] Mar 12 17:12:48 crc kubenswrapper[5184]: I0312 17:12:48.510714 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7ccf6665d7-m78ds" podUID="8a5e6590-fa06-4b8f-8a40-58bb60300a93" containerName="dnsmasq-dns" containerID="cri-o://4d98dcc0f431fc0784278338a97f1e0ab1a185ebb2686f6ab3755e3aeae70758" gracePeriod=10 Mar 12 17:12:49 crc kubenswrapper[5184]: I0312 17:12:49.010088 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ccf6665d7-m78ds" Mar 12 17:12:49 crc kubenswrapper[5184]: I0312 17:12:49.084033 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8a5e6590-fa06-4b8f-8a40-58bb60300a93-dns-swift-storage-0\") pod \"8a5e6590-fa06-4b8f-8a40-58bb60300a93\" (UID: \"8a5e6590-fa06-4b8f-8a40-58bb60300a93\") " Mar 12 17:12:49 crc kubenswrapper[5184]: I0312 17:12:49.084399 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a5e6590-fa06-4b8f-8a40-58bb60300a93-ovsdbserver-sb\") pod \"8a5e6590-fa06-4b8f-8a40-58bb60300a93\" (UID: \"8a5e6590-fa06-4b8f-8a40-58bb60300a93\") " Mar 12 17:12:49 crc kubenswrapper[5184]: I0312 17:12:49.084437 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8a5e6590-fa06-4b8f-8a40-58bb60300a93-openstack-edpm-ipam\") pod \"8a5e6590-fa06-4b8f-8a40-58bb60300a93\" (UID: 
\"8a5e6590-fa06-4b8f-8a40-58bb60300a93\") " Mar 12 17:12:49 crc kubenswrapper[5184]: I0312 17:12:49.084686 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dxwz\" (UniqueName: \"kubernetes.io/projected/8a5e6590-fa06-4b8f-8a40-58bb60300a93-kube-api-access-6dxwz\") pod \"8a5e6590-fa06-4b8f-8a40-58bb60300a93\" (UID: \"8a5e6590-fa06-4b8f-8a40-58bb60300a93\") " Mar 12 17:12:49 crc kubenswrapper[5184]: I0312 17:12:49.084749 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a5e6590-fa06-4b8f-8a40-58bb60300a93-config\") pod \"8a5e6590-fa06-4b8f-8a40-58bb60300a93\" (UID: \"8a5e6590-fa06-4b8f-8a40-58bb60300a93\") " Mar 12 17:12:49 crc kubenswrapper[5184]: I0312 17:12:49.084927 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a5e6590-fa06-4b8f-8a40-58bb60300a93-ovsdbserver-nb\") pod \"8a5e6590-fa06-4b8f-8a40-58bb60300a93\" (UID: \"8a5e6590-fa06-4b8f-8a40-58bb60300a93\") " Mar 12 17:12:49 crc kubenswrapper[5184]: I0312 17:12:49.084990 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a5e6590-fa06-4b8f-8a40-58bb60300a93-dns-svc\") pod \"8a5e6590-fa06-4b8f-8a40-58bb60300a93\" (UID: \"8a5e6590-fa06-4b8f-8a40-58bb60300a93\") " Mar 12 17:12:49 crc kubenswrapper[5184]: I0312 17:12:49.108428 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a5e6590-fa06-4b8f-8a40-58bb60300a93-kube-api-access-6dxwz" (OuterVolumeSpecName: "kube-api-access-6dxwz") pod "8a5e6590-fa06-4b8f-8a40-58bb60300a93" (UID: "8a5e6590-fa06-4b8f-8a40-58bb60300a93"). InnerVolumeSpecName "kube-api-access-6dxwz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:12:49 crc kubenswrapper[5184]: I0312 17:12:49.150508 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a5e6590-fa06-4b8f-8a40-58bb60300a93-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8a5e6590-fa06-4b8f-8a40-58bb60300a93" (UID: "8a5e6590-fa06-4b8f-8a40-58bb60300a93"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:12:49 crc kubenswrapper[5184]: I0312 17:12:49.151652 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a5e6590-fa06-4b8f-8a40-58bb60300a93-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8a5e6590-fa06-4b8f-8a40-58bb60300a93" (UID: "8a5e6590-fa06-4b8f-8a40-58bb60300a93"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:12:49 crc kubenswrapper[5184]: I0312 17:12:49.155013 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a5e6590-fa06-4b8f-8a40-58bb60300a93-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8a5e6590-fa06-4b8f-8a40-58bb60300a93" (UID: "8a5e6590-fa06-4b8f-8a40-58bb60300a93"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:12:49 crc kubenswrapper[5184]: I0312 17:12:49.161235 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a5e6590-fa06-4b8f-8a40-58bb60300a93-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8a5e6590-fa06-4b8f-8a40-58bb60300a93" (UID: "8a5e6590-fa06-4b8f-8a40-58bb60300a93"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:12:49 crc kubenswrapper[5184]: I0312 17:12:49.165027 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a5e6590-fa06-4b8f-8a40-58bb60300a93-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "8a5e6590-fa06-4b8f-8a40-58bb60300a93" (UID: "8a5e6590-fa06-4b8f-8a40-58bb60300a93"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:12:49 crc kubenswrapper[5184]: I0312 17:12:49.168165 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a5e6590-fa06-4b8f-8a40-58bb60300a93-config" (OuterVolumeSpecName: "config") pod "8a5e6590-fa06-4b8f-8a40-58bb60300a93" (UID: "8a5e6590-fa06-4b8f-8a40-58bb60300a93"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:12:49 crc kubenswrapper[5184]: I0312 17:12:49.187504 5184 reconciler_common.go:299] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8a5e6590-fa06-4b8f-8a40-58bb60300a93-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 12 17:12:49 crc kubenswrapper[5184]: I0312 17:12:49.187537 5184 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a5e6590-fa06-4b8f-8a40-58bb60300a93-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 12 17:12:49 crc kubenswrapper[5184]: I0312 17:12:49.187547 5184 reconciler_common.go:299] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8a5e6590-fa06-4b8f-8a40-58bb60300a93-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 17:12:49 crc kubenswrapper[5184]: I0312 17:12:49.187557 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6dxwz\" (UniqueName: 
\"kubernetes.io/projected/8a5e6590-fa06-4b8f-8a40-58bb60300a93-kube-api-access-6dxwz\") on node \"crc\" DevicePath \"\"" Mar 12 17:12:49 crc kubenswrapper[5184]: I0312 17:12:49.187567 5184 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a5e6590-fa06-4b8f-8a40-58bb60300a93-config\") on node \"crc\" DevicePath \"\"" Mar 12 17:12:49 crc kubenswrapper[5184]: I0312 17:12:49.187576 5184 reconciler_common.go:299] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a5e6590-fa06-4b8f-8a40-58bb60300a93-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 17:12:49 crc kubenswrapper[5184]: I0312 17:12:49.187614 5184 reconciler_common.go:299] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a5e6590-fa06-4b8f-8a40-58bb60300a93-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 17:12:49 crc kubenswrapper[5184]: I0312 17:12:49.524001 5184 generic.go:358] "Generic (PLEG): container finished" podID="8a5e6590-fa06-4b8f-8a40-58bb60300a93" containerID="4d98dcc0f431fc0784278338a97f1e0ab1a185ebb2686f6ab3755e3aeae70758" exitCode=0 Mar 12 17:12:49 crc kubenswrapper[5184]: I0312 17:12:49.524205 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7ccf6665d7-m78ds" Mar 12 17:12:49 crc kubenswrapper[5184]: I0312 17:12:49.524367 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ccf6665d7-m78ds" event={"ID":"8a5e6590-fa06-4b8f-8a40-58bb60300a93","Type":"ContainerDied","Data":"4d98dcc0f431fc0784278338a97f1e0ab1a185ebb2686f6ab3755e3aeae70758"} Mar 12 17:12:49 crc kubenswrapper[5184]: I0312 17:12:49.524418 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ccf6665d7-m78ds" event={"ID":"8a5e6590-fa06-4b8f-8a40-58bb60300a93","Type":"ContainerDied","Data":"7c5ca817cece149615ea5c96bd1a724364729610da86333b22a0b15c00634f9a"} Mar 12 17:12:49 crc kubenswrapper[5184]: I0312 17:12:49.524438 5184 scope.go:117] "RemoveContainer" containerID="4d98dcc0f431fc0784278338a97f1e0ab1a185ebb2686f6ab3755e3aeae70758" Mar 12 17:12:49 crc kubenswrapper[5184]: I0312 17:12:49.563766 5184 scope.go:117] "RemoveContainer" containerID="d63b327ea635f0ae20bf6cd9dffc085f106771a51ed11b17b59c469762110959" Mar 12 17:12:49 crc kubenswrapper[5184]: I0312 17:12:49.575555 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ccf6665d7-m78ds"] Mar 12 17:12:49 crc kubenswrapper[5184]: I0312 17:12:49.585136 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7ccf6665d7-m78ds"] Mar 12 17:12:49 crc kubenswrapper[5184]: I0312 17:12:49.611264 5184 scope.go:117] "RemoveContainer" containerID="4d98dcc0f431fc0784278338a97f1e0ab1a185ebb2686f6ab3755e3aeae70758" Mar 12 17:12:49 crc kubenswrapper[5184]: E0312 17:12:49.611699 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d98dcc0f431fc0784278338a97f1e0ab1a185ebb2686f6ab3755e3aeae70758\": container with ID starting with 4d98dcc0f431fc0784278338a97f1e0ab1a185ebb2686f6ab3755e3aeae70758 not found: ID does not exist" 
containerID="4d98dcc0f431fc0784278338a97f1e0ab1a185ebb2686f6ab3755e3aeae70758" Mar 12 17:12:49 crc kubenswrapper[5184]: I0312 17:12:49.611761 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d98dcc0f431fc0784278338a97f1e0ab1a185ebb2686f6ab3755e3aeae70758"} err="failed to get container status \"4d98dcc0f431fc0784278338a97f1e0ab1a185ebb2686f6ab3755e3aeae70758\": rpc error: code = NotFound desc = could not find container \"4d98dcc0f431fc0784278338a97f1e0ab1a185ebb2686f6ab3755e3aeae70758\": container with ID starting with 4d98dcc0f431fc0784278338a97f1e0ab1a185ebb2686f6ab3755e3aeae70758 not found: ID does not exist" Mar 12 17:12:49 crc kubenswrapper[5184]: I0312 17:12:49.611794 5184 scope.go:117] "RemoveContainer" containerID="d63b327ea635f0ae20bf6cd9dffc085f106771a51ed11b17b59c469762110959" Mar 12 17:12:49 crc kubenswrapper[5184]: E0312 17:12:49.612170 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d63b327ea635f0ae20bf6cd9dffc085f106771a51ed11b17b59c469762110959\": container with ID starting with d63b327ea635f0ae20bf6cd9dffc085f106771a51ed11b17b59c469762110959 not found: ID does not exist" containerID="d63b327ea635f0ae20bf6cd9dffc085f106771a51ed11b17b59c469762110959" Mar 12 17:12:49 crc kubenswrapper[5184]: I0312 17:12:49.612207 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d63b327ea635f0ae20bf6cd9dffc085f106771a51ed11b17b59c469762110959"} err="failed to get container status \"d63b327ea635f0ae20bf6cd9dffc085f106771a51ed11b17b59c469762110959\": rpc error: code = NotFound desc = could not find container \"d63b327ea635f0ae20bf6cd9dffc085f106771a51ed11b17b59c469762110959\": container with ID starting with d63b327ea635f0ae20bf6cd9dffc085f106771a51ed11b17b59c469762110959 not found: ID does not exist" Mar 12 17:12:50 crc kubenswrapper[5184]: E0312 17:12:50.263411 5184 prober.go:256] "Unable 
to write all bytes from execInContainer" err="short write" expectedBytes=1791679 actualBytes=10240 Mar 12 17:12:50 crc kubenswrapper[5184]: I0312 17:12:50.422576 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a5e6590-fa06-4b8f-8a40-58bb60300a93" path="/var/lib/kubelet/pods/8a5e6590-fa06-4b8f-8a40-58bb60300a93/volumes" Mar 12 17:12:50 crc kubenswrapper[5184]: I0312 17:12:50.742826 5184 patch_prober.go:28] interesting pod/machine-config-daemon-cp7pt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 17:12:50 crc kubenswrapper[5184]: I0312 17:12:50.742936 5184 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 17:12:55 crc kubenswrapper[5184]: I0312 17:12:55.543026 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h59fv"] Mar 12 17:12:55 crc kubenswrapper[5184]: I0312 17:12:55.544793 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ea315f0f-053b-44e2-b1ea-1058f7a51635" containerName="dnsmasq-dns" Mar 12 17:12:55 crc kubenswrapper[5184]: I0312 17:12:55.544811 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea315f0f-053b-44e2-b1ea-1058f7a51635" containerName="dnsmasq-dns" Mar 12 17:12:55 crc kubenswrapper[5184]: I0312 17:12:55.544827 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ea315f0f-053b-44e2-b1ea-1058f7a51635" containerName="init" Mar 12 17:12:55 crc kubenswrapper[5184]: I0312 17:12:55.544833 5184 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="ea315f0f-053b-44e2-b1ea-1058f7a51635" containerName="init" Mar 12 17:12:55 crc kubenswrapper[5184]: I0312 17:12:55.544849 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8a5e6590-fa06-4b8f-8a40-58bb60300a93" containerName="init" Mar 12 17:12:55 crc kubenswrapper[5184]: I0312 17:12:55.544856 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a5e6590-fa06-4b8f-8a40-58bb60300a93" containerName="init" Mar 12 17:12:55 crc kubenswrapper[5184]: I0312 17:12:55.544893 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8a5e6590-fa06-4b8f-8a40-58bb60300a93" containerName="dnsmasq-dns" Mar 12 17:12:55 crc kubenswrapper[5184]: I0312 17:12:55.544900 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a5e6590-fa06-4b8f-8a40-58bb60300a93" containerName="dnsmasq-dns" Mar 12 17:12:55 crc kubenswrapper[5184]: I0312 17:12:55.545130 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="8a5e6590-fa06-4b8f-8a40-58bb60300a93" containerName="dnsmasq-dns" Mar 12 17:12:55 crc kubenswrapper[5184]: I0312 17:12:55.545147 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="ea315f0f-053b-44e2-b1ea-1058f7a51635" containerName="dnsmasq-dns" Mar 12 17:12:55 crc kubenswrapper[5184]: I0312 17:12:55.554352 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h59fv" Mar 12 17:12:55 crc kubenswrapper[5184]: I0312 17:12:55.554643 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h59fv"] Mar 12 17:12:55 crc kubenswrapper[5184]: I0312 17:12:55.563121 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-aee-default-env\"" Mar 12 17:12:55 crc kubenswrapper[5184]: I0312 17:12:55.563441 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplanenodeset-openstack-edpm-ipam\"" Mar 12 17:12:55 crc kubenswrapper[5184]: I0312 17:12:55.563461 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplane-ansible-ssh-private-key-secret\"" Mar 12 17:12:55 crc kubenswrapper[5184]: I0312 17:12:55.563765 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"openstack-edpm-ipam-dockercfg-qr8nl\"" Mar 12 17:12:55 crc kubenswrapper[5184]: I0312 17:12:55.737195 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d0ff5d-5881-483f-ab1f-ff2385c623ad-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h59fv\" (UID: \"37d0ff5d-5881-483f-ab1f-ff2385c623ad\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h59fv" Mar 12 17:12:55 crc kubenswrapper[5184]: I0312 17:12:55.737301 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6dn6\" (UniqueName: \"kubernetes.io/projected/37d0ff5d-5881-483f-ab1f-ff2385c623ad-kube-api-access-z6dn6\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h59fv\" (UID: \"37d0ff5d-5881-483f-ab1f-ff2385c623ad\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h59fv" Mar 12 17:12:55 crc kubenswrapper[5184]: I0312 17:12:55.737368 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37d0ff5d-5881-483f-ab1f-ff2385c623ad-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h59fv\" (UID: \"37d0ff5d-5881-483f-ab1f-ff2385c623ad\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h59fv" Mar 12 17:12:55 crc kubenswrapper[5184]: I0312 17:12:55.737436 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/37d0ff5d-5881-483f-ab1f-ff2385c623ad-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h59fv\" (UID: \"37d0ff5d-5881-483f-ab1f-ff2385c623ad\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h59fv" Mar 12 17:12:55 crc kubenswrapper[5184]: I0312 17:12:55.839844 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z6dn6\" (UniqueName: \"kubernetes.io/projected/37d0ff5d-5881-483f-ab1f-ff2385c623ad-kube-api-access-z6dn6\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h59fv\" (UID: \"37d0ff5d-5881-483f-ab1f-ff2385c623ad\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h59fv" Mar 12 17:12:55 crc kubenswrapper[5184]: I0312 17:12:55.839901 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37d0ff5d-5881-483f-ab1f-ff2385c623ad-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h59fv\" (UID: \"37d0ff5d-5881-483f-ab1f-ff2385c623ad\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h59fv" Mar 12 17:12:55 crc kubenswrapper[5184]: I0312 17:12:55.839921 5184 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/37d0ff5d-5881-483f-ab1f-ff2385c623ad-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h59fv\" (UID: \"37d0ff5d-5881-483f-ab1f-ff2385c623ad\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h59fv" Mar 12 17:12:55 crc kubenswrapper[5184]: I0312 17:12:55.840113 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d0ff5d-5881-483f-ab1f-ff2385c623ad-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h59fv\" (UID: \"37d0ff5d-5881-483f-ab1f-ff2385c623ad\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h59fv" Mar 12 17:12:55 crc kubenswrapper[5184]: I0312 17:12:55.846531 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37d0ff5d-5881-483f-ab1f-ff2385c623ad-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h59fv\" (UID: \"37d0ff5d-5881-483f-ab1f-ff2385c623ad\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h59fv" Mar 12 17:12:55 crc kubenswrapper[5184]: I0312 17:12:55.848011 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/37d0ff5d-5881-483f-ab1f-ff2385c623ad-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h59fv\" (UID: \"37d0ff5d-5881-483f-ab1f-ff2385c623ad\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h59fv" Mar 12 17:12:55 crc kubenswrapper[5184]: I0312 17:12:55.854195 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d0ff5d-5881-483f-ab1f-ff2385c623ad-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h59fv\" 
(UID: \"37d0ff5d-5881-483f-ab1f-ff2385c623ad\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h59fv" Mar 12 17:12:55 crc kubenswrapper[5184]: I0312 17:12:55.865927 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6dn6\" (UniqueName: \"kubernetes.io/projected/37d0ff5d-5881-483f-ab1f-ff2385c623ad-kube-api-access-z6dn6\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h59fv\" (UID: \"37d0ff5d-5881-483f-ab1f-ff2385c623ad\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h59fv" Mar 12 17:12:55 crc kubenswrapper[5184]: I0312 17:12:55.876803 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h59fv" Mar 12 17:12:56 crc kubenswrapper[5184]: I0312 17:12:56.429516 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h59fv"] Mar 12 17:12:56 crc kubenswrapper[5184]: I0312 17:12:56.616815 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h59fv" event={"ID":"37d0ff5d-5881-483f-ab1f-ff2385c623ad","Type":"ContainerStarted","Data":"ca591db6b98d220b7f6c779a4cf35e8df7afe4f99d3403c5df1a7562912df091"} Mar 12 17:13:01 crc kubenswrapper[5184]: I0312 17:13:01.663808 5184 generic.go:358] "Generic (PLEG): container finished" podID="474ecd3e-3438-4cf1-953e-115dcbc40119" containerID="c452ca415bf6be7fe6bddd9cc6818bce5a0db052d53ec75d000850a63d76fcea" exitCode=0 Mar 12 17:13:01 crc kubenswrapper[5184]: I0312 17:13:01.663966 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"474ecd3e-3438-4cf1-953e-115dcbc40119","Type":"ContainerDied","Data":"c452ca415bf6be7fe6bddd9cc6818bce5a0db052d53ec75d000850a63d76fcea"} Mar 12 17:13:02 crc kubenswrapper[5184]: I0312 17:13:02.675502 5184 generic.go:358] "Generic (PLEG): container finished" 
podID="4f29a0e3-8a0d-42dd-b7f8-d9123e29035b" containerID="a34ba0790b9cdec96c668be1218b444a950fab85c1a0c2a712125dccb0da520f" exitCode=0 Mar 12 17:13:02 crc kubenswrapper[5184]: I0312 17:13:02.675571 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4f29a0e3-8a0d-42dd-b7f8-d9123e29035b","Type":"ContainerDied","Data":"a34ba0790b9cdec96c668be1218b444a950fab85c1a0c2a712125dccb0da520f"} Mar 12 17:13:05 crc kubenswrapper[5184]: I0312 17:13:05.983678 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-aee-default-env\"" Mar 12 17:13:06 crc kubenswrapper[5184]: I0312 17:13:06.726704 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"474ecd3e-3438-4cf1-953e-115dcbc40119","Type":"ContainerStarted","Data":"920ad7c5473394ca46726c9db2cadfe41045a86e79e2bdefbda50a0546c6fd3d"} Mar 12 17:13:06 crc kubenswrapper[5184]: I0312 17:13:06.729263 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/rabbitmq-server-0" Mar 12 17:13:06 crc kubenswrapper[5184]: I0312 17:13:06.734982 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4f29a0e3-8a0d-42dd-b7f8-d9123e29035b","Type":"ContainerStarted","Data":"5099c219bdbc50b117299972651f596b013dc593ff8f5a11449b98fa3c5a6bf2"} Mar 12 17:13:06 crc kubenswrapper[5184]: I0312 17:13:06.735362 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:13:06 crc kubenswrapper[5184]: I0312 17:13:06.738213 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h59fv" event={"ID":"37d0ff5d-5881-483f-ab1f-ff2385c623ad","Type":"ContainerStarted","Data":"feffca00cec232b3cb3f45fa51ee53d6d297fdf979e7f23054b0aeb3862b2a83"} Mar 12 17:13:06 crc kubenswrapper[5184]: I0312 
17:13:06.778952 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=40.778921499 podStartE2EDuration="40.778921499s" podCreationTimestamp="2026-03-12 17:12:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:13:06.763708905 +0000 UTC m=+1329.305020294" watchObservedRunningTime="2026-03-12 17:13:06.778921499 +0000 UTC m=+1329.320232878" Mar 12 17:13:06 crc kubenswrapper[5184]: I0312 17:13:06.795490 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h59fv" podStartSLOduration=2.252090827 podStartE2EDuration="11.795469206s" podCreationTimestamp="2026-03-12 17:12:55 +0000 UTC" firstStartedPulling="2026-03-12 17:12:56.437307902 +0000 UTC m=+1318.978619261" lastFinishedPulling="2026-03-12 17:13:05.980686291 +0000 UTC m=+1328.521997640" observedRunningTime="2026-03-12 17:13:06.788114576 +0000 UTC m=+1329.329425945" watchObservedRunningTime="2026-03-12 17:13:06.795469206 +0000 UTC m=+1329.336780555" Mar 12 17:13:06 crc kubenswrapper[5184]: I0312 17:13:06.823732 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=39.823708846 podStartE2EDuration="39.823708846s" podCreationTimestamp="2026-03-12 17:12:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:13:06.812655671 +0000 UTC m=+1329.353967010" watchObservedRunningTime="2026-03-12 17:13:06.823708846 +0000 UTC m=+1329.365020195" Mar 12 17:13:16 crc kubenswrapper[5184]: I0312 17:13:16.850416 5184 generic.go:358] "Generic (PLEG): container finished" podID="37d0ff5d-5881-483f-ab1f-ff2385c623ad" containerID="feffca00cec232b3cb3f45fa51ee53d6d297fdf979e7f23054b0aeb3862b2a83" exitCode=0 Mar 12 
17:13:16 crc kubenswrapper[5184]: I0312 17:13:16.851576 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h59fv" event={"ID":"37d0ff5d-5881-483f-ab1f-ff2385c623ad","Type":"ContainerDied","Data":"feffca00cec232b3cb3f45fa51ee53d6d297fdf979e7f23054b0aeb3862b2a83"} Mar 12 17:13:17 crc kubenswrapper[5184]: I0312 17:13:17.723698 5184 scope.go:117] "RemoveContainer" containerID="23c501bb01a1d9530c9bebbac8cdf1029b7f9e84ac59e4f2c52258f70e1429d4" Mar 12 17:13:17 crc kubenswrapper[5184]: I0312 17:13:17.753413 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 12 17:13:17 crc kubenswrapper[5184]: I0312 17:13:17.754717 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 12 17:13:17 crc kubenswrapper[5184]: I0312 17:13:17.762914 5184 scope.go:117] "RemoveContainer" containerID="3aa5746447337e8f540aac81c42f999892b313ce2dd32bb753981198cf2bd6b6" Mar 12 17:13:17 crc kubenswrapper[5184]: I0312 17:13:17.812358 5184 scope.go:117] "RemoveContainer" containerID="2f0b38f45ba943215b903a45946497d845e93c8ebc8188a0302f138e3bd1247b" Mar 12 17:13:18 crc kubenswrapper[5184]: I0312 17:13:18.389260 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h59fv" Mar 12 17:13:18 crc kubenswrapper[5184]: I0312 17:13:18.410899 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/37d0ff5d-5881-483f-ab1f-ff2385c623ad-ssh-key-openstack-edpm-ipam\") pod \"37d0ff5d-5881-483f-ab1f-ff2385c623ad\" (UID: \"37d0ff5d-5881-483f-ab1f-ff2385c623ad\") " Mar 12 17:13:18 crc kubenswrapper[5184]: I0312 17:13:18.411052 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d0ff5d-5881-483f-ab1f-ff2385c623ad-repo-setup-combined-ca-bundle\") pod \"37d0ff5d-5881-483f-ab1f-ff2385c623ad\" (UID: \"37d0ff5d-5881-483f-ab1f-ff2385c623ad\") " Mar 12 17:13:18 crc kubenswrapper[5184]: I0312 17:13:18.411194 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6dn6\" (UniqueName: \"kubernetes.io/projected/37d0ff5d-5881-483f-ab1f-ff2385c623ad-kube-api-access-z6dn6\") pod \"37d0ff5d-5881-483f-ab1f-ff2385c623ad\" (UID: \"37d0ff5d-5881-483f-ab1f-ff2385c623ad\") " Mar 12 17:13:18 crc kubenswrapper[5184]: I0312 17:13:18.411243 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37d0ff5d-5881-483f-ab1f-ff2385c623ad-inventory\") pod \"37d0ff5d-5881-483f-ab1f-ff2385c623ad\" (UID: \"37d0ff5d-5881-483f-ab1f-ff2385c623ad\") " Mar 12 17:13:18 crc kubenswrapper[5184]: I0312 17:13:18.418168 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37d0ff5d-5881-483f-ab1f-ff2385c623ad-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "37d0ff5d-5881-483f-ab1f-ff2385c623ad" (UID: "37d0ff5d-5881-483f-ab1f-ff2385c623ad"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:13:18 crc kubenswrapper[5184]: I0312 17:13:18.424175 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37d0ff5d-5881-483f-ab1f-ff2385c623ad-kube-api-access-z6dn6" (OuterVolumeSpecName: "kube-api-access-z6dn6") pod "37d0ff5d-5881-483f-ab1f-ff2385c623ad" (UID: "37d0ff5d-5881-483f-ab1f-ff2385c623ad"). InnerVolumeSpecName "kube-api-access-z6dn6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:13:18 crc kubenswrapper[5184]: I0312 17:13:18.452505 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37d0ff5d-5881-483f-ab1f-ff2385c623ad-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "37d0ff5d-5881-483f-ab1f-ff2385c623ad" (UID: "37d0ff5d-5881-483f-ab1f-ff2385c623ad"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:13:18 crc kubenswrapper[5184]: I0312 17:13:18.455131 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37d0ff5d-5881-483f-ab1f-ff2385c623ad-inventory" (OuterVolumeSpecName: "inventory") pod "37d0ff5d-5881-483f-ab1f-ff2385c623ad" (UID: "37d0ff5d-5881-483f-ab1f-ff2385c623ad"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:13:18 crc kubenswrapper[5184]: I0312 17:13:18.514390 5184 reconciler_common.go:299] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37d0ff5d-5881-483f-ab1f-ff2385c623ad-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 17:13:18 crc kubenswrapper[5184]: I0312 17:13:18.514424 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z6dn6\" (UniqueName: \"kubernetes.io/projected/37d0ff5d-5881-483f-ab1f-ff2385c623ad-kube-api-access-z6dn6\") on node \"crc\" DevicePath \"\"" Mar 12 17:13:18 crc kubenswrapper[5184]: I0312 17:13:18.514435 5184 reconciler_common.go:299] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37d0ff5d-5881-483f-ab1f-ff2385c623ad-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 17:13:18 crc kubenswrapper[5184]: I0312 17:13:18.514443 5184 reconciler_common.go:299] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/37d0ff5d-5881-483f-ab1f-ff2385c623ad-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 17:13:18 crc kubenswrapper[5184]: I0312 17:13:18.882596 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h59fv" event={"ID":"37d0ff5d-5881-483f-ab1f-ff2385c623ad","Type":"ContainerDied","Data":"ca591db6b98d220b7f6c779a4cf35e8df7afe4f99d3403c5df1a7562912df091"} Mar 12 17:13:18 crc kubenswrapper[5184]: I0312 17:13:18.882866 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca591db6b98d220b7f6c779a4cf35e8df7afe4f99d3403c5df1a7562912df091" Mar 12 17:13:18 crc kubenswrapper[5184]: I0312 17:13:18.882618 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h59fv" Mar 12 17:13:18 crc kubenswrapper[5184]: I0312 17:13:18.957480 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-czdtb"] Mar 12 17:13:18 crc kubenswrapper[5184]: I0312 17:13:18.958461 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="37d0ff5d-5881-483f-ab1f-ff2385c623ad" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 12 17:13:18 crc kubenswrapper[5184]: I0312 17:13:18.958478 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="37d0ff5d-5881-483f-ab1f-ff2385c623ad" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 12 17:13:18 crc kubenswrapper[5184]: I0312 17:13:18.958684 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="37d0ff5d-5881-483f-ab1f-ff2385c623ad" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 12 17:13:18 crc kubenswrapper[5184]: I0312 17:13:18.964727 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-czdtb" Mar 12 17:13:18 crc kubenswrapper[5184]: I0312 17:13:18.968898 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplane-ansible-ssh-private-key-secret\"" Mar 12 17:13:18 crc kubenswrapper[5184]: I0312 17:13:18.969185 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-aee-default-env\"" Mar 12 17:13:18 crc kubenswrapper[5184]: I0312 17:13:18.969324 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"openstack-edpm-ipam-dockercfg-qr8nl\"" Mar 12 17:13:18 crc kubenswrapper[5184]: I0312 17:13:18.971460 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplanenodeset-openstack-edpm-ipam\"" Mar 12 17:13:18 crc kubenswrapper[5184]: I0312 17:13:18.987700 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-czdtb"] Mar 12 17:13:19 crc kubenswrapper[5184]: I0312 17:13:19.044094 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00325ca6-5bba-4ac7-8ef7-0b21163fe2af-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-czdtb\" (UID: \"00325ca6-5bba-4ac7-8ef7-0b21163fe2af\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-czdtb" Mar 12 17:13:19 crc kubenswrapper[5184]: I0312 17:13:19.044467 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/00325ca6-5bba-4ac7-8ef7-0b21163fe2af-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-czdtb\" (UID: \"00325ca6-5bba-4ac7-8ef7-0b21163fe2af\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-czdtb" Mar 12 17:13:19 crc kubenswrapper[5184]: I0312 17:13:19.044615 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp5lr\" (UniqueName: \"kubernetes.io/projected/00325ca6-5bba-4ac7-8ef7-0b21163fe2af-kube-api-access-kp5lr\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-czdtb\" (UID: \"00325ca6-5bba-4ac7-8ef7-0b21163fe2af\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-czdtb" Mar 12 17:13:19 crc kubenswrapper[5184]: I0312 17:13:19.044733 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/00325ca6-5bba-4ac7-8ef7-0b21163fe2af-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-czdtb\" (UID: \"00325ca6-5bba-4ac7-8ef7-0b21163fe2af\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-czdtb" Mar 12 17:13:19 crc kubenswrapper[5184]: I0312 17:13:19.146553 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/00325ca6-5bba-4ac7-8ef7-0b21163fe2af-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-czdtb\" (UID: \"00325ca6-5bba-4ac7-8ef7-0b21163fe2af\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-czdtb" Mar 12 17:13:19 crc kubenswrapper[5184]: I0312 17:13:19.146946 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kp5lr\" (UniqueName: \"kubernetes.io/projected/00325ca6-5bba-4ac7-8ef7-0b21163fe2af-kube-api-access-kp5lr\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-czdtb\" (UID: \"00325ca6-5bba-4ac7-8ef7-0b21163fe2af\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-czdtb" Mar 12 17:13:19 crc kubenswrapper[5184]: I0312 17:13:19.147065 5184 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/00325ca6-5bba-4ac7-8ef7-0b21163fe2af-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-czdtb\" (UID: \"00325ca6-5bba-4ac7-8ef7-0b21163fe2af\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-czdtb" Mar 12 17:13:19 crc kubenswrapper[5184]: I0312 17:13:19.147225 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00325ca6-5bba-4ac7-8ef7-0b21163fe2af-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-czdtb\" (UID: \"00325ca6-5bba-4ac7-8ef7-0b21163fe2af\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-czdtb" Mar 12 17:13:19 crc kubenswrapper[5184]: I0312 17:13:19.151980 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/00325ca6-5bba-4ac7-8ef7-0b21163fe2af-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-czdtb\" (UID: \"00325ca6-5bba-4ac7-8ef7-0b21163fe2af\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-czdtb" Mar 12 17:13:19 crc kubenswrapper[5184]: I0312 17:13:19.152299 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00325ca6-5bba-4ac7-8ef7-0b21163fe2af-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-czdtb\" (UID: \"00325ca6-5bba-4ac7-8ef7-0b21163fe2af\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-czdtb" Mar 12 17:13:19 crc kubenswrapper[5184]: I0312 17:13:19.157220 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/00325ca6-5bba-4ac7-8ef7-0b21163fe2af-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-czdtb\" (UID: 
\"00325ca6-5bba-4ac7-8ef7-0b21163fe2af\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-czdtb" Mar 12 17:13:19 crc kubenswrapper[5184]: I0312 17:13:19.161896 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp5lr\" (UniqueName: \"kubernetes.io/projected/00325ca6-5bba-4ac7-8ef7-0b21163fe2af-kube-api-access-kp5lr\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-czdtb\" (UID: \"00325ca6-5bba-4ac7-8ef7-0b21163fe2af\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-czdtb" Mar 12 17:13:19 crc kubenswrapper[5184]: I0312 17:13:19.289803 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-czdtb" Mar 12 17:13:19 crc kubenswrapper[5184]: W0312 17:13:19.844963 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00325ca6_5bba_4ac7_8ef7_0b21163fe2af.slice/crio-7c64015948fc430d3fc67bafaa5c75d73842ea8057c1f66c3e8f0d761f129ee5 WatchSource:0}: Error finding container 7c64015948fc430d3fc67bafaa5c75d73842ea8057c1f66c3e8f0d761f129ee5: Status 404 returned error can't find the container with id 7c64015948fc430d3fc67bafaa5c75d73842ea8057c1f66c3e8f0d761f129ee5 Mar 12 17:13:19 crc kubenswrapper[5184]: I0312 17:13:19.845331 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-czdtb"] Mar 12 17:13:19 crc kubenswrapper[5184]: I0312 17:13:19.894529 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-czdtb" event={"ID":"00325ca6-5bba-4ac7-8ef7-0b21163fe2af","Type":"ContainerStarted","Data":"7c64015948fc430d3fc67bafaa5c75d73842ea8057c1f66c3e8f0d761f129ee5"} Mar 12 17:13:20 crc kubenswrapper[5184]: I0312 17:13:20.742662 5184 patch_prober.go:28] interesting pod/machine-config-daemon-cp7pt container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 17:13:20 crc kubenswrapper[5184]: I0312 17:13:20.743079 5184 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 17:13:20 crc kubenswrapper[5184]: I0312 17:13:20.743157 5184 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" Mar 12 17:13:20 crc kubenswrapper[5184]: I0312 17:13:20.744367 5184 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"42ed46ee5dbf0d27675a5969e00cdc1d30283a154524122596ea10898f42720f"} pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 17:13:20 crc kubenswrapper[5184]: I0312 17:13:20.744492 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerName="machine-config-daemon" containerID="cri-o://42ed46ee5dbf0d27675a5969e00cdc1d30283a154524122596ea10898f42720f" gracePeriod=600 Mar 12 17:13:20 crc kubenswrapper[5184]: I0312 17:13:20.912707 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-czdtb" event={"ID":"00325ca6-5bba-4ac7-8ef7-0b21163fe2af","Type":"ContainerStarted","Data":"6d8a5d243e7a8e9238416e7eb516fd42474b99f8b9dbe0824ed08e85aeedd45e"} Mar 12 17:13:20 crc kubenswrapper[5184]: 
I0312 17:13:20.915921 5184 generic.go:358] "Generic (PLEG): container finished" podID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerID="42ed46ee5dbf0d27675a5969e00cdc1d30283a154524122596ea10898f42720f" exitCode=0 Mar 12 17:13:20 crc kubenswrapper[5184]: I0312 17:13:20.916148 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" event={"ID":"7b45c859-3d05-4214-9bd3-2952546f5dea","Type":"ContainerDied","Data":"42ed46ee5dbf0d27675a5969e00cdc1d30283a154524122596ea10898f42720f"} Mar 12 17:13:20 crc kubenswrapper[5184]: I0312 17:13:20.916210 5184 scope.go:117] "RemoveContainer" containerID="e97f86449204164890c97bdd96ba2e452210b3be2c7fc1815ab56658e4653bed" Mar 12 17:13:20 crc kubenswrapper[5184]: I0312 17:13:20.941121 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-czdtb" podStartSLOduration=2.473394796 podStartE2EDuration="2.941100854s" podCreationTimestamp="2026-03-12 17:13:18 +0000 UTC" firstStartedPulling="2026-03-12 17:13:19.847066219 +0000 UTC m=+1342.388377558" lastFinishedPulling="2026-03-12 17:13:20.314772277 +0000 UTC m=+1342.856083616" observedRunningTime="2026-03-12 17:13:20.935150849 +0000 UTC m=+1343.476462188" watchObservedRunningTime="2026-03-12 17:13:20.941100854 +0000 UTC m=+1343.482412193" Mar 12 17:13:21 crc kubenswrapper[5184]: I0312 17:13:21.932367 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" event={"ID":"7b45c859-3d05-4214-9bd3-2952546f5dea","Type":"ContainerStarted","Data":"7539a33836cf02bc296a008cedc3ee58f1ef87b38ca5b5f9414731708b87618f"} Mar 12 17:13:49 crc kubenswrapper[5184]: E0312 17:13:49.112623 5184 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1794103 actualBytes=10240 Mar 12 17:14:00 crc kubenswrapper[5184]: I0312 17:14:00.152103 5184 kubelet.go:2537] 
"SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555594-6csj4"] Mar 12 17:14:00 crc kubenswrapper[5184]: I0312 17:14:00.308428 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555594-6csj4"] Mar 12 17:14:00 crc kubenswrapper[5184]: I0312 17:14:00.308566 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555594-6csj4" Mar 12 17:14:00 crc kubenswrapper[5184]: I0312 17:14:00.310650 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\"" Mar 12 17:14:00 crc kubenswrapper[5184]: I0312 17:14:00.310928 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\"" Mar 12 17:14:00 crc kubenswrapper[5184]: I0312 17:14:00.312493 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-f4gpz\"" Mar 12 17:14:00 crc kubenswrapper[5184]: I0312 17:14:00.483424 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxl8r\" (UniqueName: \"kubernetes.io/projected/02d00a19-eeb1-4bc6-a42c-b2bd721cdf6f-kube-api-access-bxl8r\") pod \"auto-csr-approver-29555594-6csj4\" (UID: \"02d00a19-eeb1-4bc6-a42c-b2bd721cdf6f\") " pod="openshift-infra/auto-csr-approver-29555594-6csj4" Mar 12 17:14:00 crc kubenswrapper[5184]: I0312 17:14:00.586781 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bxl8r\" (UniqueName: \"kubernetes.io/projected/02d00a19-eeb1-4bc6-a42c-b2bd721cdf6f-kube-api-access-bxl8r\") pod \"auto-csr-approver-29555594-6csj4\" (UID: \"02d00a19-eeb1-4bc6-a42c-b2bd721cdf6f\") " pod="openshift-infra/auto-csr-approver-29555594-6csj4" Mar 12 17:14:00 crc kubenswrapper[5184]: I0312 17:14:00.611099 5184 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-bxl8r\" (UniqueName: \"kubernetes.io/projected/02d00a19-eeb1-4bc6-a42c-b2bd721cdf6f-kube-api-access-bxl8r\") pod \"auto-csr-approver-29555594-6csj4\" (UID: \"02d00a19-eeb1-4bc6-a42c-b2bd721cdf6f\") " pod="openshift-infra/auto-csr-approver-29555594-6csj4" Mar 12 17:14:00 crc kubenswrapper[5184]: I0312 17:14:00.634009 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555594-6csj4" Mar 12 17:14:01 crc kubenswrapper[5184]: I0312 17:14:01.141329 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555594-6csj4"] Mar 12 17:14:01 crc kubenswrapper[5184]: W0312 17:14:01.148616 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02d00a19_eeb1_4bc6_a42c_b2bd721cdf6f.slice/crio-90341a4aafb4f505102acadab18bc8457b9075a470644d0c8954870515ce7780 WatchSource:0}: Error finding container 90341a4aafb4f505102acadab18bc8457b9075a470644d0c8954870515ce7780: Status 404 returned error can't find the container with id 90341a4aafb4f505102acadab18bc8457b9075a470644d0c8954870515ce7780 Mar 12 17:14:01 crc kubenswrapper[5184]: I0312 17:14:01.366703 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555594-6csj4" event={"ID":"02d00a19-eeb1-4bc6-a42c-b2bd721cdf6f","Type":"ContainerStarted","Data":"90341a4aafb4f505102acadab18bc8457b9075a470644d0c8954870515ce7780"} Mar 12 17:14:03 crc kubenswrapper[5184]: I0312 17:14:03.388659 5184 generic.go:358] "Generic (PLEG): container finished" podID="02d00a19-eeb1-4bc6-a42c-b2bd721cdf6f" containerID="0b3952293217e4718a41a9999e52a1e27c9d2143a6b3ebc866a5030a19190c7a" exitCode=0 Mar 12 17:14:03 crc kubenswrapper[5184]: I0312 17:14:03.388764 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555594-6csj4" 
event={"ID":"02d00a19-eeb1-4bc6-a42c-b2bd721cdf6f","Type":"ContainerDied","Data":"0b3952293217e4718a41a9999e52a1e27c9d2143a6b3ebc866a5030a19190c7a"}
Mar 12 17:14:05 crc kubenswrapper[5184]: I0312 17:14:05.672297 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555594-6csj4"
Mar 12 17:14:05 crc kubenswrapper[5184]: I0312 17:14:05.698271 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxl8r\" (UniqueName: \"kubernetes.io/projected/02d00a19-eeb1-4bc6-a42c-b2bd721cdf6f-kube-api-access-bxl8r\") pod \"02d00a19-eeb1-4bc6-a42c-b2bd721cdf6f\" (UID: \"02d00a19-eeb1-4bc6-a42c-b2bd721cdf6f\") "
Mar 12 17:14:05 crc kubenswrapper[5184]: I0312 17:14:05.705756 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02d00a19-eeb1-4bc6-a42c-b2bd721cdf6f-kube-api-access-bxl8r" (OuterVolumeSpecName: "kube-api-access-bxl8r") pod "02d00a19-eeb1-4bc6-a42c-b2bd721cdf6f" (UID: "02d00a19-eeb1-4bc6-a42c-b2bd721cdf6f"). InnerVolumeSpecName "kube-api-access-bxl8r". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:14:05 crc kubenswrapper[5184]: I0312 17:14:05.801077 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bxl8r\" (UniqueName: \"kubernetes.io/projected/02d00a19-eeb1-4bc6-a42c-b2bd721cdf6f-kube-api-access-bxl8r\") on node \"crc\" DevicePath \"\""
Mar 12 17:14:06 crc kubenswrapper[5184]: I0312 17:14:06.429903 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555594-6csj4"
Mar 12 17:14:06 crc kubenswrapper[5184]: I0312 17:14:06.429904 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555594-6csj4" event={"ID":"02d00a19-eeb1-4bc6-a42c-b2bd721cdf6f","Type":"ContainerDied","Data":"90341a4aafb4f505102acadab18bc8457b9075a470644d0c8954870515ce7780"}
Mar 12 17:14:06 crc kubenswrapper[5184]: I0312 17:14:06.430048 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90341a4aafb4f505102acadab18bc8457b9075a470644d0c8954870515ce7780"
Mar 12 17:14:06 crc kubenswrapper[5184]: I0312 17:14:06.733388 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555588-xpxvn"]
Mar 12 17:14:06 crc kubenswrapper[5184]: I0312 17:14:06.742276 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555588-xpxvn"]
Mar 12 17:14:08 crc kubenswrapper[5184]: I0312 17:14:08.415453 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="552cda96-d016-4ff4-9bc2-9cf835b31dfe" path="/var/lib/kubelet/pods/552cda96-d016-4ff4-9bc2-9cf835b31dfe/volumes"
Mar 12 17:14:18 crc kubenswrapper[5184]: I0312 17:14:18.087491 5184 scope.go:117] "RemoveContainer" containerID="f41b5c6efce43c3a2aadec9482b88331de4149618247f6d48203d9b65bdccfad"
Mar 12 17:14:18 crc kubenswrapper[5184]: I0312 17:14:18.118165 5184 scope.go:117] "RemoveContainer" containerID="46133b6ac7d76f48a753facdcf31a865d6acf3f1e3c4a6cdcf1b5951539c6354"
Mar 12 17:14:18 crc kubenswrapper[5184]: I0312 17:14:18.167437 5184 scope.go:117] "RemoveContainer" containerID="2b059a0618396fd4c44a921ac9e8a2ef55e86f0811d130bfc354b1031772e615"
Mar 12 17:14:18 crc kubenswrapper[5184]: I0312 17:14:18.248783 5184 scope.go:117] "RemoveContainer" containerID="91ec094eded41029257ca7ab2c02e818b22d55adde67347c58645cfc3ac9ff5e"
Mar 12 17:14:18 crc kubenswrapper[5184]: I0312 17:14:18.364603 5184 scope.go:117] "RemoveContainer" containerID="834b98ec3a66b698c2809a195d4a511309154a427bf6f45a2a23c132afcd0771"
Mar 12 17:14:26 crc kubenswrapper[5184]: I0312 17:14:26.838787 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4d5bl"]
Mar 12 17:14:26 crc kubenswrapper[5184]: I0312 17:14:26.840473 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="02d00a19-eeb1-4bc6-a42c-b2bd721cdf6f" containerName="oc"
Mar 12 17:14:26 crc kubenswrapper[5184]: I0312 17:14:26.840488 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="02d00a19-eeb1-4bc6-a42c-b2bd721cdf6f" containerName="oc"
Mar 12 17:14:26 crc kubenswrapper[5184]: I0312 17:14:26.840728 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="02d00a19-eeb1-4bc6-a42c-b2bd721cdf6f" containerName="oc"
Mar 12 17:14:26 crc kubenswrapper[5184]: I0312 17:14:26.846723 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4d5bl"
Mar 12 17:14:26 crc kubenswrapper[5184]: I0312 17:14:26.856154 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4d5bl"]
Mar 12 17:14:26 crc kubenswrapper[5184]: I0312 17:14:26.861483 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/325a20e9-85f9-4a61-8441-faf56e9f7640-utilities\") pod \"redhat-operators-4d5bl\" (UID: \"325a20e9-85f9-4a61-8441-faf56e9f7640\") " pod="openshift-marketplace/redhat-operators-4d5bl"
Mar 12 17:14:26 crc kubenswrapper[5184]: I0312 17:14:26.861539 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjg8s\" (UniqueName: \"kubernetes.io/projected/325a20e9-85f9-4a61-8441-faf56e9f7640-kube-api-access-gjg8s\") pod \"redhat-operators-4d5bl\" (UID: \"325a20e9-85f9-4a61-8441-faf56e9f7640\") " pod="openshift-marketplace/redhat-operators-4d5bl"
Mar 12 17:14:26 crc kubenswrapper[5184]: I0312 17:14:26.861622 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/325a20e9-85f9-4a61-8441-faf56e9f7640-catalog-content\") pod \"redhat-operators-4d5bl\" (UID: \"325a20e9-85f9-4a61-8441-faf56e9f7640\") " pod="openshift-marketplace/redhat-operators-4d5bl"
Mar 12 17:14:26 crc kubenswrapper[5184]: I0312 17:14:26.963172 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/325a20e9-85f9-4a61-8441-faf56e9f7640-utilities\") pod \"redhat-operators-4d5bl\" (UID: \"325a20e9-85f9-4a61-8441-faf56e9f7640\") " pod="openshift-marketplace/redhat-operators-4d5bl"
Mar 12 17:14:26 crc kubenswrapper[5184]: I0312 17:14:26.963225 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gjg8s\" (UniqueName: \"kubernetes.io/projected/325a20e9-85f9-4a61-8441-faf56e9f7640-kube-api-access-gjg8s\") pod \"redhat-operators-4d5bl\" (UID: \"325a20e9-85f9-4a61-8441-faf56e9f7640\") " pod="openshift-marketplace/redhat-operators-4d5bl"
Mar 12 17:14:26 crc kubenswrapper[5184]: I0312 17:14:26.963285 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/325a20e9-85f9-4a61-8441-faf56e9f7640-catalog-content\") pod \"redhat-operators-4d5bl\" (UID: \"325a20e9-85f9-4a61-8441-faf56e9f7640\") " pod="openshift-marketplace/redhat-operators-4d5bl"
Mar 12 17:14:26 crc kubenswrapper[5184]: I0312 17:14:26.963737 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/325a20e9-85f9-4a61-8441-faf56e9f7640-catalog-content\") pod \"redhat-operators-4d5bl\" (UID: \"325a20e9-85f9-4a61-8441-faf56e9f7640\") " pod="openshift-marketplace/redhat-operators-4d5bl"
Mar 12 17:14:26 crc kubenswrapper[5184]: I0312 17:14:26.963952 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/325a20e9-85f9-4a61-8441-faf56e9f7640-utilities\") pod \"redhat-operators-4d5bl\" (UID: \"325a20e9-85f9-4a61-8441-faf56e9f7640\") " pod="openshift-marketplace/redhat-operators-4d5bl"
Mar 12 17:14:26 crc kubenswrapper[5184]: I0312 17:14:26.986132 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjg8s\" (UniqueName: \"kubernetes.io/projected/325a20e9-85f9-4a61-8441-faf56e9f7640-kube-api-access-gjg8s\") pod \"redhat-operators-4d5bl\" (UID: \"325a20e9-85f9-4a61-8441-faf56e9f7640\") " pod="openshift-marketplace/redhat-operators-4d5bl"
Mar 12 17:14:27 crc kubenswrapper[5184]: I0312 17:14:27.206080 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4d5bl"
Mar 12 17:14:27 crc kubenswrapper[5184]: I0312 17:14:27.679527 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4d5bl"]
Mar 12 17:14:27 crc kubenswrapper[5184]: W0312 17:14:27.680534 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod325a20e9_85f9_4a61_8441_faf56e9f7640.slice/crio-1f7e53514fb3107321a7a11ed4c309df2865ac4ced0a501d38103ddd630a7e0c WatchSource:0}: Error finding container 1f7e53514fb3107321a7a11ed4c309df2865ac4ced0a501d38103ddd630a7e0c: Status 404 returned error can't find the container with id 1f7e53514fb3107321a7a11ed4c309df2865ac4ced0a501d38103ddd630a7e0c
Mar 12 17:14:29 crc kubenswrapper[5184]: I0312 17:14:29.383015 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4d5bl" event={"ID":"325a20e9-85f9-4a61-8441-faf56e9f7640","Type":"ContainerStarted","Data":"1f7e53514fb3107321a7a11ed4c309df2865ac4ced0a501d38103ddd630a7e0c"}
Mar 12 17:14:30 crc kubenswrapper[5184]: I0312 17:14:30.393789 5184 generic.go:358] "Generic (PLEG): container finished" podID="325a20e9-85f9-4a61-8441-faf56e9f7640" containerID="a2bfebf52aa101ab04d5f51ef76391bc12adc6fa87531e96712b6ba927398887" exitCode=0
Mar 12 17:14:30 crc kubenswrapper[5184]: I0312 17:14:30.393878 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4d5bl" event={"ID":"325a20e9-85f9-4a61-8441-faf56e9f7640","Type":"ContainerDied","Data":"a2bfebf52aa101ab04d5f51ef76391bc12adc6fa87531e96712b6ba927398887"}
Mar 12 17:14:32 crc kubenswrapper[5184]: I0312 17:14:32.416137 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4d5bl" event={"ID":"325a20e9-85f9-4a61-8441-faf56e9f7640","Type":"ContainerStarted","Data":"db689fdfe684307f818e2c78914932c87e6f645a2f93fa054c0d163e5d94dd22"}
Mar 12 17:14:35 crc kubenswrapper[5184]: I0312 17:14:35.449540 5184 generic.go:358] "Generic (PLEG): container finished" podID="325a20e9-85f9-4a61-8441-faf56e9f7640" containerID="db689fdfe684307f818e2c78914932c87e6f645a2f93fa054c0d163e5d94dd22" exitCode=0
Mar 12 17:14:35 crc kubenswrapper[5184]: I0312 17:14:35.449626 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4d5bl" event={"ID":"325a20e9-85f9-4a61-8441-faf56e9f7640","Type":"ContainerDied","Data":"db689fdfe684307f818e2c78914932c87e6f645a2f93fa054c0d163e5d94dd22"}
Mar 12 17:14:36 crc kubenswrapper[5184]: I0312 17:14:36.461125 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4d5bl" event={"ID":"325a20e9-85f9-4a61-8441-faf56e9f7640","Type":"ContainerStarted","Data":"41866b29562b63f4add59677e393560b094d221ae11f930cb507417c496a74f7"}
Mar 12 17:14:36 crc kubenswrapper[5184]: I0312 17:14:36.482766 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4d5bl" podStartSLOduration=9.305961261 podStartE2EDuration="10.482742083s" podCreationTimestamp="2026-03-12 17:14:26 +0000 UTC" firstStartedPulling="2026-03-12 17:14:30.395589086 +0000 UTC m=+1412.936900455" lastFinishedPulling="2026-03-12 17:14:31.572369898 +0000 UTC m=+1414.113681277" observedRunningTime="2026-03-12 17:14:36.481737552 +0000 UTC m=+1419.023048931" watchObservedRunningTime="2026-03-12 17:14:36.482742083 +0000 UTC m=+1419.024053452"
Mar 12 17:14:37 crc kubenswrapper[5184]: I0312 17:14:37.207159 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-operators-4d5bl"
Mar 12 17:14:37 crc kubenswrapper[5184]: I0312 17:14:37.207218 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4d5bl"
Mar 12 17:14:38 crc kubenswrapper[5184]: I0312 17:14:38.285573 5184 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4d5bl" podUID="325a20e9-85f9-4a61-8441-faf56e9f7640" containerName="registry-server" probeResult="failure" output=<
Mar 12 17:14:38 crc kubenswrapper[5184]: timeout: failed to connect service ":50051" within 1s
Mar 12 17:14:38 crc kubenswrapper[5184]: >
Mar 12 17:14:47 crc kubenswrapper[5184]: I0312 17:14:47.253416 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4d5bl"
Mar 12 17:14:47 crc kubenswrapper[5184]: I0312 17:14:47.317603 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4d5bl"
Mar 12 17:14:47 crc kubenswrapper[5184]: I0312 17:14:47.492265 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4d5bl"]
Mar 12 17:14:48 crc kubenswrapper[5184]: I0312 17:14:48.631122 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4d5bl" podUID="325a20e9-85f9-4a61-8441-faf56e9f7640" containerName="registry-server" containerID="cri-o://41866b29562b63f4add59677e393560b094d221ae11f930cb507417c496a74f7" gracePeriod=2
Mar 12 17:14:49 crc kubenswrapper[5184]: I0312 17:14:49.088132 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4d5bl"
Mar 12 17:14:49 crc kubenswrapper[5184]: E0312 17:14:49.128106 5184 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1870318 actualBytes=10240
Mar 12 17:14:49 crc kubenswrapper[5184]: I0312 17:14:49.151003 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjg8s\" (UniqueName: \"kubernetes.io/projected/325a20e9-85f9-4a61-8441-faf56e9f7640-kube-api-access-gjg8s\") pod \"325a20e9-85f9-4a61-8441-faf56e9f7640\" (UID: \"325a20e9-85f9-4a61-8441-faf56e9f7640\") "
Mar 12 17:14:49 crc kubenswrapper[5184]: I0312 17:14:49.151102 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/325a20e9-85f9-4a61-8441-faf56e9f7640-utilities\") pod \"325a20e9-85f9-4a61-8441-faf56e9f7640\" (UID: \"325a20e9-85f9-4a61-8441-faf56e9f7640\") "
Mar 12 17:14:49 crc kubenswrapper[5184]: I0312 17:14:49.151189 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/325a20e9-85f9-4a61-8441-faf56e9f7640-catalog-content\") pod \"325a20e9-85f9-4a61-8441-faf56e9f7640\" (UID: \"325a20e9-85f9-4a61-8441-faf56e9f7640\") "
Mar 12 17:14:49 crc kubenswrapper[5184]: I0312 17:14:49.153302 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/325a20e9-85f9-4a61-8441-faf56e9f7640-utilities" (OuterVolumeSpecName: "utilities") pod "325a20e9-85f9-4a61-8441-faf56e9f7640" (UID: "325a20e9-85f9-4a61-8441-faf56e9f7640"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 12 17:14:49 crc kubenswrapper[5184]: I0312 17:14:49.159849 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/325a20e9-85f9-4a61-8441-faf56e9f7640-kube-api-access-gjg8s" (OuterVolumeSpecName: "kube-api-access-gjg8s") pod "325a20e9-85f9-4a61-8441-faf56e9f7640" (UID: "325a20e9-85f9-4a61-8441-faf56e9f7640"). InnerVolumeSpecName "kube-api-access-gjg8s". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:14:49 crc kubenswrapper[5184]: I0312 17:14:49.253643 5184 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/325a20e9-85f9-4a61-8441-faf56e9f7640-utilities\") on node \"crc\" DevicePath \"\""
Mar 12 17:14:49 crc kubenswrapper[5184]: I0312 17:14:49.253683 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gjg8s\" (UniqueName: \"kubernetes.io/projected/325a20e9-85f9-4a61-8441-faf56e9f7640-kube-api-access-gjg8s\") on node \"crc\" DevicePath \"\""
Mar 12 17:14:49 crc kubenswrapper[5184]: I0312 17:14:49.274018 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/325a20e9-85f9-4a61-8441-faf56e9f7640-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "325a20e9-85f9-4a61-8441-faf56e9f7640" (UID: "325a20e9-85f9-4a61-8441-faf56e9f7640"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 12 17:14:49 crc kubenswrapper[5184]: I0312 17:14:49.355574 5184 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/325a20e9-85f9-4a61-8441-faf56e9f7640-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 12 17:14:49 crc kubenswrapper[5184]: I0312 17:14:49.643339 5184 generic.go:358] "Generic (PLEG): container finished" podID="325a20e9-85f9-4a61-8441-faf56e9f7640" containerID="41866b29562b63f4add59677e393560b094d221ae11f930cb507417c496a74f7" exitCode=0
Mar 12 17:14:49 crc kubenswrapper[5184]: I0312 17:14:49.643569 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4d5bl" event={"ID":"325a20e9-85f9-4a61-8441-faf56e9f7640","Type":"ContainerDied","Data":"41866b29562b63f4add59677e393560b094d221ae11f930cb507417c496a74f7"}
Mar 12 17:14:49 crc kubenswrapper[5184]: I0312 17:14:49.643598 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4d5bl" event={"ID":"325a20e9-85f9-4a61-8441-faf56e9f7640","Type":"ContainerDied","Data":"1f7e53514fb3107321a7a11ed4c309df2865ac4ced0a501d38103ddd630a7e0c"}
Mar 12 17:14:49 crc kubenswrapper[5184]: I0312 17:14:49.643614 5184 scope.go:117] "RemoveContainer" containerID="41866b29562b63f4add59677e393560b094d221ae11f930cb507417c496a74f7"
Mar 12 17:14:49 crc kubenswrapper[5184]: I0312 17:14:49.643744 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4d5bl"
Mar 12 17:14:49 crc kubenswrapper[5184]: I0312 17:14:49.695987 5184 scope.go:117] "RemoveContainer" containerID="db689fdfe684307f818e2c78914932c87e6f645a2f93fa054c0d163e5d94dd22"
Mar 12 17:14:49 crc kubenswrapper[5184]: I0312 17:14:49.701830 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4d5bl"]
Mar 12 17:14:49 crc kubenswrapper[5184]: I0312 17:14:49.714018 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4d5bl"]
Mar 12 17:14:49 crc kubenswrapper[5184]: I0312 17:14:49.720414 5184 scope.go:117] "RemoveContainer" containerID="a2bfebf52aa101ab04d5f51ef76391bc12adc6fa87531e96712b6ba927398887"
Mar 12 17:14:49 crc kubenswrapper[5184]: I0312 17:14:49.780530 5184 scope.go:117] "RemoveContainer" containerID="41866b29562b63f4add59677e393560b094d221ae11f930cb507417c496a74f7"
Mar 12 17:14:49 crc kubenswrapper[5184]: E0312 17:14:49.780950 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41866b29562b63f4add59677e393560b094d221ae11f930cb507417c496a74f7\": container with ID starting with 41866b29562b63f4add59677e393560b094d221ae11f930cb507417c496a74f7 not found: ID does not exist" containerID="41866b29562b63f4add59677e393560b094d221ae11f930cb507417c496a74f7"
Mar 12 17:14:49 crc kubenswrapper[5184]: I0312 17:14:49.781012 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41866b29562b63f4add59677e393560b094d221ae11f930cb507417c496a74f7"} err="failed to get container status \"41866b29562b63f4add59677e393560b094d221ae11f930cb507417c496a74f7\": rpc error: code = NotFound desc = could not find container \"41866b29562b63f4add59677e393560b094d221ae11f930cb507417c496a74f7\": container with ID starting with 41866b29562b63f4add59677e393560b094d221ae11f930cb507417c496a74f7 not found: ID does not exist"
Mar 12 17:14:49 crc kubenswrapper[5184]: I0312 17:14:49.781044 5184 scope.go:117] "RemoveContainer" containerID="db689fdfe684307f818e2c78914932c87e6f645a2f93fa054c0d163e5d94dd22"
Mar 12 17:14:49 crc kubenswrapper[5184]: E0312 17:14:49.781366 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db689fdfe684307f818e2c78914932c87e6f645a2f93fa054c0d163e5d94dd22\": container with ID starting with db689fdfe684307f818e2c78914932c87e6f645a2f93fa054c0d163e5d94dd22 not found: ID does not exist" containerID="db689fdfe684307f818e2c78914932c87e6f645a2f93fa054c0d163e5d94dd22"
Mar 12 17:14:49 crc kubenswrapper[5184]: I0312 17:14:49.781531 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db689fdfe684307f818e2c78914932c87e6f645a2f93fa054c0d163e5d94dd22"} err="failed to get container status \"db689fdfe684307f818e2c78914932c87e6f645a2f93fa054c0d163e5d94dd22\": rpc error: code = NotFound desc = could not find container \"db689fdfe684307f818e2c78914932c87e6f645a2f93fa054c0d163e5d94dd22\": container with ID starting with db689fdfe684307f818e2c78914932c87e6f645a2f93fa054c0d163e5d94dd22 not found: ID does not exist"
Mar 12 17:14:49 crc kubenswrapper[5184]: I0312 17:14:49.781649 5184 scope.go:117] "RemoveContainer" containerID="a2bfebf52aa101ab04d5f51ef76391bc12adc6fa87531e96712b6ba927398887"
Mar 12 17:14:49 crc kubenswrapper[5184]: E0312 17:14:49.782078 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2bfebf52aa101ab04d5f51ef76391bc12adc6fa87531e96712b6ba927398887\": container with ID starting with a2bfebf52aa101ab04d5f51ef76391bc12adc6fa87531e96712b6ba927398887 not found: ID does not exist" containerID="a2bfebf52aa101ab04d5f51ef76391bc12adc6fa87531e96712b6ba927398887"
Mar 12 17:14:49 crc kubenswrapper[5184]: I0312 17:14:49.782112 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2bfebf52aa101ab04d5f51ef76391bc12adc6fa87531e96712b6ba927398887"} err="failed to get container status \"a2bfebf52aa101ab04d5f51ef76391bc12adc6fa87531e96712b6ba927398887\": rpc error: code = NotFound desc = could not find container \"a2bfebf52aa101ab04d5f51ef76391bc12adc6fa87531e96712b6ba927398887\": container with ID starting with a2bfebf52aa101ab04d5f51ef76391bc12adc6fa87531e96712b6ba927398887 not found: ID does not exist"
Mar 12 17:14:50 crc kubenswrapper[5184]: I0312 17:14:50.412697 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="325a20e9-85f9-4a61-8441-faf56e9f7640" path="/var/lib/kubelet/pods/325a20e9-85f9-4a61-8441-faf56e9f7640/volumes"
Mar 12 17:15:00 crc kubenswrapper[5184]: I0312 17:15:00.144163 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555595-4qshg"]
Mar 12 17:15:00 crc kubenswrapper[5184]: I0312 17:15:00.146299 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="325a20e9-85f9-4a61-8441-faf56e9f7640" containerName="extract-utilities"
Mar 12 17:15:00 crc kubenswrapper[5184]: I0312 17:15:00.146323 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="325a20e9-85f9-4a61-8441-faf56e9f7640" containerName="extract-utilities"
Mar 12 17:15:00 crc kubenswrapper[5184]: I0312 17:15:00.146403 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="325a20e9-85f9-4a61-8441-faf56e9f7640" containerName="extract-content"
Mar 12 17:15:00 crc kubenswrapper[5184]: I0312 17:15:00.146414 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="325a20e9-85f9-4a61-8441-faf56e9f7640" containerName="extract-content"
Mar 12 17:15:00 crc kubenswrapper[5184]: I0312 17:15:00.146692 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="325a20e9-85f9-4a61-8441-faf56e9f7640" containerName="registry-server"
Mar 12 17:15:00 crc kubenswrapper[5184]: I0312 17:15:00.146710 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="325a20e9-85f9-4a61-8441-faf56e9f7640" containerName="registry-server"
Mar 12 17:15:00 crc kubenswrapper[5184]: I0312 17:15:00.146929 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="325a20e9-85f9-4a61-8441-faf56e9f7640" containerName="registry-server"
Mar 12 17:15:00 crc kubenswrapper[5184]: I0312 17:15:00.163845 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555595-4qshg"]
Mar 12 17:15:00 crc kubenswrapper[5184]: I0312 17:15:00.164052 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555595-4qshg"
Mar 12 17:15:00 crc kubenswrapper[5184]: I0312 17:15:00.170337 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-dockercfg-vfqp6\""
Mar 12 17:15:00 crc kubenswrapper[5184]: I0312 17:15:00.170498 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-config\""
Mar 12 17:15:00 crc kubenswrapper[5184]: I0312 17:15:00.184963 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65nkw\" (UniqueName: \"kubernetes.io/projected/a4f5ef80-5f98-478e-a2da-1c4c03a78476-kube-api-access-65nkw\") pod \"collect-profiles-29555595-4qshg\" (UID: \"a4f5ef80-5f98-478e-a2da-1c4c03a78476\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555595-4qshg"
Mar 12 17:15:00 crc kubenswrapper[5184]: I0312 17:15:00.185275 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a4f5ef80-5f98-478e-a2da-1c4c03a78476-secret-volume\") pod \"collect-profiles-29555595-4qshg\" (UID: \"a4f5ef80-5f98-478e-a2da-1c4c03a78476\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555595-4qshg"
Mar 12 17:15:00 crc kubenswrapper[5184]: I0312 17:15:00.185421 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a4f5ef80-5f98-478e-a2da-1c4c03a78476-config-volume\") pod \"collect-profiles-29555595-4qshg\" (UID: \"a4f5ef80-5f98-478e-a2da-1c4c03a78476\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555595-4qshg"
Mar 12 17:15:00 crc kubenswrapper[5184]: I0312 17:15:00.287064 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-65nkw\" (UniqueName: \"kubernetes.io/projected/a4f5ef80-5f98-478e-a2da-1c4c03a78476-kube-api-access-65nkw\") pod \"collect-profiles-29555595-4qshg\" (UID: \"a4f5ef80-5f98-478e-a2da-1c4c03a78476\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555595-4qshg"
Mar 12 17:15:00 crc kubenswrapper[5184]: I0312 17:15:00.287538 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a4f5ef80-5f98-478e-a2da-1c4c03a78476-secret-volume\") pod \"collect-profiles-29555595-4qshg\" (UID: \"a4f5ef80-5f98-478e-a2da-1c4c03a78476\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555595-4qshg"
Mar 12 17:15:00 crc kubenswrapper[5184]: I0312 17:15:00.287802 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a4f5ef80-5f98-478e-a2da-1c4c03a78476-config-volume\") pod \"collect-profiles-29555595-4qshg\" (UID: \"a4f5ef80-5f98-478e-a2da-1c4c03a78476\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555595-4qshg"
Mar 12 17:15:00 crc kubenswrapper[5184]: I0312 17:15:00.289052 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a4f5ef80-5f98-478e-a2da-1c4c03a78476-config-volume\") pod \"collect-profiles-29555595-4qshg\" (UID: \"a4f5ef80-5f98-478e-a2da-1c4c03a78476\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555595-4qshg"
Mar 12 17:15:00 crc kubenswrapper[5184]: I0312 17:15:00.299198 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a4f5ef80-5f98-478e-a2da-1c4c03a78476-secret-volume\") pod \"collect-profiles-29555595-4qshg\" (UID: \"a4f5ef80-5f98-478e-a2da-1c4c03a78476\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555595-4qshg"
Mar 12 17:15:00 crc kubenswrapper[5184]: I0312 17:15:00.306042 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-65nkw\" (UniqueName: \"kubernetes.io/projected/a4f5ef80-5f98-478e-a2da-1c4c03a78476-kube-api-access-65nkw\") pod \"collect-profiles-29555595-4qshg\" (UID: \"a4f5ef80-5f98-478e-a2da-1c4c03a78476\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555595-4qshg"
Mar 12 17:15:00 crc kubenswrapper[5184]: I0312 17:15:00.484638 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555595-4qshg"
Mar 12 17:15:00 crc kubenswrapper[5184]: I0312 17:15:00.947824 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555595-4qshg"]
Mar 12 17:15:01 crc kubenswrapper[5184]: I0312 17:15:01.768183 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555595-4qshg" event={"ID":"a4f5ef80-5f98-478e-a2da-1c4c03a78476","Type":"ContainerStarted","Data":"3fce6bd6bd32130ff3daa29a892a36d66a51b86a3156bfd74503747cc7b47032"}
Mar 12 17:15:01 crc kubenswrapper[5184]: I0312 17:15:01.768659 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555595-4qshg" event={"ID":"a4f5ef80-5f98-478e-a2da-1c4c03a78476","Type":"ContainerStarted","Data":"1f6d802df7318aaf7a48b997108c0772114769796af2510bd72cd2799537c324"}
Mar 12 17:15:01 crc kubenswrapper[5184]: I0312 17:15:01.788056 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29555595-4qshg" podStartSLOduration=1.788035651 podStartE2EDuration="1.788035651s" podCreationTimestamp="2026-03-12 17:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 17:15:01.78222167 +0000 UTC m=+1444.323533029" watchObservedRunningTime="2026-03-12 17:15:01.788035651 +0000 UTC m=+1444.329346990"
Mar 12 17:15:02 crc kubenswrapper[5184]: I0312 17:15:02.779493 5184 generic.go:358] "Generic (PLEG): container finished" podID="a4f5ef80-5f98-478e-a2da-1c4c03a78476" containerID="3fce6bd6bd32130ff3daa29a892a36d66a51b86a3156bfd74503747cc7b47032" exitCode=0
Mar 12 17:15:02 crc kubenswrapper[5184]: I0312 17:15:02.779568 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555595-4qshg" event={"ID":"a4f5ef80-5f98-478e-a2da-1c4c03a78476","Type":"ContainerDied","Data":"3fce6bd6bd32130ff3daa29a892a36d66a51b86a3156bfd74503747cc7b47032"}
Mar 12 17:15:04 crc kubenswrapper[5184]: I0312 17:15:04.110924 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555595-4qshg"
Mar 12 17:15:04 crc kubenswrapper[5184]: I0312 17:15:04.165044 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a4f5ef80-5f98-478e-a2da-1c4c03a78476-config-volume\") pod \"a4f5ef80-5f98-478e-a2da-1c4c03a78476\" (UID: \"a4f5ef80-5f98-478e-a2da-1c4c03a78476\") "
Mar 12 17:15:04 crc kubenswrapper[5184]: I0312 17:15:04.165102 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a4f5ef80-5f98-478e-a2da-1c4c03a78476-secret-volume\") pod \"a4f5ef80-5f98-478e-a2da-1c4c03a78476\" (UID: \"a4f5ef80-5f98-478e-a2da-1c4c03a78476\") "
Mar 12 17:15:04 crc kubenswrapper[5184]: I0312 17:15:04.165177 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65nkw\" (UniqueName: \"kubernetes.io/projected/a4f5ef80-5f98-478e-a2da-1c4c03a78476-kube-api-access-65nkw\") pod \"a4f5ef80-5f98-478e-a2da-1c4c03a78476\" (UID: \"a4f5ef80-5f98-478e-a2da-1c4c03a78476\") "
Mar 12 17:15:04 crc kubenswrapper[5184]: I0312 17:15:04.168127 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4f5ef80-5f98-478e-a2da-1c4c03a78476-config-volume" (OuterVolumeSpecName: "config-volume") pod "a4f5ef80-5f98-478e-a2da-1c4c03a78476" (UID: "a4f5ef80-5f98-478e-a2da-1c4c03a78476"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 17:15:04 crc kubenswrapper[5184]: I0312 17:15:04.172147 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4f5ef80-5f98-478e-a2da-1c4c03a78476-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a4f5ef80-5f98-478e-a2da-1c4c03a78476" (UID: "a4f5ef80-5f98-478e-a2da-1c4c03a78476"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 17:15:04 crc kubenswrapper[5184]: I0312 17:15:04.172700 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4f5ef80-5f98-478e-a2da-1c4c03a78476-kube-api-access-65nkw" (OuterVolumeSpecName: "kube-api-access-65nkw") pod "a4f5ef80-5f98-478e-a2da-1c4c03a78476" (UID: "a4f5ef80-5f98-478e-a2da-1c4c03a78476"). InnerVolumeSpecName "kube-api-access-65nkw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:15:04 crc kubenswrapper[5184]: I0312 17:15:04.267181 5184 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a4f5ef80-5f98-478e-a2da-1c4c03a78476-config-volume\") on node \"crc\" DevicePath \"\""
Mar 12 17:15:04 crc kubenswrapper[5184]: I0312 17:15:04.267438 5184 reconciler_common.go:299] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a4f5ef80-5f98-478e-a2da-1c4c03a78476-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 12 17:15:04 crc kubenswrapper[5184]: I0312 17:15:04.267450 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-65nkw\" (UniqueName: \"kubernetes.io/projected/a4f5ef80-5f98-478e-a2da-1c4c03a78476-kube-api-access-65nkw\") on node \"crc\" DevicePath \"\""
Mar 12 17:15:04 crc kubenswrapper[5184]: I0312 17:15:04.801096 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555595-4qshg"
Mar 12 17:15:04 crc kubenswrapper[5184]: I0312 17:15:04.801091 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555595-4qshg" event={"ID":"a4f5ef80-5f98-478e-a2da-1c4c03a78476","Type":"ContainerDied","Data":"1f6d802df7318aaf7a48b997108c0772114769796af2510bd72cd2799537c324"}
Mar 12 17:15:04 crc kubenswrapper[5184]: I0312 17:15:04.801268 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f6d802df7318aaf7a48b997108c0772114769796af2510bd72cd2799537c324"
Mar 12 17:15:49 crc kubenswrapper[5184]: E0312 17:15:49.363693 5184 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1794084 actualBytes=10240
Mar 12 17:15:50 crc kubenswrapper[5184]: I0312 17:15:50.742415 5184 patch_prober.go:28] interesting pod/machine-config-daemon-cp7pt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 12 17:15:50 crc kubenswrapper[5184]: I0312 17:15:50.742821 5184 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 12 17:15:59 crc kubenswrapper[5184]: I0312 17:15:59.111752 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-99gtj_542903c2-fc88-4085-979a-db3766958392/kube-multus/0.log"
Mar 12 17:15:59 crc kubenswrapper[5184]: I0312 17:15:59.117690 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-99gtj_542903c2-fc88-4085-979a-db3766958392/kube-multus/0.log"
Mar 12 17:15:59 crc kubenswrapper[5184]: I0312 17:15:59.119369 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log"
Mar 12 17:15:59 crc kubenswrapper[5184]: I0312 17:15:59.125165 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log"
Mar 12 17:16:00 crc kubenswrapper[5184]: I0312 17:16:00.149936 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555596-t4wg8"]
Mar 12 17:16:00 crc kubenswrapper[5184]: I0312 17:16:00.151694 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a4f5ef80-5f98-478e-a2da-1c4c03a78476" containerName="collect-profiles"
Mar 12 17:16:00 crc kubenswrapper[5184]: I0312 17:16:00.151717 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4f5ef80-5f98-478e-a2da-1c4c03a78476" containerName="collect-profiles"
Mar 12 17:16:00 crc kubenswrapper[5184]: I0312 17:16:00.151931 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="a4f5ef80-5f98-478e-a2da-1c4c03a78476" containerName="collect-profiles"
Mar 12 17:16:00 crc kubenswrapper[5184]: I0312 17:16:00.157239 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555596-t4wg8"
Mar 12 17:16:00 crc kubenswrapper[5184]: I0312 17:16:00.159554 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\""
Mar 12 17:16:00 crc kubenswrapper[5184]: I0312 17:16:00.159898 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-f4gpz\""
Mar 12 17:16:00 crc kubenswrapper[5184]: I0312 17:16:00.163028 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\""
Mar 12 17:16:00 crc kubenswrapper[5184]: I0312 17:16:00.174199 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555596-t4wg8"]
Mar 12 17:16:00 crc kubenswrapper[5184]: I0312 17:16:00.284187 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnrm6\" (UniqueName: \"kubernetes.io/projected/f69d7985-6165-4cb8-8e7a-8ffd819b0243-kube-api-access-cnrm6\") pod \"auto-csr-approver-29555596-t4wg8\" (UID: \"f69d7985-6165-4cb8-8e7a-8ffd819b0243\") " pod="openshift-infra/auto-csr-approver-29555596-t4wg8"
Mar 12 17:16:00 crc kubenswrapper[5184]: I0312 17:16:00.386893 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cnrm6\" (UniqueName: \"kubernetes.io/projected/f69d7985-6165-4cb8-8e7a-8ffd819b0243-kube-api-access-cnrm6\") pod \"auto-csr-approver-29555596-t4wg8\" (UID: \"f69d7985-6165-4cb8-8e7a-8ffd819b0243\") " pod="openshift-infra/auto-csr-approver-29555596-t4wg8"
Mar 12 17:16:00 crc kubenswrapper[5184]: I0312 17:16:00.413409 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnrm6\" (UniqueName: \"kubernetes.io/projected/f69d7985-6165-4cb8-8e7a-8ffd819b0243-kube-api-access-cnrm6\") pod \"auto-csr-approver-29555596-t4wg8\" (UID:
\"f69d7985-6165-4cb8-8e7a-8ffd819b0243\") " pod="openshift-infra/auto-csr-approver-29555596-t4wg8" Mar 12 17:16:00 crc kubenswrapper[5184]: I0312 17:16:00.477533 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555596-t4wg8" Mar 12 17:16:00 crc kubenswrapper[5184]: I0312 17:16:00.986642 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555596-t4wg8"] Mar 12 17:16:01 crc kubenswrapper[5184]: I0312 17:16:01.699500 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555596-t4wg8" event={"ID":"f69d7985-6165-4cb8-8e7a-8ffd819b0243","Type":"ContainerStarted","Data":"652d9993edac46b84e80820c81f2c188337afb5c82af9d5175d75db67ce16150"} Mar 12 17:16:02 crc kubenswrapper[5184]: I0312 17:16:02.710269 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555596-t4wg8" event={"ID":"f69d7985-6165-4cb8-8e7a-8ffd819b0243","Type":"ContainerStarted","Data":"170d47a303bdf2f646b461a158ec79a92ff1a8894b234eae9945aeee2bfb99f0"} Mar 12 17:16:02 crc kubenswrapper[5184]: I0312 17:16:02.728063 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555596-t4wg8" podStartSLOduration=1.5185229420000002 podStartE2EDuration="2.72803602s" podCreationTimestamp="2026-03-12 17:16:00 +0000 UTC" firstStartedPulling="2026-03-12 17:16:00.972287382 +0000 UTC m=+1503.513598761" lastFinishedPulling="2026-03-12 17:16:02.1818005 +0000 UTC m=+1504.723111839" observedRunningTime="2026-03-12 17:16:02.722995742 +0000 UTC m=+1505.264307101" watchObservedRunningTime="2026-03-12 17:16:02.72803602 +0000 UTC m=+1505.269347359" Mar 12 17:16:03 crc kubenswrapper[5184]: I0312 17:16:03.726368 5184 generic.go:358] "Generic (PLEG): container finished" podID="f69d7985-6165-4cb8-8e7a-8ffd819b0243" 
containerID="170d47a303bdf2f646b461a158ec79a92ff1a8894b234eae9945aeee2bfb99f0" exitCode=0 Mar 12 17:16:03 crc kubenswrapper[5184]: I0312 17:16:03.726740 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555596-t4wg8" event={"ID":"f69d7985-6165-4cb8-8e7a-8ffd819b0243","Type":"ContainerDied","Data":"170d47a303bdf2f646b461a158ec79a92ff1a8894b234eae9945aeee2bfb99f0"} Mar 12 17:16:05 crc kubenswrapper[5184]: I0312 17:16:05.128563 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555596-t4wg8" Mar 12 17:16:05 crc kubenswrapper[5184]: I0312 17:16:05.290924 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnrm6\" (UniqueName: \"kubernetes.io/projected/f69d7985-6165-4cb8-8e7a-8ffd819b0243-kube-api-access-cnrm6\") pod \"f69d7985-6165-4cb8-8e7a-8ffd819b0243\" (UID: \"f69d7985-6165-4cb8-8e7a-8ffd819b0243\") " Mar 12 17:16:05 crc kubenswrapper[5184]: I0312 17:16:05.303826 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f69d7985-6165-4cb8-8e7a-8ffd819b0243-kube-api-access-cnrm6" (OuterVolumeSpecName: "kube-api-access-cnrm6") pod "f69d7985-6165-4cb8-8e7a-8ffd819b0243" (UID: "f69d7985-6165-4cb8-8e7a-8ffd819b0243"). InnerVolumeSpecName "kube-api-access-cnrm6". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:16:05 crc kubenswrapper[5184]: I0312 17:16:05.393573 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cnrm6\" (UniqueName: \"kubernetes.io/projected/f69d7985-6165-4cb8-8e7a-8ffd819b0243-kube-api-access-cnrm6\") on node \"crc\" DevicePath \"\"" Mar 12 17:16:05 crc kubenswrapper[5184]: I0312 17:16:05.772851 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555596-t4wg8" event={"ID":"f69d7985-6165-4cb8-8e7a-8ffd819b0243","Type":"ContainerDied","Data":"652d9993edac46b84e80820c81f2c188337afb5c82af9d5175d75db67ce16150"} Mar 12 17:16:05 crc kubenswrapper[5184]: I0312 17:16:05.773152 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="652d9993edac46b84e80820c81f2c188337afb5c82af9d5175d75db67ce16150" Mar 12 17:16:05 crc kubenswrapper[5184]: I0312 17:16:05.773231 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555596-t4wg8" Mar 12 17:16:05 crc kubenswrapper[5184]: I0312 17:16:05.812120 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555590-r6pk7"] Mar 12 17:16:05 crc kubenswrapper[5184]: I0312 17:16:05.823596 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555590-r6pk7"] Mar 12 17:16:06 crc kubenswrapper[5184]: I0312 17:16:06.411522 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65b7c2ee-47aa-47cb-9360-432c7da6513c" path="/var/lib/kubelet/pods/65b7c2ee-47aa-47cb-9360-432c7da6513c/volumes" Mar 12 17:16:18 crc kubenswrapper[5184]: I0312 17:16:18.586745 5184 scope.go:117] "RemoveContainer" containerID="e99c672f28b88118c683cf8842e8dc7a203447c475f4d5aa07ac60bf0d11aa55" Mar 12 17:16:20 crc kubenswrapper[5184]: I0312 17:16:20.742076 5184 patch_prober.go:28] interesting pod/machine-config-daemon-cp7pt 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 17:16:20 crc kubenswrapper[5184]: I0312 17:16:20.742579 5184 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 17:16:20 crc kubenswrapper[5184]: I0312 17:16:20.942436 5184 generic.go:358] "Generic (PLEG): container finished" podID="00325ca6-5bba-4ac7-8ef7-0b21163fe2af" containerID="6d8a5d243e7a8e9238416e7eb516fd42474b99f8b9dbe0824ed08e85aeedd45e" exitCode=0 Mar 12 17:16:20 crc kubenswrapper[5184]: I0312 17:16:20.942577 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-czdtb" event={"ID":"00325ca6-5bba-4ac7-8ef7-0b21163fe2af","Type":"ContainerDied","Data":"6d8a5d243e7a8e9238416e7eb516fd42474b99f8b9dbe0824ed08e85aeedd45e"} Mar 12 17:16:22 crc kubenswrapper[5184]: I0312 17:16:22.619262 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-czdtb" Mar 12 17:16:22 crc kubenswrapper[5184]: I0312 17:16:22.716919 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/00325ca6-5bba-4ac7-8ef7-0b21163fe2af-ssh-key-openstack-edpm-ipam\") pod \"00325ca6-5bba-4ac7-8ef7-0b21163fe2af\" (UID: \"00325ca6-5bba-4ac7-8ef7-0b21163fe2af\") " Mar 12 17:16:22 crc kubenswrapper[5184]: I0312 17:16:22.717253 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kp5lr\" (UniqueName: \"kubernetes.io/projected/00325ca6-5bba-4ac7-8ef7-0b21163fe2af-kube-api-access-kp5lr\") pod \"00325ca6-5bba-4ac7-8ef7-0b21163fe2af\" (UID: \"00325ca6-5bba-4ac7-8ef7-0b21163fe2af\") " Mar 12 17:16:22 crc kubenswrapper[5184]: I0312 17:16:22.717440 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/00325ca6-5bba-4ac7-8ef7-0b21163fe2af-inventory\") pod \"00325ca6-5bba-4ac7-8ef7-0b21163fe2af\" (UID: \"00325ca6-5bba-4ac7-8ef7-0b21163fe2af\") " Mar 12 17:16:22 crc kubenswrapper[5184]: I0312 17:16:22.717580 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00325ca6-5bba-4ac7-8ef7-0b21163fe2af-bootstrap-combined-ca-bundle\") pod \"00325ca6-5bba-4ac7-8ef7-0b21163fe2af\" (UID: \"00325ca6-5bba-4ac7-8ef7-0b21163fe2af\") " Mar 12 17:16:22 crc kubenswrapper[5184]: I0312 17:16:22.723551 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00325ca6-5bba-4ac7-8ef7-0b21163fe2af-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "00325ca6-5bba-4ac7-8ef7-0b21163fe2af" (UID: "00325ca6-5bba-4ac7-8ef7-0b21163fe2af"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:16:22 crc kubenswrapper[5184]: I0312 17:16:22.723546 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00325ca6-5bba-4ac7-8ef7-0b21163fe2af-kube-api-access-kp5lr" (OuterVolumeSpecName: "kube-api-access-kp5lr") pod "00325ca6-5bba-4ac7-8ef7-0b21163fe2af" (UID: "00325ca6-5bba-4ac7-8ef7-0b21163fe2af"). InnerVolumeSpecName "kube-api-access-kp5lr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:16:22 crc kubenswrapper[5184]: I0312 17:16:22.746609 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00325ca6-5bba-4ac7-8ef7-0b21163fe2af-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "00325ca6-5bba-4ac7-8ef7-0b21163fe2af" (UID: "00325ca6-5bba-4ac7-8ef7-0b21163fe2af"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:16:22 crc kubenswrapper[5184]: I0312 17:16:22.774129 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00325ca6-5bba-4ac7-8ef7-0b21163fe2af-inventory" (OuterVolumeSpecName: "inventory") pod "00325ca6-5bba-4ac7-8ef7-0b21163fe2af" (UID: "00325ca6-5bba-4ac7-8ef7-0b21163fe2af"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:16:22 crc kubenswrapper[5184]: I0312 17:16:22.820773 5184 reconciler_common.go:299] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/00325ca6-5bba-4ac7-8ef7-0b21163fe2af-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 17:16:22 crc kubenswrapper[5184]: I0312 17:16:22.820807 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kp5lr\" (UniqueName: \"kubernetes.io/projected/00325ca6-5bba-4ac7-8ef7-0b21163fe2af-kube-api-access-kp5lr\") on node \"crc\" DevicePath \"\"" Mar 12 17:16:22 crc kubenswrapper[5184]: I0312 17:16:22.820816 5184 reconciler_common.go:299] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/00325ca6-5bba-4ac7-8ef7-0b21163fe2af-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 17:16:22 crc kubenswrapper[5184]: I0312 17:16:22.820824 5184 reconciler_common.go:299] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00325ca6-5bba-4ac7-8ef7-0b21163fe2af-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 17:16:22 crc kubenswrapper[5184]: I0312 17:16:22.963365 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-czdtb" Mar 12 17:16:22 crc kubenswrapper[5184]: I0312 17:16:22.963482 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-czdtb" event={"ID":"00325ca6-5bba-4ac7-8ef7-0b21163fe2af","Type":"ContainerDied","Data":"7c64015948fc430d3fc67bafaa5c75d73842ea8057c1f66c3e8f0d761f129ee5"} Mar 12 17:16:22 crc kubenswrapper[5184]: I0312 17:16:22.963534 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c64015948fc430d3fc67bafaa5c75d73842ea8057c1f66c3e8f0d761f129ee5" Mar 12 17:16:23 crc kubenswrapper[5184]: I0312 17:16:23.060289 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5msp4"] Mar 12 17:16:23 crc kubenswrapper[5184]: I0312 17:16:23.062063 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="00325ca6-5bba-4ac7-8ef7-0b21163fe2af" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 12 17:16:23 crc kubenswrapper[5184]: I0312 17:16:23.062102 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="00325ca6-5bba-4ac7-8ef7-0b21163fe2af" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 12 17:16:23 crc kubenswrapper[5184]: I0312 17:16:23.062143 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f69d7985-6165-4cb8-8e7a-8ffd819b0243" containerName="oc" Mar 12 17:16:23 crc kubenswrapper[5184]: I0312 17:16:23.062153 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="f69d7985-6165-4cb8-8e7a-8ffd819b0243" containerName="oc" Mar 12 17:16:23 crc kubenswrapper[5184]: I0312 17:16:23.062452 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="00325ca6-5bba-4ac7-8ef7-0b21163fe2af" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 12 17:16:23 crc kubenswrapper[5184]: I0312 17:16:23.062491 5184 
memory_manager.go:356] "RemoveStaleState removing state" podUID="f69d7985-6165-4cb8-8e7a-8ffd819b0243" containerName="oc" Mar 12 17:16:23 crc kubenswrapper[5184]: I0312 17:16:23.071311 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5msp4"] Mar 12 17:16:23 crc kubenswrapper[5184]: I0312 17:16:23.071461 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5msp4" Mar 12 17:16:23 crc kubenswrapper[5184]: I0312 17:16:23.074199 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"openstack-edpm-ipam-dockercfg-qr8nl\"" Mar 12 17:16:23 crc kubenswrapper[5184]: I0312 17:16:23.075553 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplane-ansible-ssh-private-key-secret\"" Mar 12 17:16:23 crc kubenswrapper[5184]: I0312 17:16:23.075781 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-aee-default-env\"" Mar 12 17:16:23 crc kubenswrapper[5184]: I0312 17:16:23.075805 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplanenodeset-openstack-edpm-ipam\"" Mar 12 17:16:23 crc kubenswrapper[5184]: I0312 17:16:23.228399 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5msp4\" (UID: \"0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5msp4" Mar 12 17:16:23 crc kubenswrapper[5184]: I0312 17:16:23.228468 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s22qv\" (UniqueName: 
\"kubernetes.io/projected/0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78-kube-api-access-s22qv\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5msp4\" (UID: \"0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5msp4" Mar 12 17:16:23 crc kubenswrapper[5184]: I0312 17:16:23.228883 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5msp4\" (UID: \"0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5msp4" Mar 12 17:16:23 crc kubenswrapper[5184]: I0312 17:16:23.330705 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5msp4\" (UID: \"0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5msp4" Mar 12 17:16:23 crc kubenswrapper[5184]: I0312 17:16:23.330838 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5msp4\" (UID: \"0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5msp4" Mar 12 17:16:23 crc kubenswrapper[5184]: I0312 17:16:23.330882 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s22qv\" (UniqueName: \"kubernetes.io/projected/0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78-kube-api-access-s22qv\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-5msp4\" (UID: \"0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5msp4" Mar 12 17:16:23 crc kubenswrapper[5184]: I0312 17:16:23.334982 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5msp4\" (UID: \"0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5msp4" Mar 12 17:16:23 crc kubenswrapper[5184]: I0312 17:16:23.335220 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5msp4\" (UID: \"0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5msp4" Mar 12 17:16:23 crc kubenswrapper[5184]: I0312 17:16:23.348680 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s22qv\" (UniqueName: \"kubernetes.io/projected/0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78-kube-api-access-s22qv\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5msp4\" (UID: \"0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5msp4" Mar 12 17:16:23 crc kubenswrapper[5184]: I0312 17:16:23.406136 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5msp4" Mar 12 17:16:23 crc kubenswrapper[5184]: I0312 17:16:23.944359 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5msp4"] Mar 12 17:16:23 crc kubenswrapper[5184]: I0312 17:16:23.975777 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5msp4" event={"ID":"0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78","Type":"ContainerStarted","Data":"282f98513aee94dc32ffd80e7410553533eb0ffada994f9d7dd515e4fa697aae"} Mar 12 17:16:26 crc kubenswrapper[5184]: I0312 17:16:26.032354 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5msp4" event={"ID":"0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78","Type":"ContainerStarted","Data":"17e3c9fcc0b066443bcb297181a20997214688f65ec8f3ecc0d968137714fc02"} Mar 12 17:16:26 crc kubenswrapper[5184]: I0312 17:16:26.072577 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5msp4" podStartSLOduration=2.07495893 podStartE2EDuration="3.072547132s" podCreationTimestamp="2026-03-12 17:16:23 +0000 UTC" firstStartedPulling="2026-03-12 17:16:23.947530788 +0000 UTC m=+1526.488842127" lastFinishedPulling="2026-03-12 17:16:24.94511898 +0000 UTC m=+1527.486430329" observedRunningTime="2026-03-12 17:16:26.056852482 +0000 UTC m=+1528.598163811" watchObservedRunningTime="2026-03-12 17:16:26.072547132 +0000 UTC m=+1528.613858471" Mar 12 17:16:49 crc kubenswrapper[5184]: E0312 17:16:49.370816 5184 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1792831 actualBytes=10240 Mar 12 17:16:50 crc kubenswrapper[5184]: I0312 17:16:50.742630 5184 patch_prober.go:28] interesting pod/machine-config-daemon-cp7pt container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 17:16:50 crc kubenswrapper[5184]: I0312 17:16:50.743024 5184 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 17:16:50 crc kubenswrapper[5184]: I0312 17:16:50.743089 5184 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" Mar 12 17:16:50 crc kubenswrapper[5184]: I0312 17:16:50.744151 5184 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7539a33836cf02bc296a008cedc3ee58f1ef87b38ca5b5f9414731708b87618f"} pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 17:16:50 crc kubenswrapper[5184]: I0312 17:16:50.744249 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerName="machine-config-daemon" containerID="cri-o://7539a33836cf02bc296a008cedc3ee58f1ef87b38ca5b5f9414731708b87618f" gracePeriod=600 Mar 12 17:16:51 crc kubenswrapper[5184]: I0312 17:16:51.833670 5184 generic.go:358] "Generic (PLEG): container finished" podID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerID="7539a33836cf02bc296a008cedc3ee58f1ef87b38ca5b5f9414731708b87618f" exitCode=0 Mar 12 17:16:51 crc kubenswrapper[5184]: I0312 17:16:51.833745 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" event={"ID":"7b45c859-3d05-4214-9bd3-2952546f5dea","Type":"ContainerDied","Data":"7539a33836cf02bc296a008cedc3ee58f1ef87b38ca5b5f9414731708b87618f"} Mar 12 17:16:51 crc kubenswrapper[5184]: I0312 17:16:51.834360 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" event={"ID":"7b45c859-3d05-4214-9bd3-2952546f5dea","Type":"ContainerStarted","Data":"003716e1434d36a7e89bad17d4bfd64463f69f9907a5c9319c56e5b94d17d924"} Mar 12 17:16:51 crc kubenswrapper[5184]: I0312 17:16:51.834428 5184 scope.go:117] "RemoveContainer" containerID="42ed46ee5dbf0d27675a5969e00cdc1d30283a154524122596ea10898f42720f" Mar 12 17:17:35 crc kubenswrapper[5184]: I0312 17:17:35.607437 5184 generic.go:358] "Generic (PLEG): container finished" podID="0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78" containerID="17e3c9fcc0b066443bcb297181a20997214688f65ec8f3ecc0d968137714fc02" exitCode=0 Mar 12 17:17:35 crc kubenswrapper[5184]: I0312 17:17:35.607586 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5msp4" event={"ID":"0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78","Type":"ContainerDied","Data":"17e3c9fcc0b066443bcb297181a20997214688f65ec8f3ecc0d968137714fc02"} Mar 12 17:17:37 crc kubenswrapper[5184]: I0312 17:17:37.145590 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5msp4" Mar 12 17:17:37 crc kubenswrapper[5184]: I0312 17:17:37.259866 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78-inventory\") pod \"0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78\" (UID: \"0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78\") " Mar 12 17:17:37 crc kubenswrapper[5184]: I0312 17:17:37.261087 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78-ssh-key-openstack-edpm-ipam\") pod \"0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78\" (UID: \"0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78\") " Mar 12 17:17:37 crc kubenswrapper[5184]: I0312 17:17:37.261827 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s22qv\" (UniqueName: \"kubernetes.io/projected/0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78-kube-api-access-s22qv\") pod \"0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78\" (UID: \"0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78\") " Mar 12 17:17:37 crc kubenswrapper[5184]: I0312 17:17:37.267129 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78-kube-api-access-s22qv" (OuterVolumeSpecName: "kube-api-access-s22qv") pod "0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78" (UID: "0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78"). InnerVolumeSpecName "kube-api-access-s22qv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:17:37 crc kubenswrapper[5184]: I0312 17:17:37.296060 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78-inventory" (OuterVolumeSpecName: "inventory") pod "0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78" (UID: "0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 17:17:37 crc kubenswrapper[5184]: I0312 17:17:37.306365 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78" (UID: "0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 17:17:37 crc kubenswrapper[5184]: I0312 17:17:37.367616 5184 reconciler_common.go:299] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78-inventory\") on node \"crc\" DevicePath \"\""
Mar 12 17:17:37 crc kubenswrapper[5184]: I0312 17:17:37.367728 5184 reconciler_common.go:299] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 12 17:17:37 crc kubenswrapper[5184]: I0312 17:17:37.367765 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s22qv\" (UniqueName: \"kubernetes.io/projected/0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78-kube-api-access-s22qv\") on node \"crc\" DevicePath \"\""
Mar 12 17:17:37 crc kubenswrapper[5184]: I0312 17:17:37.638905 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5msp4" event={"ID":"0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78","Type":"ContainerDied","Data":"282f98513aee94dc32ffd80e7410553533eb0ffada994f9d7dd515e4fa697aae"}
Mar 12 17:17:37 crc kubenswrapper[5184]: I0312 17:17:37.638964 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="282f98513aee94dc32ffd80e7410553533eb0ffada994f9d7dd515e4fa697aae"
Mar 12 17:17:37 crc kubenswrapper[5184]: I0312 17:17:37.638921 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5msp4"
Mar 12 17:17:37 crc kubenswrapper[5184]: I0312 17:17:37.746745 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tszw2"]
Mar 12 17:17:37 crc kubenswrapper[5184]: I0312 17:17:37.747900 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Mar 12 17:17:37 crc kubenswrapper[5184]: I0312 17:17:37.747925 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Mar 12 17:17:37 crc kubenswrapper[5184]: I0312 17:17:37.748129 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Mar 12 17:17:37 crc kubenswrapper[5184]: I0312 17:17:37.755641 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tszw2"
Mar 12 17:17:37 crc kubenswrapper[5184]: I0312 17:17:37.758423 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplane-ansible-ssh-private-key-secret\""
Mar 12 17:17:37 crc kubenswrapper[5184]: I0312 17:17:37.758826 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-aee-default-env\""
Mar 12 17:17:37 crc kubenswrapper[5184]: I0312 17:17:37.760361 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tszw2"]
Mar 12 17:17:37 crc kubenswrapper[5184]: I0312 17:17:37.763784 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"openstack-edpm-ipam-dockercfg-qr8nl\""
Mar 12 17:17:37 crc kubenswrapper[5184]: I0312 17:17:37.768566 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplanenodeset-openstack-edpm-ipam\""
Mar 12 17:17:37 crc kubenswrapper[5184]: I0312 17:17:37.880631 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl4z2\" (UniqueName: \"kubernetes.io/projected/7239abca-a6d9-4694-8cf0-36bd97160cf9-kube-api-access-xl4z2\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tszw2\" (UID: \"7239abca-a6d9-4694-8cf0-36bd97160cf9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tszw2"
Mar 12 17:17:37 crc kubenswrapper[5184]: I0312 17:17:37.881148 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7239abca-a6d9-4694-8cf0-36bd97160cf9-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tszw2\" (UID: \"7239abca-a6d9-4694-8cf0-36bd97160cf9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tszw2"
Mar 12 17:17:37 crc kubenswrapper[5184]: I0312 17:17:37.881183 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7239abca-a6d9-4694-8cf0-36bd97160cf9-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tszw2\" (UID: \"7239abca-a6d9-4694-8cf0-36bd97160cf9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tszw2"
Mar 12 17:17:37 crc kubenswrapper[5184]: I0312 17:17:37.983406 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xl4z2\" (UniqueName: \"kubernetes.io/projected/7239abca-a6d9-4694-8cf0-36bd97160cf9-kube-api-access-xl4z2\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tszw2\" (UID: \"7239abca-a6d9-4694-8cf0-36bd97160cf9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tszw2"
Mar 12 17:17:37 crc kubenswrapper[5184]: I0312 17:17:37.983736 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7239abca-a6d9-4694-8cf0-36bd97160cf9-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tszw2\" (UID: \"7239abca-a6d9-4694-8cf0-36bd97160cf9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tszw2"
Mar 12 17:17:37 crc kubenswrapper[5184]: I0312 17:17:37.983797 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7239abca-a6d9-4694-8cf0-36bd97160cf9-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tszw2\" (UID: \"7239abca-a6d9-4694-8cf0-36bd97160cf9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tszw2"
Mar 12 17:17:37 crc kubenswrapper[5184]: I0312 17:17:37.989722 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7239abca-a6d9-4694-8cf0-36bd97160cf9-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tszw2\" (UID: \"7239abca-a6d9-4694-8cf0-36bd97160cf9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tszw2"
Mar 12 17:17:37 crc kubenswrapper[5184]: I0312 17:17:37.989945 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7239abca-a6d9-4694-8cf0-36bd97160cf9-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tszw2\" (UID: \"7239abca-a6d9-4694-8cf0-36bd97160cf9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tszw2"
Mar 12 17:17:38 crc kubenswrapper[5184]: I0312 17:17:38.018822 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl4z2\" (UniqueName: \"kubernetes.io/projected/7239abca-a6d9-4694-8cf0-36bd97160cf9-kube-api-access-xl4z2\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tszw2\" (UID: \"7239abca-a6d9-4694-8cf0-36bd97160cf9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tszw2"
Mar 12 17:17:38 crc kubenswrapper[5184]: I0312 17:17:38.073682 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/glance-f7e4-account-create-update-jbqtf"]
Mar 12 17:17:38 crc kubenswrapper[5184]: I0312 17:17:38.077239 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tszw2"
Mar 12 17:17:38 crc kubenswrapper[5184]: I0312 17:17:38.094637 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-rxnwh"]
Mar 12 17:17:38 crc kubenswrapper[5184]: I0312 17:17:38.106626 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/glance-f7e4-account-create-update-jbqtf"]
Mar 12 17:17:38 crc kubenswrapper[5184]: I0312 17:17:38.126802 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-rxnwh"]
Mar 12 17:17:38 crc kubenswrapper[5184]: I0312 17:17:38.414295 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b0585b6-5451-4f29-a11c-8d84143e3589" path="/var/lib/kubelet/pods/1b0585b6-5451-4f29-a11c-8d84143e3589/volumes"
Mar 12 17:17:38 crc kubenswrapper[5184]: I0312 17:17:38.415934 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94bbecf2-f5e8-4513-a4c6-559d752aae55" path="/var/lib/kubelet/pods/94bbecf2-f5e8-4513-a4c6-559d752aae55/volumes"
Mar 12 17:17:38 crc kubenswrapper[5184]: I0312 17:17:38.787109 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tszw2"]
Mar 12 17:17:38 crc kubenswrapper[5184]: I0312 17:17:38.821149 5184 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 12 17:17:39 crc kubenswrapper[5184]: I0312 17:17:39.662508 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tszw2" event={"ID":"7239abca-a6d9-4694-8cf0-36bd97160cf9","Type":"ContainerStarted","Data":"d1969e513eb747a20ef29ce290c926fbc02a4a212491c1cdf428e8a49e4359e1"}
Mar 12 17:17:39 crc kubenswrapper[5184]: I0312 17:17:39.663028 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tszw2" event={"ID":"7239abca-a6d9-4694-8cf0-36bd97160cf9","Type":"ContainerStarted","Data":"ed69476fcf1e52ff8ed4abc6e0ebe29a1808cd1fbe4dffa7e9a2b4d065272a39"}
Mar 12 17:17:39 crc kubenswrapper[5184]: I0312 17:17:39.682085 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tszw2" podStartSLOduration=2.188826095 podStartE2EDuration="2.682060274s" podCreationTimestamp="2026-03-12 17:17:37 +0000 UTC" firstStartedPulling="2026-03-12 17:17:38.821362951 +0000 UTC m=+1601.362674290" lastFinishedPulling="2026-03-12 17:17:39.31459708 +0000 UTC m=+1601.855908469" observedRunningTime="2026-03-12 17:17:39.6806343 +0000 UTC m=+1602.221945649" watchObservedRunningTime="2026-03-12 17:17:39.682060274 +0000 UTC m=+1602.223371653"
Mar 12 17:17:41 crc kubenswrapper[5184]: I0312 17:17:41.042998 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/keystone-2da3-account-create-update-jbb94"]
Mar 12 17:17:41 crc kubenswrapper[5184]: I0312 17:17:41.059536 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-lf25b"]
Mar 12 17:17:41 crc kubenswrapper[5184]: I0312 17:17:41.069238 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-2da3-account-create-update-jbb94"]
Mar 12 17:17:41 crc kubenswrapper[5184]: I0312 17:17:41.079993 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-lf25b"]
Mar 12 17:17:42 crc kubenswrapper[5184]: I0312 17:17:42.049936 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-sqldm"]
Mar 12 17:17:42 crc kubenswrapper[5184]: I0312 17:17:42.070016 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-sqldm"]
Mar 12 17:17:42 crc kubenswrapper[5184]: I0312 17:17:42.084015 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/placement-c88c-account-create-update-zlssx"]
Mar 12 17:17:42 crc kubenswrapper[5184]: I0312 17:17:42.094752 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/placement-c88c-account-create-update-zlssx"]
Mar 12 17:17:42 crc kubenswrapper[5184]: I0312 17:17:42.422249 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="041f7579-fdb5-43db-9291-318597c8c028" path="/var/lib/kubelet/pods/041f7579-fdb5-43db-9291-318597c8c028/volumes"
Mar 12 17:17:42 crc kubenswrapper[5184]: I0312 17:17:42.423439 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afe07d76-6af2-408e-a77d-45434eaa4eb3" path="/var/lib/kubelet/pods/afe07d76-6af2-408e-a77d-45434eaa4eb3/volumes"
Mar 12 17:17:42 crc kubenswrapper[5184]: I0312 17:17:42.424500 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bae2737b-02b8-46f4-9762-842b44b6b506" path="/var/lib/kubelet/pods/bae2737b-02b8-46f4-9762-842b44b6b506/volumes"
Mar 12 17:17:42 crc kubenswrapper[5184]: I0312 17:17:42.425601 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd5d12b8-0fdd-4998-8dbd-8df30df4af5b" path="/var/lib/kubelet/pods/bd5d12b8-0fdd-4998-8dbd-8df30df4af5b/volumes"
Mar 12 17:17:44 crc kubenswrapper[5184]: I0312 17:17:44.741240 5184 generic.go:358] "Generic (PLEG): container finished" podID="7239abca-a6d9-4694-8cf0-36bd97160cf9" containerID="d1969e513eb747a20ef29ce290c926fbc02a4a212491c1cdf428e8a49e4359e1" exitCode=0
Mar 12 17:17:44 crc kubenswrapper[5184]: I0312 17:17:44.741362 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tszw2" event={"ID":"7239abca-a6d9-4694-8cf0-36bd97160cf9","Type":"ContainerDied","Data":"d1969e513eb747a20ef29ce290c926fbc02a4a212491c1cdf428e8a49e4359e1"}
Mar 12 17:17:46 crc kubenswrapper[5184]: I0312 17:17:46.243508 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tszw2"
Mar 12 17:17:46 crc kubenswrapper[5184]: I0312 17:17:46.268267 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xl4z2\" (UniqueName: \"kubernetes.io/projected/7239abca-a6d9-4694-8cf0-36bd97160cf9-kube-api-access-xl4z2\") pod \"7239abca-a6d9-4694-8cf0-36bd97160cf9\" (UID: \"7239abca-a6d9-4694-8cf0-36bd97160cf9\") "
Mar 12 17:17:46 crc kubenswrapper[5184]: I0312 17:17:46.268312 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7239abca-a6d9-4694-8cf0-36bd97160cf9-inventory\") pod \"7239abca-a6d9-4694-8cf0-36bd97160cf9\" (UID: \"7239abca-a6d9-4694-8cf0-36bd97160cf9\") "
Mar 12 17:17:46 crc kubenswrapper[5184]: I0312 17:17:46.268406 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7239abca-a6d9-4694-8cf0-36bd97160cf9-ssh-key-openstack-edpm-ipam\") pod \"7239abca-a6d9-4694-8cf0-36bd97160cf9\" (UID: \"7239abca-a6d9-4694-8cf0-36bd97160cf9\") "
Mar 12 17:17:46 crc kubenswrapper[5184]: I0312 17:17:46.291750 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7239abca-a6d9-4694-8cf0-36bd97160cf9-kube-api-access-xl4z2" (OuterVolumeSpecName: "kube-api-access-xl4z2") pod "7239abca-a6d9-4694-8cf0-36bd97160cf9" (UID: "7239abca-a6d9-4694-8cf0-36bd97160cf9"). InnerVolumeSpecName "kube-api-access-xl4z2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:17:46 crc kubenswrapper[5184]: I0312 17:17:46.302487 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7239abca-a6d9-4694-8cf0-36bd97160cf9-inventory" (OuterVolumeSpecName: "inventory") pod "7239abca-a6d9-4694-8cf0-36bd97160cf9" (UID: "7239abca-a6d9-4694-8cf0-36bd97160cf9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 17:17:46 crc kubenswrapper[5184]: I0312 17:17:46.302661 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7239abca-a6d9-4694-8cf0-36bd97160cf9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7239abca-a6d9-4694-8cf0-36bd97160cf9" (UID: "7239abca-a6d9-4694-8cf0-36bd97160cf9"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 17:17:46 crc kubenswrapper[5184]: I0312 17:17:46.372687 5184 reconciler_common.go:299] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7239abca-a6d9-4694-8cf0-36bd97160cf9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 12 17:17:46 crc kubenswrapper[5184]: I0312 17:17:46.372731 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xl4z2\" (UniqueName: \"kubernetes.io/projected/7239abca-a6d9-4694-8cf0-36bd97160cf9-kube-api-access-xl4z2\") on node \"crc\" DevicePath \"\""
Mar 12 17:17:46 crc kubenswrapper[5184]: I0312 17:17:46.372764 5184 reconciler_common.go:299] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7239abca-a6d9-4694-8cf0-36bd97160cf9-inventory\") on node \"crc\" DevicePath \"\""
Mar 12 17:17:46 crc kubenswrapper[5184]: I0312 17:17:46.774538 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tszw2" event={"ID":"7239abca-a6d9-4694-8cf0-36bd97160cf9","Type":"ContainerDied","Data":"ed69476fcf1e52ff8ed4abc6e0ebe29a1808cd1fbe4dffa7e9a2b4d065272a39"}
Mar 12 17:17:46 crc kubenswrapper[5184]: I0312 17:17:46.774605 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed69476fcf1e52ff8ed4abc6e0ebe29a1808cd1fbe4dffa7e9a2b4d065272a39"
Mar 12 17:17:46 crc kubenswrapper[5184]: I0312 17:17:46.774717 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tszw2"
Mar 12 17:17:46 crc kubenswrapper[5184]: I0312 17:17:46.849842 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-rlmmf"]
Mar 12 17:17:46 crc kubenswrapper[5184]: I0312 17:17:46.851621 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7239abca-a6d9-4694-8cf0-36bd97160cf9" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Mar 12 17:17:46 crc kubenswrapper[5184]: I0312 17:17:46.851654 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="7239abca-a6d9-4694-8cf0-36bd97160cf9" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Mar 12 17:17:46 crc kubenswrapper[5184]: I0312 17:17:46.851906 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="7239abca-a6d9-4694-8cf0-36bd97160cf9" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Mar 12 17:17:46 crc kubenswrapper[5184]: I0312 17:17:46.858865 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rlmmf"
Mar 12 17:17:46 crc kubenswrapper[5184]: I0312 17:17:46.862499 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-rlmmf"]
Mar 12 17:17:46 crc kubenswrapper[5184]: I0312 17:17:46.863188 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-aee-default-env\""
Mar 12 17:17:46 crc kubenswrapper[5184]: I0312 17:17:46.863470 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"openstack-edpm-ipam-dockercfg-qr8nl\""
Mar 12 17:17:46 crc kubenswrapper[5184]: I0312 17:17:46.863499 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplanenodeset-openstack-edpm-ipam\""
Mar 12 17:17:46 crc kubenswrapper[5184]: I0312 17:17:46.863798 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplane-ansible-ssh-private-key-secret\""
Mar 12 17:17:46 crc kubenswrapper[5184]: I0312 17:17:46.885729 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rlmmf\" (UID: \"d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rlmmf"
Mar 12 17:17:46 crc kubenswrapper[5184]: I0312 17:17:46.885857 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnk9f\" (UniqueName: \"kubernetes.io/projected/d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f-kube-api-access-qnk9f\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rlmmf\" (UID: \"d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rlmmf"
Mar 12 17:17:46 crc kubenswrapper[5184]: I0312 17:17:46.886052 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rlmmf\" (UID: \"d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rlmmf"
Mar 12 17:17:46 crc kubenswrapper[5184]: I0312 17:17:46.987703 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rlmmf\" (UID: \"d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rlmmf"
Mar 12 17:17:46 crc kubenswrapper[5184]: I0312 17:17:46.987780 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rlmmf\" (UID: \"d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rlmmf"
Mar 12 17:17:46 crc kubenswrapper[5184]: I0312 17:17:46.987837 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qnk9f\" (UniqueName: \"kubernetes.io/projected/d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f-kube-api-access-qnk9f\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rlmmf\" (UID: \"d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rlmmf"
Mar 12 17:17:46 crc kubenswrapper[5184]: I0312 17:17:46.994568 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rlmmf\" (UID: \"d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rlmmf"
Mar 12 17:17:46 crc kubenswrapper[5184]: I0312 17:17:46.995027 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rlmmf\" (UID: \"d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rlmmf"
Mar 12 17:17:47 crc kubenswrapper[5184]: I0312 17:17:47.005934 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnk9f\" (UniqueName: \"kubernetes.io/projected/d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f-kube-api-access-qnk9f\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rlmmf\" (UID: \"d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rlmmf"
Mar 12 17:17:47 crc kubenswrapper[5184]: I0312 17:17:47.225168 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rlmmf"
Mar 12 17:17:47 crc kubenswrapper[5184]: I0312 17:17:47.788082 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-rlmmf"]
Mar 12 17:17:48 crc kubenswrapper[5184]: I0312 17:17:48.799962 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rlmmf" event={"ID":"d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f","Type":"ContainerStarted","Data":"77836d6ed2b421ef5ec2239e2af4643d9cc149f9ac53f4bdb18826df9d61e5f9"}
Mar 12 17:17:49 crc kubenswrapper[5184]: I0312 17:17:49.027719 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-nwvrf"]
Mar 12 17:17:49 crc kubenswrapper[5184]: I0312 17:17:49.037952 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-nwvrf"]
Mar 12 17:17:49 crc kubenswrapper[5184]: E0312 17:17:49.299274 5184 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1792759 actualBytes=10240
Mar 12 17:17:49 crc kubenswrapper[5184]: I0312 17:17:49.809854 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rlmmf" event={"ID":"d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f","Type":"ContainerStarted","Data":"c9e577cce40d39b90d16b69cdbf1b5013d56125cd7e1643855e398608090458d"}
Mar 12 17:17:49 crc kubenswrapper[5184]: I0312 17:17:49.829509 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rlmmf" podStartSLOduration=3.089533648 podStartE2EDuration="3.829487691s" podCreationTimestamp="2026-03-12 17:17:46 +0000 UTC" firstStartedPulling="2026-03-12 17:17:47.78724023 +0000 UTC m=+1610.328551569" lastFinishedPulling="2026-03-12 17:17:48.527194273 +0000 UTC m=+1611.068505612" observedRunningTime="2026-03-12 17:17:49.821831394 +0000 UTC m=+1612.363142743" watchObservedRunningTime="2026-03-12 17:17:49.829487691 +0000 UTC m=+1612.370799030"
Mar 12 17:17:50 crc kubenswrapper[5184]: I0312 17:17:50.411926 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d85d1216-e8c4-45ea-8e85-bf33cece093c" path="/var/lib/kubelet/pods/d85d1216-e8c4-45ea-8e85-bf33cece093c/volumes"
Mar 12 17:18:00 crc kubenswrapper[5184]: I0312 17:18:00.145796 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555598-xczd8"]
Mar 12 17:18:00 crc kubenswrapper[5184]: I0312 17:18:00.159278 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555598-xczd8"
Mar 12 17:18:00 crc kubenswrapper[5184]: I0312 17:18:00.165858 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-f4gpz\""
Mar 12 17:18:00 crc kubenswrapper[5184]: I0312 17:18:00.166193 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\""
Mar 12 17:18:00 crc kubenswrapper[5184]: I0312 17:18:00.169849 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555598-xczd8"]
Mar 12 17:18:00 crc kubenswrapper[5184]: I0312 17:18:00.169996 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\""
Mar 12 17:18:00 crc kubenswrapper[5184]: I0312 17:18:00.286255 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7tgz\" (UniqueName: \"kubernetes.io/projected/c1a51aa6-a692-479e-a4d1-e8960e7e4e6f-kube-api-access-n7tgz\") pod \"auto-csr-approver-29555598-xczd8\" (UID: \"c1a51aa6-a692-479e-a4d1-e8960e7e4e6f\") " pod="openshift-infra/auto-csr-approver-29555598-xczd8"
Mar 12 17:18:00 crc kubenswrapper[5184]: I0312 17:18:00.388462 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7tgz\" (UniqueName: \"kubernetes.io/projected/c1a51aa6-a692-479e-a4d1-e8960e7e4e6f-kube-api-access-n7tgz\") pod \"auto-csr-approver-29555598-xczd8\" (UID: \"c1a51aa6-a692-479e-a4d1-e8960e7e4e6f\") " pod="openshift-infra/auto-csr-approver-29555598-xczd8"
Mar 12 17:18:00 crc kubenswrapper[5184]: I0312 17:18:00.422270 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7tgz\" (UniqueName: \"kubernetes.io/projected/c1a51aa6-a692-479e-a4d1-e8960e7e4e6f-kube-api-access-n7tgz\") pod \"auto-csr-approver-29555598-xczd8\" (UID: \"c1a51aa6-a692-479e-a4d1-e8960e7e4e6f\") " pod="openshift-infra/auto-csr-approver-29555598-xczd8"
Mar 12 17:18:00 crc kubenswrapper[5184]: I0312 17:18:00.500124 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555598-xczd8"
Mar 12 17:18:00 crc kubenswrapper[5184]: I0312 17:18:00.984004 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555598-xczd8"]
Mar 12 17:18:01 crc kubenswrapper[5184]: I0312 17:18:01.968508 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555598-xczd8" event={"ID":"c1a51aa6-a692-479e-a4d1-e8960e7e4e6f","Type":"ContainerStarted","Data":"f79a5e3c17b27017e2d6f6b65025abdfdb191a65df81a1e4860afb81816b52eb"}
Mar 12 17:18:02 crc kubenswrapper[5184]: I0312 17:18:02.983073 5184 generic.go:358] "Generic (PLEG): container finished" podID="c1a51aa6-a692-479e-a4d1-e8960e7e4e6f" containerID="e8453f3e0c2fb9cd3fbc92886ac68cbe81f26b8b981a10c8204a8dbb8e26308c" exitCode=0
Mar 12 17:18:02 crc kubenswrapper[5184]: I0312 17:18:02.983169 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555598-xczd8" event={"ID":"c1a51aa6-a692-479e-a4d1-e8960e7e4e6f","Type":"ContainerDied","Data":"e8453f3e0c2fb9cd3fbc92886ac68cbe81f26b8b981a10c8204a8dbb8e26308c"}
Mar 12 17:18:04 crc kubenswrapper[5184]: I0312 17:18:04.397098 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555598-xczd8"
Mar 12 17:18:04 crc kubenswrapper[5184]: I0312 17:18:04.471274 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7tgz\" (UniqueName: \"kubernetes.io/projected/c1a51aa6-a692-479e-a4d1-e8960e7e4e6f-kube-api-access-n7tgz\") pod \"c1a51aa6-a692-479e-a4d1-e8960e7e4e6f\" (UID: \"c1a51aa6-a692-479e-a4d1-e8960e7e4e6f\") "
Mar 12 17:18:04 crc kubenswrapper[5184]: I0312 17:18:04.481058 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1a51aa6-a692-479e-a4d1-e8960e7e4e6f-kube-api-access-n7tgz" (OuterVolumeSpecName: "kube-api-access-n7tgz") pod "c1a51aa6-a692-479e-a4d1-e8960e7e4e6f" (UID: "c1a51aa6-a692-479e-a4d1-e8960e7e4e6f"). InnerVolumeSpecName "kube-api-access-n7tgz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:18:04 crc kubenswrapper[5184]: I0312 17:18:04.574287 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n7tgz\" (UniqueName: \"kubernetes.io/projected/c1a51aa6-a692-479e-a4d1-e8960e7e4e6f-kube-api-access-n7tgz\") on node \"crc\" DevicePath \"\""
Mar 12 17:18:05 crc kubenswrapper[5184]: I0312 17:18:05.016659 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555598-xczd8" event={"ID":"c1a51aa6-a692-479e-a4d1-e8960e7e4e6f","Type":"ContainerDied","Data":"f79a5e3c17b27017e2d6f6b65025abdfdb191a65df81a1e4860afb81816b52eb"}
Mar 12 17:18:05 crc kubenswrapper[5184]: I0312 17:18:05.016718 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f79a5e3c17b27017e2d6f6b65025abdfdb191a65df81a1e4860afb81816b52eb"
Mar 12 17:18:05 crc kubenswrapper[5184]: I0312 17:18:05.016677 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555598-xczd8"
Mar 12 17:18:05 crc kubenswrapper[5184]: I0312 17:18:05.067320 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-7g4dn"]
Mar 12 17:18:05 crc kubenswrapper[5184]: I0312 17:18:05.081186 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-xfj2g"]
Mar 12 17:18:05 crc kubenswrapper[5184]: I0312 17:18:05.089359 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-7g4dn"]
Mar 12 17:18:05 crc kubenswrapper[5184]: I0312 17:18:05.097913 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-xfj2g"]
Mar 12 17:18:05 crc kubenswrapper[5184]: I0312 17:18:05.469495 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555592-s9qm5"]
Mar 12 17:18:05 crc kubenswrapper[5184]: I0312 17:18:05.480044 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555592-s9qm5"]
Mar 12 17:18:06 crc kubenswrapper[5184]: I0312 17:18:06.417349 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="464abcc7-9e49-4fff-8dbd-c4ce18f54bb8" path="/var/lib/kubelet/pods/464abcc7-9e49-4fff-8dbd-c4ce18f54bb8/volumes"
Mar 12 17:18:06 crc kubenswrapper[5184]: I0312 17:18:06.418677 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fdadd77-2670-41df-b20b-57b771031dde" path="/var/lib/kubelet/pods/4fdadd77-2670-41df-b20b-57b771031dde/volumes"
Mar 12 17:18:06 crc kubenswrapper[5184]: I0312 17:18:06.419351 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86dddeb4-4bdf-4457-ae72-4e42fe713b7d" path="/var/lib/kubelet/pods/86dddeb4-4bdf-4457-ae72-4e42fe713b7d/volumes"
Mar 12 17:18:09 crc kubenswrapper[5184]: I0312 17:18:09.040261 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-2mwtn"]
Mar 12 17:18:09 crc kubenswrapper[5184]: I0312 17:18:09.051281 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/neutron-bb83-account-create-update-szjsd"]
Mar 12 17:18:09 crc kubenswrapper[5184]: I0312 17:18:09.068697 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/cinder-66a8-account-create-update-8jj7v"]
Mar 12 17:18:09 crc kubenswrapper[5184]: I0312 17:18:09.079128 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/barbican-571d-account-create-update-n2749"]
Mar 12 17:18:09 crc kubenswrapper[5184]: I0312 17:18:09.086766 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-2mwtn"]
Mar 12 17:18:09 crc kubenswrapper[5184]: I0312 17:18:09.093411 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-bb83-account-create-update-szjsd"]
Mar 12 17:18:09 crc kubenswrapper[5184]: I0312 17:18:09.100401 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-66a8-account-create-update-8jj7v"]
Mar 12 17:18:09 crc kubenswrapper[5184]: I0312 17:18:09.106572 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-571d-account-create-update-n2749"]
Mar 12 17:18:10 crc kubenswrapper[5184]: I0312 17:18:10.036937 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-7fxbs"]
Mar 12 17:18:10 crc kubenswrapper[5184]: I0312 17:18:10.051066 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-7fxbs"]
Mar 12 17:18:10 crc kubenswrapper[5184]: I0312 17:18:10.417343 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21e50fc1-0b61-44f4-922a-acb08efb0796" path="/var/lib/kubelet/pods/21e50fc1-0b61-44f4-922a-acb08efb0796/volumes"
Mar 12 17:18:10 crc kubenswrapper[5184]: I0312 17:18:10.419225 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2aa5400c-5bc3-4cd5-849d-87105da3827b" path="/var/lib/kubelet/pods/2aa5400c-5bc3-4cd5-849d-87105da3827b/volumes"
Mar 12 17:18:10 crc kubenswrapper[5184]: I0312 17:18:10.420992 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f4b74dc-78a2-4b8c-8b52-cb972e894961" path="/var/lib/kubelet/pods/2f4b74dc-78a2-4b8c-8b52-cb972e894961/volumes"
Mar 12 17:18:10 crc kubenswrapper[5184]: I0312 17:18:10.422634 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ec5e94e-bd18-444a-9340-de9b41934458" path="/var/lib/kubelet/pods/5ec5e94e-bd18-444a-9340-de9b41934458/volumes"
Mar 12 17:18:10 crc kubenswrapper[5184]: I0312 17:18:10.425809 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69a595e1-ca3b-4932-b9b6-c1c0a237a783" path="/var/lib/kubelet/pods/69a595e1-ca3b-4932-b9b6-c1c0a237a783/volumes"
Mar 12 17:18:13 crc kubenswrapper[5184]: I0312 17:18:13.037206 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-4nkwm"]
Mar 12 17:18:13 crc kubenswrapper[5184]: I0312 17:18:13.045243 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-4nkwm"]
Mar 12 17:18:14 crc kubenswrapper[5184]: I0312 17:18:14.413813 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f64b215c-973c-4761-9b13-0510387973ee" path="/var/lib/kubelet/pods/f64b215c-973c-4761-9b13-0510387973ee/volumes"
Mar 12 17:18:18 crc kubenswrapper[5184]: I0312 17:18:18.730665 5184 scope.go:117] "RemoveContainer" containerID="cbaa56a359b46b638e5969f84cd63453e5da865d262bd0a4bdb74e47936991a9"
Mar 12 17:18:18 crc kubenswrapper[5184]: I0312 17:18:18.780893 5184 scope.go:117] "RemoveContainer" containerID="592c57124a6c246a80513dfee9774043582d86990464af951cfc7d318b698869"
Mar 12 17:18:18 crc kubenswrapper[5184]: I0312 17:18:18.813320 5184 scope.go:117] "RemoveContainer" containerID="a3c4cea2a60bc9ef16549c18cb1981decd4a633a723252ed8e1e69562d803b6c"
Mar 12 17:18:18 crc kubenswrapper[5184]: I0312 17:18:18.867165 5184 scope.go:117] "RemoveContainer" containerID="2435994606af1d1b9f68d10111012a0a05fd320a41cde861c73a1f0787efa241"
Mar 12 17:18:18 crc kubenswrapper[5184]: I0312 17:18:18.912265 5184 scope.go:117] "RemoveContainer" containerID="e444ed8a4cf30be194f3f9e7a76611008eb92112aae4c93388bf7df409a63da2"
Mar 12 17:18:18 crc kubenswrapper[5184]: I0312 17:18:18.966902 5184 scope.go:117] "RemoveContainer" containerID="43a91d1c28e956ebf8ca465cfcb74f879b9307b5b49d2038b37c88be1b1f3488"
Mar 12 17:18:19 crc kubenswrapper[5184]: I0312 17:18:19.001583 5184 scope.go:117] "RemoveContainer" containerID="a57617551bbfe3d89bcb379f1587ec00ad6d87f2b8ea43063d9858e615e2d4e8"
Mar 12 17:18:19 crc kubenswrapper[5184]: I0312 17:18:19.060465 5184 scope.go:117] "RemoveContainer" containerID="7f3ba97631c02718650f117fbe9363093763eab313e7e0abec7245b2f9967333"
Mar 12 17:18:19 crc kubenswrapper[5184]: I0312 17:18:19.080099 5184 scope.go:117] "RemoveContainer" containerID="f9dc3723272109ce6bcc336607537943677053bf8a8bf3eea7f134b8376919f0"
Mar 12 17:18:19 crc
kubenswrapper[5184]: I0312 17:18:19.098347 5184 scope.go:117] "RemoveContainer" containerID="7947b4579198bd3277c8220ef4c1fdd253647feb0f12ccb500e53f81aa05d8dc" Mar 12 17:18:19 crc kubenswrapper[5184]: I0312 17:18:19.125267 5184 scope.go:117] "RemoveContainer" containerID="a014aedd868115a04cf195d20b020078b429ad18e7003faa31fa4edafed7d146" Mar 12 17:18:19 crc kubenswrapper[5184]: I0312 17:18:19.149397 5184 scope.go:117] "RemoveContainer" containerID="3ee33983972038dd5e00575a6b54a7c0d08ba50e2c5488fb40a85a71c2bb8087" Mar 12 17:18:19 crc kubenswrapper[5184]: I0312 17:18:19.227515 5184 scope.go:117] "RemoveContainer" containerID="4c88d94c3fe4a0721c9dcd52752ba963d089a7f6e91dc5443cb1da06f3d62a22" Mar 12 17:18:19 crc kubenswrapper[5184]: I0312 17:18:19.247414 5184 scope.go:117] "RemoveContainer" containerID="190b64d8e31efaaf582c749433b417fc51e6109d2f8b14bb5db31ef71632d0c1" Mar 12 17:18:19 crc kubenswrapper[5184]: I0312 17:18:19.271362 5184 scope.go:117] "RemoveContainer" containerID="85425b87026fec016d23203e231b782bbf5d697c2deed2097fea43fa3778e354" Mar 12 17:18:19 crc kubenswrapper[5184]: I0312 17:18:19.292520 5184 scope.go:117] "RemoveContainer" containerID="7365c08678ba86031cf99befe675647320ef5f47affc3d7942e925548c0e2bfd" Mar 12 17:18:23 crc kubenswrapper[5184]: I0312 17:18:23.248634 5184 generic.go:358] "Generic (PLEG): container finished" podID="d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f" containerID="c9e577cce40d39b90d16b69cdbf1b5013d56125cd7e1643855e398608090458d" exitCode=0 Mar 12 17:18:23 crc kubenswrapper[5184]: I0312 17:18:23.248728 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rlmmf" event={"ID":"d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f","Type":"ContainerDied","Data":"c9e577cce40d39b90d16b69cdbf1b5013d56125cd7e1643855e398608090458d"} Mar 12 17:18:24 crc kubenswrapper[5184]: I0312 17:18:24.683734 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rlmmf" Mar 12 17:18:24 crc kubenswrapper[5184]: I0312 17:18:24.717978 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnk9f\" (UniqueName: \"kubernetes.io/projected/d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f-kube-api-access-qnk9f\") pod \"d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f\" (UID: \"d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f\") " Mar 12 17:18:24 crc kubenswrapper[5184]: I0312 17:18:24.718091 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f-ssh-key-openstack-edpm-ipam\") pod \"d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f\" (UID: \"d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f\") " Mar 12 17:18:24 crc kubenswrapper[5184]: I0312 17:18:24.718303 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f-inventory\") pod \"d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f\" (UID: \"d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f\") " Mar 12 17:18:24 crc kubenswrapper[5184]: I0312 17:18:24.732035 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f-kube-api-access-qnk9f" (OuterVolumeSpecName: "kube-api-access-qnk9f") pod "d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f" (UID: "d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f"). InnerVolumeSpecName "kube-api-access-qnk9f". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:18:24 crc kubenswrapper[5184]: I0312 17:18:24.753522 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f" (UID: "d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:18:24 crc kubenswrapper[5184]: I0312 17:18:24.753583 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f-inventory" (OuterVolumeSpecName: "inventory") pod "d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f" (UID: "d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:18:24 crc kubenswrapper[5184]: I0312 17:18:24.821467 5184 reconciler_common.go:299] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 17:18:24 crc kubenswrapper[5184]: I0312 17:18:24.821535 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qnk9f\" (UniqueName: \"kubernetes.io/projected/d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f-kube-api-access-qnk9f\") on node \"crc\" DevicePath \"\"" Mar 12 17:18:24 crc kubenswrapper[5184]: I0312 17:18:24.821552 5184 reconciler_common.go:299] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 17:18:25 crc kubenswrapper[5184]: I0312 17:18:25.282682 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rlmmf" 
event={"ID":"d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f","Type":"ContainerDied","Data":"77836d6ed2b421ef5ec2239e2af4643d9cc149f9ac53f4bdb18826df9d61e5f9"} Mar 12 17:18:25 crc kubenswrapper[5184]: I0312 17:18:25.283296 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77836d6ed2b421ef5ec2239e2af4643d9cc149f9ac53f4bdb18826df9d61e5f9" Mar 12 17:18:25 crc kubenswrapper[5184]: I0312 17:18:25.282786 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rlmmf" Mar 12 17:18:25 crc kubenswrapper[5184]: I0312 17:18:25.412667 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-6kllr"] Mar 12 17:18:25 crc kubenswrapper[5184]: I0312 17:18:25.414432 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 12 17:18:25 crc kubenswrapper[5184]: I0312 17:18:25.414560 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 12 17:18:25 crc kubenswrapper[5184]: I0312 17:18:25.414730 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c1a51aa6-a692-479e-a4d1-e8960e7e4e6f" containerName="oc" Mar 12 17:18:25 crc kubenswrapper[5184]: I0312 17:18:25.414840 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1a51aa6-a692-479e-a4d1-e8960e7e4e6f" containerName="oc" Mar 12 17:18:25 crc kubenswrapper[5184]: I0312 17:18:25.415189 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 12 17:18:25 crc kubenswrapper[5184]: I0312 17:18:25.415309 5184 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="c1a51aa6-a692-479e-a4d1-e8960e7e4e6f" containerName="oc" Mar 12 17:18:25 crc kubenswrapper[5184]: I0312 17:18:25.422841 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-6kllr" Mar 12 17:18:25 crc kubenswrapper[5184]: I0312 17:18:25.429418 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-6kllr"] Mar 12 17:18:25 crc kubenswrapper[5184]: I0312 17:18:25.433597 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplanenodeset-openstack-edpm-ipam\"" Mar 12 17:18:25 crc kubenswrapper[5184]: I0312 17:18:25.433637 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-aee-default-env\"" Mar 12 17:18:25 crc kubenswrapper[5184]: I0312 17:18:25.433817 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplane-ansible-ssh-private-key-secret\"" Mar 12 17:18:25 crc kubenswrapper[5184]: I0312 17:18:25.434109 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"openstack-edpm-ipam-dockercfg-qr8nl\"" Mar 12 17:18:25 crc kubenswrapper[5184]: I0312 17:18:25.537527 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/af3748c5-3b07-43f8-a444-fb48032538b0-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-6kllr\" (UID: \"af3748c5-3b07-43f8-a444-fb48032538b0\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-6kllr" Mar 12 17:18:25 crc kubenswrapper[5184]: I0312 17:18:25.537780 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m2g5\" (UniqueName: 
\"kubernetes.io/projected/af3748c5-3b07-43f8-a444-fb48032538b0-kube-api-access-8m2g5\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-6kllr\" (UID: \"af3748c5-3b07-43f8-a444-fb48032538b0\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-6kllr" Mar 12 17:18:25 crc kubenswrapper[5184]: I0312 17:18:25.537925 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af3748c5-3b07-43f8-a444-fb48032538b0-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-6kllr\" (UID: \"af3748c5-3b07-43f8-a444-fb48032538b0\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-6kllr" Mar 12 17:18:25 crc kubenswrapper[5184]: I0312 17:18:25.640748 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8m2g5\" (UniqueName: \"kubernetes.io/projected/af3748c5-3b07-43f8-a444-fb48032538b0-kube-api-access-8m2g5\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-6kllr\" (UID: \"af3748c5-3b07-43f8-a444-fb48032538b0\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-6kllr" Mar 12 17:18:25 crc kubenswrapper[5184]: I0312 17:18:25.640953 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af3748c5-3b07-43f8-a444-fb48032538b0-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-6kllr\" (UID: \"af3748c5-3b07-43f8-a444-fb48032538b0\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-6kllr" Mar 12 17:18:25 crc kubenswrapper[5184]: I0312 17:18:25.641047 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/af3748c5-3b07-43f8-a444-fb48032538b0-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-6kllr\" (UID: \"af3748c5-3b07-43f8-a444-fb48032538b0\") " 
pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-6kllr" Mar 12 17:18:25 crc kubenswrapper[5184]: I0312 17:18:25.646318 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/af3748c5-3b07-43f8-a444-fb48032538b0-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-6kllr\" (UID: \"af3748c5-3b07-43f8-a444-fb48032538b0\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-6kllr" Mar 12 17:18:25 crc kubenswrapper[5184]: I0312 17:18:25.650165 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af3748c5-3b07-43f8-a444-fb48032538b0-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-6kllr\" (UID: \"af3748c5-3b07-43f8-a444-fb48032538b0\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-6kllr" Mar 12 17:18:25 crc kubenswrapper[5184]: I0312 17:18:25.661005 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m2g5\" (UniqueName: \"kubernetes.io/projected/af3748c5-3b07-43f8-a444-fb48032538b0-kube-api-access-8m2g5\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-6kllr\" (UID: \"af3748c5-3b07-43f8-a444-fb48032538b0\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-6kllr" Mar 12 17:18:25 crc kubenswrapper[5184]: I0312 17:18:25.761261 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-6kllr" Mar 12 17:18:26 crc kubenswrapper[5184]: I0312 17:18:26.368782 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-6kllr"] Mar 12 17:18:27 crc kubenswrapper[5184]: I0312 17:18:27.306168 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-6kllr" event={"ID":"af3748c5-3b07-43f8-a444-fb48032538b0","Type":"ContainerStarted","Data":"4a8ecfbaa4a6bb7adc74b2c985950818c1805fdce64a7d9d5037351a064a5f8b"} Mar 12 17:18:27 crc kubenswrapper[5184]: I0312 17:18:27.307169 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-6kllr" event={"ID":"af3748c5-3b07-43f8-a444-fb48032538b0","Type":"ContainerStarted","Data":"1a594c785de94640b6b8b28cc6ebd22f4ca64bf6f0ef205d6388cd6e1c5a6761"} Mar 12 17:18:27 crc kubenswrapper[5184]: I0312 17:18:27.338710 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-6kllr" podStartSLOduration=1.812040316 podStartE2EDuration="2.338623731s" podCreationTimestamp="2026-03-12 17:18:25 +0000 UTC" firstStartedPulling="2026-03-12 17:18:26.374884448 +0000 UTC m=+1648.916195797" lastFinishedPulling="2026-03-12 17:18:26.901467853 +0000 UTC m=+1649.442779212" observedRunningTime="2026-03-12 17:18:27.331614364 +0000 UTC m=+1649.872925703" watchObservedRunningTime="2026-03-12 17:18:27.338623731 +0000 UTC m=+1649.879935130" Mar 12 17:18:31 crc kubenswrapper[5184]: I0312 17:18:31.351927 5184 generic.go:358] "Generic (PLEG): container finished" podID="af3748c5-3b07-43f8-a444-fb48032538b0" containerID="4a8ecfbaa4a6bb7adc74b2c985950818c1805fdce64a7d9d5037351a064a5f8b" exitCode=0 Mar 12 17:18:31 crc kubenswrapper[5184]: I0312 17:18:31.352026 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-6kllr" event={"ID":"af3748c5-3b07-43f8-a444-fb48032538b0","Type":"ContainerDied","Data":"4a8ecfbaa4a6bb7adc74b2c985950818c1805fdce64a7d9d5037351a064a5f8b"} Mar 12 17:18:32 crc kubenswrapper[5184]: I0312 17:18:32.907679 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-6kllr" Mar 12 17:18:33 crc kubenswrapper[5184]: I0312 17:18:33.001133 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af3748c5-3b07-43f8-a444-fb48032538b0-inventory\") pod \"af3748c5-3b07-43f8-a444-fb48032538b0\" (UID: \"af3748c5-3b07-43f8-a444-fb48032538b0\") " Mar 12 17:18:33 crc kubenswrapper[5184]: I0312 17:18:33.001299 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/af3748c5-3b07-43f8-a444-fb48032538b0-ssh-key-openstack-edpm-ipam\") pod \"af3748c5-3b07-43f8-a444-fb48032538b0\" (UID: \"af3748c5-3b07-43f8-a444-fb48032538b0\") " Mar 12 17:18:33 crc kubenswrapper[5184]: I0312 17:18:33.001508 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8m2g5\" (UniqueName: \"kubernetes.io/projected/af3748c5-3b07-43f8-a444-fb48032538b0-kube-api-access-8m2g5\") pod \"af3748c5-3b07-43f8-a444-fb48032538b0\" (UID: \"af3748c5-3b07-43f8-a444-fb48032538b0\") " Mar 12 17:18:33 crc kubenswrapper[5184]: I0312 17:18:33.008588 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af3748c5-3b07-43f8-a444-fb48032538b0-kube-api-access-8m2g5" (OuterVolumeSpecName: "kube-api-access-8m2g5") pod "af3748c5-3b07-43f8-a444-fb48032538b0" (UID: "af3748c5-3b07-43f8-a444-fb48032538b0"). InnerVolumeSpecName "kube-api-access-8m2g5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:18:33 crc kubenswrapper[5184]: I0312 17:18:33.047693 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af3748c5-3b07-43f8-a444-fb48032538b0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "af3748c5-3b07-43f8-a444-fb48032538b0" (UID: "af3748c5-3b07-43f8-a444-fb48032538b0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:18:33 crc kubenswrapper[5184]: I0312 17:18:33.058326 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af3748c5-3b07-43f8-a444-fb48032538b0-inventory" (OuterVolumeSpecName: "inventory") pod "af3748c5-3b07-43f8-a444-fb48032538b0" (UID: "af3748c5-3b07-43f8-a444-fb48032538b0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:18:33 crc kubenswrapper[5184]: I0312 17:18:33.103926 5184 reconciler_common.go:299] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af3748c5-3b07-43f8-a444-fb48032538b0-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 17:18:33 crc kubenswrapper[5184]: I0312 17:18:33.103969 5184 reconciler_common.go:299] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/af3748c5-3b07-43f8-a444-fb48032538b0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 17:18:33 crc kubenswrapper[5184]: I0312 17:18:33.103985 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8m2g5\" (UniqueName: \"kubernetes.io/projected/af3748c5-3b07-43f8-a444-fb48032538b0-kube-api-access-8m2g5\") on node \"crc\" DevicePath \"\"" Mar 12 17:18:33 crc kubenswrapper[5184]: I0312 17:18:33.378946 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-6kllr" Mar 12 17:18:33 crc kubenswrapper[5184]: I0312 17:18:33.379111 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-6kllr" event={"ID":"af3748c5-3b07-43f8-a444-fb48032538b0","Type":"ContainerDied","Data":"1a594c785de94640b6b8b28cc6ebd22f4ca64bf6f0ef205d6388cd6e1c5a6761"} Mar 12 17:18:33 crc kubenswrapper[5184]: I0312 17:18:33.379171 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a594c785de94640b6b8b28cc6ebd22f4ca64bf6f0ef205d6388cd6e1c5a6761" Mar 12 17:18:33 crc kubenswrapper[5184]: I0312 17:18:33.465936 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ks5mt"] Mar 12 17:18:33 crc kubenswrapper[5184]: I0312 17:18:33.466970 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="af3748c5-3b07-43f8-a444-fb48032538b0" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 12 17:18:33 crc kubenswrapper[5184]: I0312 17:18:33.466990 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="af3748c5-3b07-43f8-a444-fb48032538b0" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 12 17:18:33 crc kubenswrapper[5184]: I0312 17:18:33.467213 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="af3748c5-3b07-43f8-a444-fb48032538b0" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 12 17:18:33 crc kubenswrapper[5184]: I0312 17:18:33.474320 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ks5mt" Mar 12 17:18:33 crc kubenswrapper[5184]: I0312 17:18:33.477344 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplane-ansible-ssh-private-key-secret\"" Mar 12 17:18:33 crc kubenswrapper[5184]: I0312 17:18:33.477368 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"openstack-edpm-ipam-dockercfg-qr8nl\"" Mar 12 17:18:33 crc kubenswrapper[5184]: I0312 17:18:33.478530 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-aee-default-env\"" Mar 12 17:18:33 crc kubenswrapper[5184]: I0312 17:18:33.478742 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplanenodeset-openstack-edpm-ipam\"" Mar 12 17:18:33 crc kubenswrapper[5184]: I0312 17:18:33.479355 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ks5mt"] Mar 12 17:18:33 crc kubenswrapper[5184]: I0312 17:18:33.614103 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/288f0750-6757-4078-b6b4-5283c4b54d41-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ks5mt\" (UID: \"288f0750-6757-4078-b6b4-5283c4b54d41\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ks5mt" Mar 12 17:18:33 crc kubenswrapper[5184]: I0312 17:18:33.614226 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt76c\" (UniqueName: \"kubernetes.io/projected/288f0750-6757-4078-b6b4-5283c4b54d41-kube-api-access-jt76c\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ks5mt\" (UID: \"288f0750-6757-4078-b6b4-5283c4b54d41\") " 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ks5mt" Mar 12 17:18:33 crc kubenswrapper[5184]: I0312 17:18:33.614310 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/288f0750-6757-4078-b6b4-5283c4b54d41-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ks5mt\" (UID: \"288f0750-6757-4078-b6b4-5283c4b54d41\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ks5mt" Mar 12 17:18:33 crc kubenswrapper[5184]: I0312 17:18:33.716477 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/288f0750-6757-4078-b6b4-5283c4b54d41-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ks5mt\" (UID: \"288f0750-6757-4078-b6b4-5283c4b54d41\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ks5mt" Mar 12 17:18:33 crc kubenswrapper[5184]: I0312 17:18:33.716560 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/288f0750-6757-4078-b6b4-5283c4b54d41-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ks5mt\" (UID: \"288f0750-6757-4078-b6b4-5283c4b54d41\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ks5mt" Mar 12 17:18:33 crc kubenswrapper[5184]: I0312 17:18:33.716640 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jt76c\" (UniqueName: \"kubernetes.io/projected/288f0750-6757-4078-b6b4-5283c4b54d41-kube-api-access-jt76c\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ks5mt\" (UID: \"288f0750-6757-4078-b6b4-5283c4b54d41\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ks5mt" Mar 12 17:18:33 crc kubenswrapper[5184]: I0312 17:18:33.720144 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/288f0750-6757-4078-b6b4-5283c4b54d41-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ks5mt\" (UID: \"288f0750-6757-4078-b6b4-5283c4b54d41\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ks5mt" Mar 12 17:18:33 crc kubenswrapper[5184]: I0312 17:18:33.721402 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/288f0750-6757-4078-b6b4-5283c4b54d41-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ks5mt\" (UID: \"288f0750-6757-4078-b6b4-5283c4b54d41\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ks5mt" Mar 12 17:18:33 crc kubenswrapper[5184]: I0312 17:18:33.738109 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt76c\" (UniqueName: \"kubernetes.io/projected/288f0750-6757-4078-b6b4-5283c4b54d41-kube-api-access-jt76c\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ks5mt\" (UID: \"288f0750-6757-4078-b6b4-5283c4b54d41\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ks5mt" Mar 12 17:18:33 crc kubenswrapper[5184]: I0312 17:18:33.828695 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ks5mt" Mar 12 17:18:34 crc kubenswrapper[5184]: I0312 17:18:34.416409 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ks5mt"] Mar 12 17:18:35 crc kubenswrapper[5184]: I0312 17:18:35.403084 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ks5mt" event={"ID":"288f0750-6757-4078-b6b4-5283c4b54d41","Type":"ContainerStarted","Data":"4fe4c71effac8b36bf12610ae78c150d7ed14bcd984524b82cf81030f1788394"} Mar 12 17:18:35 crc kubenswrapper[5184]: I0312 17:18:35.403444 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ks5mt" event={"ID":"288f0750-6757-4078-b6b4-5283c4b54d41","Type":"ContainerStarted","Data":"0d6d788ea3283d659e469f63b6dce0fac12d503a873e5a3f5fa8284d3226719f"} Mar 12 17:18:35 crc kubenswrapper[5184]: I0312 17:18:35.425751 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ks5mt" podStartSLOduration=2.023568765 podStartE2EDuration="2.425733086s" podCreationTimestamp="2026-03-12 17:18:33 +0000 UTC" firstStartedPulling="2026-03-12 17:18:34.419495682 +0000 UTC m=+1656.960807031" lastFinishedPulling="2026-03-12 17:18:34.821659973 +0000 UTC m=+1657.362971352" observedRunningTime="2026-03-12 17:18:35.420467332 +0000 UTC m=+1657.961778671" watchObservedRunningTime="2026-03-12 17:18:35.425733086 +0000 UTC m=+1657.967044425" Mar 12 17:18:45 crc kubenswrapper[5184]: I0312 17:18:45.067154 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-vtzf9"] Mar 12 17:18:45 crc kubenswrapper[5184]: I0312 17:18:45.079994 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-vtzf9"] Mar 12 17:18:46 crc kubenswrapper[5184]: I0312 17:18:46.417847 
5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59bc23c2-fe37-43e7-a1a9-2830892902bf" path="/var/lib/kubelet/pods/59bc23c2-fe37-43e7-a1a9-2830892902bf/volumes" Mar 12 17:18:49 crc kubenswrapper[5184]: E0312 17:18:49.159754 5184 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1792756 actualBytes=10240 Mar 12 17:18:53 crc kubenswrapper[5184]: I0312 17:18:53.047659 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-mvj44"] Mar 12 17:18:53 crc kubenswrapper[5184]: I0312 17:18:53.063079 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-mvj44"] Mar 12 17:18:54 crc kubenswrapper[5184]: I0312 17:18:54.416329 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9bd8488-49bb-48df-8f41-f415f71a2834" path="/var/lib/kubelet/pods/a9bd8488-49bb-48df-8f41-f415f71a2834/volumes" Mar 12 17:19:00 crc kubenswrapper[5184]: I0312 17:19:00.061909 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-9n4jh"] Mar 12 17:19:00 crc kubenswrapper[5184]: I0312 17:19:00.071702 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-9n4jh"] Mar 12 17:19:00 crc kubenswrapper[5184]: I0312 17:19:00.414160 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7d8d368-22c0-41a1-972c-d8f7c14db7b5" path="/var/lib/kubelet/pods/c7d8d368-22c0-41a1-972c-d8f7c14db7b5/volumes" Mar 12 17:19:02 crc kubenswrapper[5184]: I0312 17:19:02.039145 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-jb66b"] Mar 12 17:19:02 crc kubenswrapper[5184]: I0312 17:19:02.050479 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-jb66b"] Mar 12 17:19:02 crc kubenswrapper[5184]: I0312 17:19:02.419363 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c" path="/var/lib/kubelet/pods/d1b1f1ba-a8e3-4079-9319-5c1daeb3cc5c/volumes" Mar 12 17:19:12 crc kubenswrapper[5184]: I0312 17:19:12.053387 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-gzjgn"] Mar 12 17:19:12 crc kubenswrapper[5184]: I0312 17:19:12.064077 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-gzjgn"] Mar 12 17:19:12 crc kubenswrapper[5184]: I0312 17:19:12.416453 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ed3fdf4-b869-4ef8-b746-afe1e52fe286" path="/var/lib/kubelet/pods/7ed3fdf4-b869-4ef8-b746-afe1e52fe286/volumes" Mar 12 17:19:19 crc kubenswrapper[5184]: I0312 17:19:19.634126 5184 scope.go:117] "RemoveContainer" containerID="a4f5d3d4c700f2d9a238c75ab41843bc64c0d8f971ab4ddd7ceced4d12313fac" Mar 12 17:19:19 crc kubenswrapper[5184]: I0312 17:19:19.668538 5184 scope.go:117] "RemoveContainer" containerID="c54692a79c5db321094440b5acb8b5c54fd9f4751d54741eb605e33289007cca" Mar 12 17:19:19 crc kubenswrapper[5184]: I0312 17:19:19.701152 5184 scope.go:117] "RemoveContainer" containerID="3c8cc2178ecb3ec4d8690b45d2bc35f21e9f8b1578a6c5175cf7f61befac352a" Mar 12 17:19:19 crc kubenswrapper[5184]: I0312 17:19:19.777111 5184 scope.go:117] "RemoveContainer" containerID="45049197c84547cb9b08115ad2142cab8b478837d30511a82dba357efd5b8ef0" Mar 12 17:19:19 crc kubenswrapper[5184]: I0312 17:19:19.844672 5184 scope.go:117] "RemoveContainer" containerID="27fa142bf68698e87c9b60da45f68b7ca5810536c39991c80fd7de3a9b2f6aab" Mar 12 17:19:19 crc kubenswrapper[5184]: I0312 17:19:19.926712 5184 generic.go:358] "Generic (PLEG): container finished" podID="288f0750-6757-4078-b6b4-5283c4b54d41" containerID="4fe4c71effac8b36bf12610ae78c150d7ed14bcd984524b82cf81030f1788394" exitCode=0 Mar 12 17:19:19 crc kubenswrapper[5184]: I0312 17:19:19.926803 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ks5mt" event={"ID":"288f0750-6757-4078-b6b4-5283c4b54d41","Type":"ContainerDied","Data":"4fe4c71effac8b36bf12610ae78c150d7ed14bcd984524b82cf81030f1788394"} Mar 12 17:19:20 crc kubenswrapper[5184]: I0312 17:19:20.742620 5184 patch_prober.go:28] interesting pod/machine-config-daemon-cp7pt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 17:19:20 crc kubenswrapper[5184]: I0312 17:19:20.743760 5184 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 17:19:21 crc kubenswrapper[5184]: I0312 17:19:21.440149 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ks5mt" Mar 12 17:19:21 crc kubenswrapper[5184]: I0312 17:19:21.542136 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/288f0750-6757-4078-b6b4-5283c4b54d41-ssh-key-openstack-edpm-ipam\") pod \"288f0750-6757-4078-b6b4-5283c4b54d41\" (UID: \"288f0750-6757-4078-b6b4-5283c4b54d41\") " Mar 12 17:19:21 crc kubenswrapper[5184]: I0312 17:19:21.542226 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/288f0750-6757-4078-b6b4-5283c4b54d41-inventory\") pod \"288f0750-6757-4078-b6b4-5283c4b54d41\" (UID: \"288f0750-6757-4078-b6b4-5283c4b54d41\") " Mar 12 17:19:21 crc kubenswrapper[5184]: I0312 17:19:21.542334 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jt76c\" (UniqueName: \"kubernetes.io/projected/288f0750-6757-4078-b6b4-5283c4b54d41-kube-api-access-jt76c\") pod \"288f0750-6757-4078-b6b4-5283c4b54d41\" (UID: \"288f0750-6757-4078-b6b4-5283c4b54d41\") " Mar 12 17:19:21 crc kubenswrapper[5184]: I0312 17:19:21.553629 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/288f0750-6757-4078-b6b4-5283c4b54d41-kube-api-access-jt76c" (OuterVolumeSpecName: "kube-api-access-jt76c") pod "288f0750-6757-4078-b6b4-5283c4b54d41" (UID: "288f0750-6757-4078-b6b4-5283c4b54d41"). InnerVolumeSpecName "kube-api-access-jt76c". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:19:21 crc kubenswrapper[5184]: I0312 17:19:21.570078 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/288f0750-6757-4078-b6b4-5283c4b54d41-inventory" (OuterVolumeSpecName: "inventory") pod "288f0750-6757-4078-b6b4-5283c4b54d41" (UID: "288f0750-6757-4078-b6b4-5283c4b54d41"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:19:21 crc kubenswrapper[5184]: I0312 17:19:21.588243 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/288f0750-6757-4078-b6b4-5283c4b54d41-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "288f0750-6757-4078-b6b4-5283c4b54d41" (UID: "288f0750-6757-4078-b6b4-5283c4b54d41"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:19:21 crc kubenswrapper[5184]: I0312 17:19:21.646229 5184 reconciler_common.go:299] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/288f0750-6757-4078-b6b4-5283c4b54d41-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 17:19:21 crc kubenswrapper[5184]: I0312 17:19:21.646274 5184 reconciler_common.go:299] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/288f0750-6757-4078-b6b4-5283c4b54d41-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 17:19:21 crc kubenswrapper[5184]: I0312 17:19:21.646287 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jt76c\" (UniqueName: \"kubernetes.io/projected/288f0750-6757-4078-b6b4-5283c4b54d41-kube-api-access-jt76c\") on node \"crc\" DevicePath \"\"" Mar 12 17:19:21 crc kubenswrapper[5184]: I0312 17:19:21.955932 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ks5mt" Mar 12 17:19:21 crc kubenswrapper[5184]: I0312 17:19:21.955991 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ks5mt" event={"ID":"288f0750-6757-4078-b6b4-5283c4b54d41","Type":"ContainerDied","Data":"0d6d788ea3283d659e469f63b6dce0fac12d503a873e5a3f5fa8284d3226719f"} Mar 12 17:19:21 crc kubenswrapper[5184]: I0312 17:19:21.956053 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d6d788ea3283d659e469f63b6dce0fac12d503a873e5a3f5fa8284d3226719f" Mar 12 17:19:22 crc kubenswrapper[5184]: I0312 17:19:22.055033 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-xbm7s"] Mar 12 17:19:22 crc kubenswrapper[5184]: I0312 17:19:22.056657 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="288f0750-6757-4078-b6b4-5283c4b54d41" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 12 17:19:22 crc kubenswrapper[5184]: I0312 17:19:22.056687 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="288f0750-6757-4078-b6b4-5283c4b54d41" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 12 17:19:22 crc kubenswrapper[5184]: I0312 17:19:22.057052 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="288f0750-6757-4078-b6b4-5283c4b54d41" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 12 17:19:22 crc kubenswrapper[5184]: I0312 17:19:22.066931 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-xbm7s" Mar 12 17:19:22 crc kubenswrapper[5184]: I0312 17:19:22.076166 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplane-ansible-ssh-private-key-secret\"" Mar 12 17:19:22 crc kubenswrapper[5184]: I0312 17:19:22.076359 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"openstack-edpm-ipam-dockercfg-qr8nl\"" Mar 12 17:19:22 crc kubenswrapper[5184]: I0312 17:19:22.076644 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-aee-default-env\"" Mar 12 17:19:22 crc kubenswrapper[5184]: I0312 17:19:22.076888 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplanenodeset-openstack-edpm-ipam\"" Mar 12 17:19:22 crc kubenswrapper[5184]: I0312 17:19:22.086947 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-xbm7s"] Mar 12 17:19:22 crc kubenswrapper[5184]: I0312 17:19:22.155956 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzhlm\" (UniqueName: \"kubernetes.io/projected/3fc025dd-36df-47a5-9728-667e76180934-kube-api-access-bzhlm\") pod \"ssh-known-hosts-edpm-deployment-xbm7s\" (UID: \"3fc025dd-36df-47a5-9728-667e76180934\") " pod="openstack/ssh-known-hosts-edpm-deployment-xbm7s" Mar 12 17:19:22 crc kubenswrapper[5184]: I0312 17:19:22.156020 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3fc025dd-36df-47a5-9728-667e76180934-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-xbm7s\" (UID: \"3fc025dd-36df-47a5-9728-667e76180934\") " pod="openstack/ssh-known-hosts-edpm-deployment-xbm7s" Mar 12 17:19:22 crc kubenswrapper[5184]: I0312 17:19:22.156116 5184 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3fc025dd-36df-47a5-9728-667e76180934-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-xbm7s\" (UID: \"3fc025dd-36df-47a5-9728-667e76180934\") " pod="openstack/ssh-known-hosts-edpm-deployment-xbm7s" Mar 12 17:19:22 crc kubenswrapper[5184]: I0312 17:19:22.257975 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bzhlm\" (UniqueName: \"kubernetes.io/projected/3fc025dd-36df-47a5-9728-667e76180934-kube-api-access-bzhlm\") pod \"ssh-known-hosts-edpm-deployment-xbm7s\" (UID: \"3fc025dd-36df-47a5-9728-667e76180934\") " pod="openstack/ssh-known-hosts-edpm-deployment-xbm7s" Mar 12 17:19:22 crc kubenswrapper[5184]: I0312 17:19:22.258078 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3fc025dd-36df-47a5-9728-667e76180934-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-xbm7s\" (UID: \"3fc025dd-36df-47a5-9728-667e76180934\") " pod="openstack/ssh-known-hosts-edpm-deployment-xbm7s" Mar 12 17:19:22 crc kubenswrapper[5184]: I0312 17:19:22.258175 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3fc025dd-36df-47a5-9728-667e76180934-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-xbm7s\" (UID: \"3fc025dd-36df-47a5-9728-667e76180934\") " pod="openstack/ssh-known-hosts-edpm-deployment-xbm7s" Mar 12 17:19:22 crc kubenswrapper[5184]: I0312 17:19:22.265885 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3fc025dd-36df-47a5-9728-667e76180934-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-xbm7s\" (UID: \"3fc025dd-36df-47a5-9728-667e76180934\") " pod="openstack/ssh-known-hosts-edpm-deployment-xbm7s" Mar 12 
17:19:22 crc kubenswrapper[5184]: I0312 17:19:22.266300 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3fc025dd-36df-47a5-9728-667e76180934-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-xbm7s\" (UID: \"3fc025dd-36df-47a5-9728-667e76180934\") " pod="openstack/ssh-known-hosts-edpm-deployment-xbm7s" Mar 12 17:19:22 crc kubenswrapper[5184]: I0312 17:19:22.295784 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzhlm\" (UniqueName: \"kubernetes.io/projected/3fc025dd-36df-47a5-9728-667e76180934-kube-api-access-bzhlm\") pod \"ssh-known-hosts-edpm-deployment-xbm7s\" (UID: \"3fc025dd-36df-47a5-9728-667e76180934\") " pod="openstack/ssh-known-hosts-edpm-deployment-xbm7s" Mar 12 17:19:22 crc kubenswrapper[5184]: I0312 17:19:22.402310 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-xbm7s" Mar 12 17:19:22 crc kubenswrapper[5184]: I0312 17:19:22.755371 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-xbm7s"] Mar 12 17:19:22 crc kubenswrapper[5184]: I0312 17:19:22.968752 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-xbm7s" event={"ID":"3fc025dd-36df-47a5-9728-667e76180934","Type":"ContainerStarted","Data":"1350eaa4759dec59a5bbc36bc0dc1f7e409c6be56ead9bd80b0cf4fb92cf5d0c"} Mar 12 17:19:23 crc kubenswrapper[5184]: I0312 17:19:23.985901 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-xbm7s" event={"ID":"3fc025dd-36df-47a5-9728-667e76180934","Type":"ContainerStarted","Data":"9e20346531f894d4e2afcb6ab4af6d9f7893752b678f5820809bf44e0866ffce"} Mar 12 17:19:24 crc kubenswrapper[5184]: I0312 17:19:24.013069 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ssh-known-hosts-edpm-deployment-xbm7s" podStartSLOduration=1.615539681 podStartE2EDuration="2.013050987s" podCreationTimestamp="2026-03-12 17:19:22 +0000 UTC" firstStartedPulling="2026-03-12 17:19:22.761587406 +0000 UTC m=+1705.302898745" lastFinishedPulling="2026-03-12 17:19:23.159098682 +0000 UTC m=+1705.700410051" observedRunningTime="2026-03-12 17:19:24.012063107 +0000 UTC m=+1706.553374446" watchObservedRunningTime="2026-03-12 17:19:24.013050987 +0000 UTC m=+1706.554362326" Mar 12 17:19:30 crc kubenswrapper[5184]: I0312 17:19:30.060476 5184 generic.go:358] "Generic (PLEG): container finished" podID="3fc025dd-36df-47a5-9728-667e76180934" containerID="9e20346531f894d4e2afcb6ab4af6d9f7893752b678f5820809bf44e0866ffce" exitCode=0 Mar 12 17:19:30 crc kubenswrapper[5184]: I0312 17:19:30.060608 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-xbm7s" event={"ID":"3fc025dd-36df-47a5-9728-667e76180934","Type":"ContainerDied","Data":"9e20346531f894d4e2afcb6ab4af6d9f7893752b678f5820809bf44e0866ffce"} Mar 12 17:19:31 crc kubenswrapper[5184]: I0312 17:19:31.564122 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-xbm7s" Mar 12 17:19:31 crc kubenswrapper[5184]: I0312 17:19:31.672646 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3fc025dd-36df-47a5-9728-667e76180934-inventory-0\") pod \"3fc025dd-36df-47a5-9728-667e76180934\" (UID: \"3fc025dd-36df-47a5-9728-667e76180934\") " Mar 12 17:19:31 crc kubenswrapper[5184]: I0312 17:19:31.672756 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3fc025dd-36df-47a5-9728-667e76180934-ssh-key-openstack-edpm-ipam\") pod \"3fc025dd-36df-47a5-9728-667e76180934\" (UID: \"3fc025dd-36df-47a5-9728-667e76180934\") " Mar 12 17:19:31 crc kubenswrapper[5184]: I0312 17:19:31.672856 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzhlm\" (UniqueName: \"kubernetes.io/projected/3fc025dd-36df-47a5-9728-667e76180934-kube-api-access-bzhlm\") pod \"3fc025dd-36df-47a5-9728-667e76180934\" (UID: \"3fc025dd-36df-47a5-9728-667e76180934\") " Mar 12 17:19:31 crc kubenswrapper[5184]: I0312 17:19:31.678037 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fc025dd-36df-47a5-9728-667e76180934-kube-api-access-bzhlm" (OuterVolumeSpecName: "kube-api-access-bzhlm") pod "3fc025dd-36df-47a5-9728-667e76180934" (UID: "3fc025dd-36df-47a5-9728-667e76180934"). InnerVolumeSpecName "kube-api-access-bzhlm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:19:31 crc kubenswrapper[5184]: I0312 17:19:31.699184 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fc025dd-36df-47a5-9728-667e76180934-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "3fc025dd-36df-47a5-9728-667e76180934" (UID: "3fc025dd-36df-47a5-9728-667e76180934"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:19:31 crc kubenswrapper[5184]: I0312 17:19:31.712687 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fc025dd-36df-47a5-9728-667e76180934-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3fc025dd-36df-47a5-9728-667e76180934" (UID: "3fc025dd-36df-47a5-9728-667e76180934"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:19:31 crc kubenswrapper[5184]: I0312 17:19:31.774717 5184 reconciler_common.go:299] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3fc025dd-36df-47a5-9728-667e76180934-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 12 17:19:31 crc kubenswrapper[5184]: I0312 17:19:31.774771 5184 reconciler_common.go:299] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3fc025dd-36df-47a5-9728-667e76180934-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 17:19:31 crc kubenswrapper[5184]: I0312 17:19:31.774781 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bzhlm\" (UniqueName: \"kubernetes.io/projected/3fc025dd-36df-47a5-9728-667e76180934-kube-api-access-bzhlm\") on node \"crc\" DevicePath \"\"" Mar 12 17:19:32 crc kubenswrapper[5184]: I0312 17:19:32.085080 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-xbm7s" Mar 12 17:19:32 crc kubenswrapper[5184]: I0312 17:19:32.085644 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-xbm7s" event={"ID":"3fc025dd-36df-47a5-9728-667e76180934","Type":"ContainerDied","Data":"1350eaa4759dec59a5bbc36bc0dc1f7e409c6be56ead9bd80b0cf4fb92cf5d0c"} Mar 12 17:19:32 crc kubenswrapper[5184]: I0312 17:19:32.085850 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1350eaa4759dec59a5bbc36bc0dc1f7e409c6be56ead9bd80b0cf4fb92cf5d0c" Mar 12 17:19:32 crc kubenswrapper[5184]: I0312 17:19:32.161366 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-7xxcc"] Mar 12 17:19:32 crc kubenswrapper[5184]: I0312 17:19:32.163678 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3fc025dd-36df-47a5-9728-667e76180934" containerName="ssh-known-hosts-edpm-deployment" Mar 12 17:19:32 crc kubenswrapper[5184]: I0312 17:19:32.163849 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fc025dd-36df-47a5-9728-667e76180934" containerName="ssh-known-hosts-edpm-deployment" Mar 12 17:19:32 crc kubenswrapper[5184]: I0312 17:19:32.164269 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="3fc025dd-36df-47a5-9728-667e76180934" containerName="ssh-known-hosts-edpm-deployment" Mar 12 17:19:32 crc kubenswrapper[5184]: I0312 17:19:32.172191 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7xxcc" Mar 12 17:19:32 crc kubenswrapper[5184]: I0312 17:19:32.176325 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-7xxcc"] Mar 12 17:19:32 crc kubenswrapper[5184]: I0312 17:19:32.177762 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-aee-default-env\"" Mar 12 17:19:32 crc kubenswrapper[5184]: I0312 17:19:32.177827 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"openstack-edpm-ipam-dockercfg-qr8nl\"" Mar 12 17:19:32 crc kubenswrapper[5184]: I0312 17:19:32.177855 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplane-ansible-ssh-private-key-secret\"" Mar 12 17:19:32 crc kubenswrapper[5184]: I0312 17:19:32.179118 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplanenodeset-openstack-edpm-ipam\"" Mar 12 17:19:32 crc kubenswrapper[5184]: I0312 17:19:32.285341 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4x7w\" (UniqueName: \"kubernetes.io/projected/cfe56c8f-5774-4342-8770-c872afb09c60-kube-api-access-w4x7w\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7xxcc\" (UID: \"cfe56c8f-5774-4342-8770-c872afb09c60\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7xxcc" Mar 12 17:19:32 crc kubenswrapper[5184]: I0312 17:19:32.285652 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cfe56c8f-5774-4342-8770-c872afb09c60-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7xxcc\" (UID: \"cfe56c8f-5774-4342-8770-c872afb09c60\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7xxcc" Mar 12 
17:19:32 crc kubenswrapper[5184]: I0312 17:19:32.285857 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cfe56c8f-5774-4342-8770-c872afb09c60-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7xxcc\" (UID: \"cfe56c8f-5774-4342-8770-c872afb09c60\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7xxcc" Mar 12 17:19:32 crc kubenswrapper[5184]: I0312 17:19:32.388494 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cfe56c8f-5774-4342-8770-c872afb09c60-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7xxcc\" (UID: \"cfe56c8f-5774-4342-8770-c872afb09c60\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7xxcc" Mar 12 17:19:32 crc kubenswrapper[5184]: I0312 17:19:32.388737 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cfe56c8f-5774-4342-8770-c872afb09c60-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7xxcc\" (UID: \"cfe56c8f-5774-4342-8770-c872afb09c60\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7xxcc" Mar 12 17:19:32 crc kubenswrapper[5184]: I0312 17:19:32.388834 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w4x7w\" (UniqueName: \"kubernetes.io/projected/cfe56c8f-5774-4342-8770-c872afb09c60-kube-api-access-w4x7w\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7xxcc\" (UID: \"cfe56c8f-5774-4342-8770-c872afb09c60\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7xxcc" Mar 12 17:19:32 crc kubenswrapper[5184]: I0312 17:19:32.394297 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/cfe56c8f-5774-4342-8770-c872afb09c60-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7xxcc\" (UID: \"cfe56c8f-5774-4342-8770-c872afb09c60\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7xxcc" Mar 12 17:19:32 crc kubenswrapper[5184]: I0312 17:19:32.394484 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cfe56c8f-5774-4342-8770-c872afb09c60-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7xxcc\" (UID: \"cfe56c8f-5774-4342-8770-c872afb09c60\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7xxcc" Mar 12 17:19:32 crc kubenswrapper[5184]: I0312 17:19:32.412680 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4x7w\" (UniqueName: \"kubernetes.io/projected/cfe56c8f-5774-4342-8770-c872afb09c60-kube-api-access-w4x7w\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7xxcc\" (UID: \"cfe56c8f-5774-4342-8770-c872afb09c60\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7xxcc" Mar 12 17:19:32 crc kubenswrapper[5184]: I0312 17:19:32.501249 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7xxcc" Mar 12 17:19:33 crc kubenswrapper[5184]: I0312 17:19:33.116131 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-7xxcc"] Mar 12 17:19:33 crc kubenswrapper[5184]: W0312 17:19:33.123567 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcfe56c8f_5774_4342_8770_c872afb09c60.slice/crio-8f5a789e7ce1c208b739b531676321166ed038c1ef6ae11a5b25a206daeb98b0 WatchSource:0}: Error finding container 8f5a789e7ce1c208b739b531676321166ed038c1ef6ae11a5b25a206daeb98b0: Status 404 returned error can't find the container with id 8f5a789e7ce1c208b739b531676321166ed038c1ef6ae11a5b25a206daeb98b0 Mar 12 17:19:34 crc kubenswrapper[5184]: I0312 17:19:34.109306 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7xxcc" event={"ID":"cfe56c8f-5774-4342-8770-c872afb09c60","Type":"ContainerStarted","Data":"51d07890ed1e815b5781924678b666d5a1b887ec9468b7c83d21854da94cb1b9"} Mar 12 17:19:34 crc kubenswrapper[5184]: I0312 17:19:34.110052 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7xxcc" event={"ID":"cfe56c8f-5774-4342-8770-c872afb09c60","Type":"ContainerStarted","Data":"8f5a789e7ce1c208b739b531676321166ed038c1ef6ae11a5b25a206daeb98b0"} Mar 12 17:19:34 crc kubenswrapper[5184]: I0312 17:19:34.138529 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7xxcc" podStartSLOduration=1.477693498 podStartE2EDuration="2.138508545s" podCreationTimestamp="2026-03-12 17:19:32 +0000 UTC" firstStartedPulling="2026-03-12 17:19:33.124841641 +0000 UTC m=+1715.666152980" lastFinishedPulling="2026-03-12 17:19:33.785656668 +0000 UTC m=+1716.326968027" observedRunningTime="2026-03-12 
17:19:34.129868686 +0000 UTC m=+1716.671180085" watchObservedRunningTime="2026-03-12 17:19:34.138508545 +0000 UTC m=+1716.679819884" Mar 12 17:19:42 crc kubenswrapper[5184]: I0312 17:19:42.218467 5184 generic.go:358] "Generic (PLEG): container finished" podID="cfe56c8f-5774-4342-8770-c872afb09c60" containerID="51d07890ed1e815b5781924678b666d5a1b887ec9468b7c83d21854da94cb1b9" exitCode=0 Mar 12 17:19:42 crc kubenswrapper[5184]: I0312 17:19:42.218589 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7xxcc" event={"ID":"cfe56c8f-5774-4342-8770-c872afb09c60","Type":"ContainerDied","Data":"51d07890ed1e815b5781924678b666d5a1b887ec9468b7c83d21854da94cb1b9"} Mar 12 17:19:43 crc kubenswrapper[5184]: I0312 17:19:43.716977 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7xxcc" Mar 12 17:19:43 crc kubenswrapper[5184]: I0312 17:19:43.901560 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cfe56c8f-5774-4342-8770-c872afb09c60-inventory\") pod \"cfe56c8f-5774-4342-8770-c872afb09c60\" (UID: \"cfe56c8f-5774-4342-8770-c872afb09c60\") " Mar 12 17:19:43 crc kubenswrapper[5184]: I0312 17:19:43.901793 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4x7w\" (UniqueName: \"kubernetes.io/projected/cfe56c8f-5774-4342-8770-c872afb09c60-kube-api-access-w4x7w\") pod \"cfe56c8f-5774-4342-8770-c872afb09c60\" (UID: \"cfe56c8f-5774-4342-8770-c872afb09c60\") " Mar 12 17:19:43 crc kubenswrapper[5184]: I0312 17:19:43.901940 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cfe56c8f-5774-4342-8770-c872afb09c60-ssh-key-openstack-edpm-ipam\") pod \"cfe56c8f-5774-4342-8770-c872afb09c60\" (UID: 
\"cfe56c8f-5774-4342-8770-c872afb09c60\") " Mar 12 17:19:43 crc kubenswrapper[5184]: I0312 17:19:43.909219 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfe56c8f-5774-4342-8770-c872afb09c60-kube-api-access-w4x7w" (OuterVolumeSpecName: "kube-api-access-w4x7w") pod "cfe56c8f-5774-4342-8770-c872afb09c60" (UID: "cfe56c8f-5774-4342-8770-c872afb09c60"). InnerVolumeSpecName "kube-api-access-w4x7w". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:19:43 crc kubenswrapper[5184]: I0312 17:19:43.959129 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d7khg"] Mar 12 17:19:43 crc kubenswrapper[5184]: I0312 17:19:43.960232 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cfe56c8f-5774-4342-8770-c872afb09c60" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 12 17:19:43 crc kubenswrapper[5184]: I0312 17:19:43.960257 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfe56c8f-5774-4342-8770-c872afb09c60" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 12 17:19:43 crc kubenswrapper[5184]: I0312 17:19:43.960475 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="cfe56c8f-5774-4342-8770-c872afb09c60" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 12 17:19:43 crc kubenswrapper[5184]: I0312 17:19:43.964031 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfe56c8f-5774-4342-8770-c872afb09c60-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "cfe56c8f-5774-4342-8770-c872afb09c60" (UID: "cfe56c8f-5774-4342-8770-c872afb09c60"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:19:43 crc kubenswrapper[5184]: I0312 17:19:43.968439 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d7khg" Mar 12 17:19:43 crc kubenswrapper[5184]: I0312 17:19:43.972357 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d7khg"] Mar 12 17:19:43 crc kubenswrapper[5184]: I0312 17:19:43.981331 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfe56c8f-5774-4342-8770-c872afb09c60-inventory" (OuterVolumeSpecName: "inventory") pod "cfe56c8f-5774-4342-8770-c872afb09c60" (UID: "cfe56c8f-5774-4342-8770-c872afb09c60"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:19:44 crc kubenswrapper[5184]: I0312 17:19:44.005830 5184 reconciler_common.go:299] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cfe56c8f-5774-4342-8770-c872afb09c60-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 17:19:44 crc kubenswrapper[5184]: I0312 17:19:44.005878 5184 reconciler_common.go:299] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cfe56c8f-5774-4342-8770-c872afb09c60-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 17:19:44 crc kubenswrapper[5184]: I0312 17:19:44.005896 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w4x7w\" (UniqueName: \"kubernetes.io/projected/cfe56c8f-5774-4342-8770-c872afb09c60-kube-api-access-w4x7w\") on node \"crc\" DevicePath \"\"" Mar 12 17:19:44 crc kubenswrapper[5184]: I0312 17:19:44.108119 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mjzb\" (UniqueName: \"kubernetes.io/projected/70334f03-e2cb-4554-af53-80962e9f00bb-kube-api-access-4mjzb\") pod \"community-operators-d7khg\" (UID: \"70334f03-e2cb-4554-af53-80962e9f00bb\") " pod="openshift-marketplace/community-operators-d7khg" Mar 12 17:19:44 crc kubenswrapper[5184]: I0312 17:19:44.108211 5184 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70334f03-e2cb-4554-af53-80962e9f00bb-utilities\") pod \"community-operators-d7khg\" (UID: \"70334f03-e2cb-4554-af53-80962e9f00bb\") " pod="openshift-marketplace/community-operators-d7khg" Mar 12 17:19:44 crc kubenswrapper[5184]: I0312 17:19:44.108315 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70334f03-e2cb-4554-af53-80962e9f00bb-catalog-content\") pod \"community-operators-d7khg\" (UID: \"70334f03-e2cb-4554-af53-80962e9f00bb\") " pod="openshift-marketplace/community-operators-d7khg" Mar 12 17:19:44 crc kubenswrapper[5184]: I0312 17:19:44.209903 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70334f03-e2cb-4554-af53-80962e9f00bb-catalog-content\") pod \"community-operators-d7khg\" (UID: \"70334f03-e2cb-4554-af53-80962e9f00bb\") " pod="openshift-marketplace/community-operators-d7khg" Mar 12 17:19:44 crc kubenswrapper[5184]: I0312 17:19:44.210338 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4mjzb\" (UniqueName: \"kubernetes.io/projected/70334f03-e2cb-4554-af53-80962e9f00bb-kube-api-access-4mjzb\") pod \"community-operators-d7khg\" (UID: \"70334f03-e2cb-4554-af53-80962e9f00bb\") " pod="openshift-marketplace/community-operators-d7khg" Mar 12 17:19:44 crc kubenswrapper[5184]: I0312 17:19:44.210528 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70334f03-e2cb-4554-af53-80962e9f00bb-utilities\") pod \"community-operators-d7khg\" (UID: \"70334f03-e2cb-4554-af53-80962e9f00bb\") " pod="openshift-marketplace/community-operators-d7khg" Mar 12 17:19:44 crc kubenswrapper[5184]: I0312 
17:19:44.211225 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70334f03-e2cb-4554-af53-80962e9f00bb-utilities\") pod \"community-operators-d7khg\" (UID: \"70334f03-e2cb-4554-af53-80962e9f00bb\") " pod="openshift-marketplace/community-operators-d7khg" Mar 12 17:19:44 crc kubenswrapper[5184]: I0312 17:19:44.211666 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70334f03-e2cb-4554-af53-80962e9f00bb-catalog-content\") pod \"community-operators-d7khg\" (UID: \"70334f03-e2cb-4554-af53-80962e9f00bb\") " pod="openshift-marketplace/community-operators-d7khg" Mar 12 17:19:44 crc kubenswrapper[5184]: I0312 17:19:44.230729 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mjzb\" (UniqueName: \"kubernetes.io/projected/70334f03-e2cb-4554-af53-80962e9f00bb-kube-api-access-4mjzb\") pod \"community-operators-d7khg\" (UID: \"70334f03-e2cb-4554-af53-80962e9f00bb\") " pod="openshift-marketplace/community-operators-d7khg" Mar 12 17:19:44 crc kubenswrapper[5184]: I0312 17:19:44.243526 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7xxcc" event={"ID":"cfe56c8f-5774-4342-8770-c872afb09c60","Type":"ContainerDied","Data":"8f5a789e7ce1c208b739b531676321166ed038c1ef6ae11a5b25a206daeb98b0"} Mar 12 17:19:44 crc kubenswrapper[5184]: I0312 17:19:44.243596 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f5a789e7ce1c208b739b531676321166ed038c1ef6ae11a5b25a206daeb98b0" Mar 12 17:19:44 crc kubenswrapper[5184]: I0312 17:19:44.243566 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7xxcc" Mar 12 17:19:44 crc kubenswrapper[5184]: I0312 17:19:44.326513 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hrp4d"] Mar 12 17:19:44 crc kubenswrapper[5184]: I0312 17:19:44.333772 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hrp4d" Mar 12 17:19:44 crc kubenswrapper[5184]: I0312 17:19:44.336061 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplanenodeset-openstack-edpm-ipam\"" Mar 12 17:19:44 crc kubenswrapper[5184]: I0312 17:19:44.336177 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hrp4d"] Mar 12 17:19:44 crc kubenswrapper[5184]: I0312 17:19:44.336191 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"dataplane-ansible-ssh-private-key-secret\"" Mar 12 17:19:44 crc kubenswrapper[5184]: I0312 17:19:44.339562 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openstack\"/\"openstack-aee-default-env\"" Mar 12 17:19:44 crc kubenswrapper[5184]: I0312 17:19:44.339790 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openstack\"/\"openstack-edpm-ipam-dockercfg-qr8nl\"" Mar 12 17:19:44 crc kubenswrapper[5184]: I0312 17:19:44.353739 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d7khg" Mar 12 17:19:44 crc kubenswrapper[5184]: I0312 17:19:44.518390 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn6p4\" (UniqueName: \"kubernetes.io/projected/6894383b-a7cd-44b4-9a4d-9993eeccc10b-kube-api-access-gn6p4\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hrp4d\" (UID: \"6894383b-a7cd-44b4-9a4d-9993eeccc10b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hrp4d" Mar 12 17:19:44 crc kubenswrapper[5184]: I0312 17:19:44.518901 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6894383b-a7cd-44b4-9a4d-9993eeccc10b-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hrp4d\" (UID: \"6894383b-a7cd-44b4-9a4d-9993eeccc10b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hrp4d" Mar 12 17:19:44 crc kubenswrapper[5184]: I0312 17:19:44.519015 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6894383b-a7cd-44b4-9a4d-9993eeccc10b-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hrp4d\" (UID: \"6894383b-a7cd-44b4-9a4d-9993eeccc10b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hrp4d" Mar 12 17:19:44 crc kubenswrapper[5184]: I0312 17:19:44.620740 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6894383b-a7cd-44b4-9a4d-9993eeccc10b-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hrp4d\" (UID: \"6894383b-a7cd-44b4-9a4d-9993eeccc10b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hrp4d" Mar 12 17:19:44 crc kubenswrapper[5184]: I0312 17:19:44.620805 5184 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6894383b-a7cd-44b4-9a4d-9993eeccc10b-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hrp4d\" (UID: \"6894383b-a7cd-44b4-9a4d-9993eeccc10b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hrp4d" Mar 12 17:19:44 crc kubenswrapper[5184]: I0312 17:19:44.620901 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gn6p4\" (UniqueName: \"kubernetes.io/projected/6894383b-a7cd-44b4-9a4d-9993eeccc10b-kube-api-access-gn6p4\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hrp4d\" (UID: \"6894383b-a7cd-44b4-9a4d-9993eeccc10b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hrp4d" Mar 12 17:19:44 crc kubenswrapper[5184]: I0312 17:19:44.627139 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6894383b-a7cd-44b4-9a4d-9993eeccc10b-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hrp4d\" (UID: \"6894383b-a7cd-44b4-9a4d-9993eeccc10b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hrp4d" Mar 12 17:19:44 crc kubenswrapper[5184]: I0312 17:19:44.628901 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6894383b-a7cd-44b4-9a4d-9993eeccc10b-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hrp4d\" (UID: \"6894383b-a7cd-44b4-9a4d-9993eeccc10b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hrp4d" Mar 12 17:19:44 crc kubenswrapper[5184]: I0312 17:19:44.648333 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn6p4\" (UniqueName: \"kubernetes.io/projected/6894383b-a7cd-44b4-9a4d-9993eeccc10b-kube-api-access-gn6p4\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hrp4d\" (UID: 
\"6894383b-a7cd-44b4-9a4d-9993eeccc10b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hrp4d" Mar 12 17:19:44 crc kubenswrapper[5184]: I0312 17:19:44.652700 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hrp4d" Mar 12 17:19:44 crc kubenswrapper[5184]: W0312 17:19:44.871887 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70334f03_e2cb_4554_af53_80962e9f00bb.slice/crio-cecd7262c2bfaba4c1ad4324f9153e22d51d5633275e209e092ccd39d76662ee WatchSource:0}: Error finding container cecd7262c2bfaba4c1ad4324f9153e22d51d5633275e209e092ccd39d76662ee: Status 404 returned error can't find the container with id cecd7262c2bfaba4c1ad4324f9153e22d51d5633275e209e092ccd39d76662ee Mar 12 17:19:44 crc kubenswrapper[5184]: I0312 17:19:44.877555 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d7khg"] Mar 12 17:19:45 crc kubenswrapper[5184]: I0312 17:19:45.203520 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hrp4d"] Mar 12 17:19:45 crc kubenswrapper[5184]: W0312 17:19:45.212618 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6894383b_a7cd_44b4_9a4d_9993eeccc10b.slice/crio-126ff94ea41af8156e87da174b14ef2458f896ed0914ba5cc6b8460a4da88260 WatchSource:0}: Error finding container 126ff94ea41af8156e87da174b14ef2458f896ed0914ba5cc6b8460a4da88260: Status 404 returned error can't find the container with id 126ff94ea41af8156e87da174b14ef2458f896ed0914ba5cc6b8460a4da88260 Mar 12 17:19:45 crc kubenswrapper[5184]: I0312 17:19:45.256190 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hrp4d" 
event={"ID":"6894383b-a7cd-44b4-9a4d-9993eeccc10b","Type":"ContainerStarted","Data":"126ff94ea41af8156e87da174b14ef2458f896ed0914ba5cc6b8460a4da88260"} Mar 12 17:19:45 crc kubenswrapper[5184]: I0312 17:19:45.259050 5184 generic.go:358] "Generic (PLEG): container finished" podID="70334f03-e2cb-4554-af53-80962e9f00bb" containerID="d49c520cb3b15bbe1c372b0dd2b082722dbe60505dd2ef57c55c95f1f1c9d3b1" exitCode=0 Mar 12 17:19:45 crc kubenswrapper[5184]: I0312 17:19:45.259114 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d7khg" event={"ID":"70334f03-e2cb-4554-af53-80962e9f00bb","Type":"ContainerDied","Data":"d49c520cb3b15bbe1c372b0dd2b082722dbe60505dd2ef57c55c95f1f1c9d3b1"} Mar 12 17:19:45 crc kubenswrapper[5184]: I0312 17:19:45.259168 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d7khg" event={"ID":"70334f03-e2cb-4554-af53-80962e9f00bb","Type":"ContainerStarted","Data":"cecd7262c2bfaba4c1ad4324f9153e22d51d5633275e209e092ccd39d76662ee"} Mar 12 17:19:46 crc kubenswrapper[5184]: I0312 17:19:46.274186 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hrp4d" event={"ID":"6894383b-a7cd-44b4-9a4d-9993eeccc10b","Type":"ContainerStarted","Data":"8fdf01bb628b1b97041555e6c7976c6159a5a93f267264f80aa0a55a02cf93ce"} Mar 12 17:19:46 crc kubenswrapper[5184]: I0312 17:19:46.303465 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hrp4d" podStartSLOduration=1.8514592159999999 podStartE2EDuration="2.303439366s" podCreationTimestamp="2026-03-12 17:19:44 +0000 UTC" firstStartedPulling="2026-03-12 17:19:45.215906885 +0000 UTC m=+1727.757218224" lastFinishedPulling="2026-03-12 17:19:45.667887035 +0000 UTC m=+1728.209198374" observedRunningTime="2026-03-12 17:19:46.289538674 +0000 UTC m=+1728.830850053" watchObservedRunningTime="2026-03-12 
17:19:46.303439366 +0000 UTC m=+1728.844750715" Mar 12 17:19:47 crc kubenswrapper[5184]: I0312 17:19:47.287477 5184 generic.go:358] "Generic (PLEG): container finished" podID="70334f03-e2cb-4554-af53-80962e9f00bb" containerID="cdb76f5adef2f2001879977979b6fbdbb794f3e112c7f722e8eb17087e80766e" exitCode=0 Mar 12 17:19:47 crc kubenswrapper[5184]: I0312 17:19:47.287566 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d7khg" event={"ID":"70334f03-e2cb-4554-af53-80962e9f00bb","Type":"ContainerDied","Data":"cdb76f5adef2f2001879977979b6fbdbb794f3e112c7f722e8eb17087e80766e"} Mar 12 17:19:48 crc kubenswrapper[5184]: I0312 17:19:48.301087 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d7khg" event={"ID":"70334f03-e2cb-4554-af53-80962e9f00bb","Type":"ContainerStarted","Data":"29ddc23119157b88ae1902b567432957651dffe285f6afcd8bb5b4865f3bf752"} Mar 12 17:19:48 crc kubenswrapper[5184]: I0312 17:19:48.323350 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-d7khg" podStartSLOduration=4.520901469 podStartE2EDuration="5.323328822s" podCreationTimestamp="2026-03-12 17:19:43 +0000 UTC" firstStartedPulling="2026-03-12 17:19:45.261490852 +0000 UTC m=+1727.802802231" lastFinishedPulling="2026-03-12 17:19:46.063918245 +0000 UTC m=+1728.605229584" observedRunningTime="2026-03-12 17:19:48.319595646 +0000 UTC m=+1730.860906995" watchObservedRunningTime="2026-03-12 17:19:48.323328822 +0000 UTC m=+1730.864640171" Mar 12 17:19:49 crc kubenswrapper[5184]: E0312 17:19:49.359557 5184 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1869113 actualBytes=10240 Mar 12 17:19:50 crc kubenswrapper[5184]: I0312 17:19:50.062887 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-hmwnz"] Mar 12 17:19:50 crc kubenswrapper[5184]: I0312 17:19:50.079506 
5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-152a-account-create-update-qpqrz"] Mar 12 17:19:50 crc kubenswrapper[5184]: I0312 17:19:50.089790 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-n4crd"] Mar 12 17:19:50 crc kubenswrapper[5184]: I0312 17:19:50.099239 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-mlrgv"] Mar 12 17:19:50 crc kubenswrapper[5184]: I0312 17:19:50.107856 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-7e8d-account-create-update-bqsxc"] Mar 12 17:19:50 crc kubenswrapper[5184]: I0312 17:19:50.114852 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-26d7-account-create-update-ms2gh"] Mar 12 17:19:50 crc kubenswrapper[5184]: I0312 17:19:50.121965 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-152a-account-create-update-qpqrz"] Mar 12 17:19:50 crc kubenswrapper[5184]: I0312 17:19:50.129985 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-hmwnz"] Mar 12 17:19:50 crc kubenswrapper[5184]: I0312 17:19:50.137451 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-mlrgv"] Mar 12 17:19:50 crc kubenswrapper[5184]: I0312 17:19:50.144310 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-26d7-account-create-update-ms2gh"] Mar 12 17:19:50 crc kubenswrapper[5184]: I0312 17:19:50.151484 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-7e8d-account-create-update-bqsxc"] Mar 12 17:19:50 crc kubenswrapper[5184]: I0312 17:19:50.158776 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-n4crd"] Mar 12 17:19:50 crc kubenswrapper[5184]: I0312 17:19:50.413152 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06c6acd8-4187-4ca0-ba38-0035df2f3d0c" 
path="/var/lib/kubelet/pods/06c6acd8-4187-4ca0-ba38-0035df2f3d0c/volumes" Mar 12 17:19:50 crc kubenswrapper[5184]: I0312 17:19:50.414324 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1200cd1d-62f0-4aeb-b1b7-bb4db488e3ef" path="/var/lib/kubelet/pods/1200cd1d-62f0-4aeb-b1b7-bb4db488e3ef/volumes" Mar 12 17:19:50 crc kubenswrapper[5184]: I0312 17:19:50.415164 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="504f9845-df1d-48a7-badf-ea8ed99ff8a5" path="/var/lib/kubelet/pods/504f9845-df1d-48a7-badf-ea8ed99ff8a5/volumes" Mar 12 17:19:50 crc kubenswrapper[5184]: I0312 17:19:50.416093 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="712022b6-f003-4d70-bb26-978c06c35480" path="/var/lib/kubelet/pods/712022b6-f003-4d70-bb26-978c06c35480/volumes" Mar 12 17:19:50 crc kubenswrapper[5184]: I0312 17:19:50.417508 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4884f23-d147-46ea-a562-1a772dbd1c21" path="/var/lib/kubelet/pods/a4884f23-d147-46ea-a562-1a772dbd1c21/volumes" Mar 12 17:19:50 crc kubenswrapper[5184]: I0312 17:19:50.418245 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4e00da5-efff-455c-b1a5-63ce04f03c55" path="/var/lib/kubelet/pods/c4e00da5-efff-455c-b1a5-63ce04f03c55/volumes" Mar 12 17:19:50 crc kubenswrapper[5184]: I0312 17:19:50.742096 5184 patch_prober.go:28] interesting pod/machine-config-daemon-cp7pt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 17:19:50 crc kubenswrapper[5184]: I0312 17:19:50.742458 5184 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 17:19:54 crc kubenswrapper[5184]: I0312 17:19:54.354600 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/community-operators-d7khg" Mar 12 17:19:54 crc kubenswrapper[5184]: I0312 17:19:54.355300 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-d7khg" Mar 12 17:19:54 crc kubenswrapper[5184]: I0312 17:19:54.413969 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d7khg" Mar 12 17:19:55 crc kubenswrapper[5184]: I0312 17:19:55.380190 5184 generic.go:358] "Generic (PLEG): container finished" podID="6894383b-a7cd-44b4-9a4d-9993eeccc10b" containerID="8fdf01bb628b1b97041555e6c7976c6159a5a93f267264f80aa0a55a02cf93ce" exitCode=0 Mar 12 17:19:55 crc kubenswrapper[5184]: I0312 17:19:55.380262 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hrp4d" event={"ID":"6894383b-a7cd-44b4-9a4d-9993eeccc10b","Type":"ContainerDied","Data":"8fdf01bb628b1b97041555e6c7976c6159a5a93f267264f80aa0a55a02cf93ce"} Mar 12 17:19:55 crc kubenswrapper[5184]: I0312 17:19:55.455746 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d7khg" Mar 12 17:19:55 crc kubenswrapper[5184]: I0312 17:19:55.502333 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d7khg"] Mar 12 17:19:56 crc kubenswrapper[5184]: I0312 17:19:56.870660 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hrp4d" Mar 12 17:19:57 crc kubenswrapper[5184]: I0312 17:19:57.013573 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6894383b-a7cd-44b4-9a4d-9993eeccc10b-ssh-key-openstack-edpm-ipam\") pod \"6894383b-a7cd-44b4-9a4d-9993eeccc10b\" (UID: \"6894383b-a7cd-44b4-9a4d-9993eeccc10b\") " Mar 12 17:19:57 crc kubenswrapper[5184]: I0312 17:19:57.013795 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6894383b-a7cd-44b4-9a4d-9993eeccc10b-inventory\") pod \"6894383b-a7cd-44b4-9a4d-9993eeccc10b\" (UID: \"6894383b-a7cd-44b4-9a4d-9993eeccc10b\") " Mar 12 17:19:57 crc kubenswrapper[5184]: I0312 17:19:57.013968 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gn6p4\" (UniqueName: \"kubernetes.io/projected/6894383b-a7cd-44b4-9a4d-9993eeccc10b-kube-api-access-gn6p4\") pod \"6894383b-a7cd-44b4-9a4d-9993eeccc10b\" (UID: \"6894383b-a7cd-44b4-9a4d-9993eeccc10b\") " Mar 12 17:19:57 crc kubenswrapper[5184]: I0312 17:19:57.023183 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6894383b-a7cd-44b4-9a4d-9993eeccc10b-kube-api-access-gn6p4" (OuterVolumeSpecName: "kube-api-access-gn6p4") pod "6894383b-a7cd-44b4-9a4d-9993eeccc10b" (UID: "6894383b-a7cd-44b4-9a4d-9993eeccc10b"). InnerVolumeSpecName "kube-api-access-gn6p4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:19:57 crc kubenswrapper[5184]: I0312 17:19:57.049231 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6894383b-a7cd-44b4-9a4d-9993eeccc10b-inventory" (OuterVolumeSpecName: "inventory") pod "6894383b-a7cd-44b4-9a4d-9993eeccc10b" (UID: "6894383b-a7cd-44b4-9a4d-9993eeccc10b"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:19:57 crc kubenswrapper[5184]: I0312 17:19:57.054330 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6894383b-a7cd-44b4-9a4d-9993eeccc10b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6894383b-a7cd-44b4-9a4d-9993eeccc10b" (UID: "6894383b-a7cd-44b4-9a4d-9993eeccc10b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:19:57 crc kubenswrapper[5184]: I0312 17:19:57.116421 5184 reconciler_common.go:299] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6894383b-a7cd-44b4-9a4d-9993eeccc10b-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 17:19:57 crc kubenswrapper[5184]: I0312 17:19:57.116464 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gn6p4\" (UniqueName: \"kubernetes.io/projected/6894383b-a7cd-44b4-9a4d-9993eeccc10b-kube-api-access-gn6p4\") on node \"crc\" DevicePath \"\"" Mar 12 17:19:57 crc kubenswrapper[5184]: I0312 17:19:57.116478 5184 reconciler_common.go:299] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6894383b-a7cd-44b4-9a4d-9993eeccc10b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 17:19:57 crc kubenswrapper[5184]: I0312 17:19:57.404544 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hrp4d" Mar 12 17:19:57 crc kubenswrapper[5184]: I0312 17:19:57.404573 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hrp4d" event={"ID":"6894383b-a7cd-44b4-9a4d-9993eeccc10b","Type":"ContainerDied","Data":"126ff94ea41af8156e87da174b14ef2458f896ed0914ba5cc6b8460a4da88260"} Mar 12 17:19:57 crc kubenswrapper[5184]: I0312 17:19:57.405082 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="126ff94ea41af8156e87da174b14ef2458f896ed0914ba5cc6b8460a4da88260" Mar 12 17:19:57 crc kubenswrapper[5184]: I0312 17:19:57.405301 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/community-operators-d7khg" podUID="70334f03-e2cb-4554-af53-80962e9f00bb" containerName="registry-server" containerID="cri-o://29ddc23119157b88ae1902b567432957651dffe285f6afcd8bb5b4865f3bf752" gracePeriod=2 Mar 12 17:19:57 crc kubenswrapper[5184]: I0312 17:19:57.762898 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d7khg" Mar 12 17:19:57 crc kubenswrapper[5184]: I0312 17:19:57.828442 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mjzb\" (UniqueName: \"kubernetes.io/projected/70334f03-e2cb-4554-af53-80962e9f00bb-kube-api-access-4mjzb\") pod \"70334f03-e2cb-4554-af53-80962e9f00bb\" (UID: \"70334f03-e2cb-4554-af53-80962e9f00bb\") " Mar 12 17:19:57 crc kubenswrapper[5184]: I0312 17:19:57.828734 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70334f03-e2cb-4554-af53-80962e9f00bb-utilities\") pod \"70334f03-e2cb-4554-af53-80962e9f00bb\" (UID: \"70334f03-e2cb-4554-af53-80962e9f00bb\") " Mar 12 17:19:57 crc kubenswrapper[5184]: I0312 17:19:57.828816 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70334f03-e2cb-4554-af53-80962e9f00bb-catalog-content\") pod \"70334f03-e2cb-4554-af53-80962e9f00bb\" (UID: \"70334f03-e2cb-4554-af53-80962e9f00bb\") " Mar 12 17:19:57 crc kubenswrapper[5184]: I0312 17:19:57.830188 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70334f03-e2cb-4554-af53-80962e9f00bb-utilities" (OuterVolumeSpecName: "utilities") pod "70334f03-e2cb-4554-af53-80962e9f00bb" (UID: "70334f03-e2cb-4554-af53-80962e9f00bb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:19:57 crc kubenswrapper[5184]: I0312 17:19:57.837582 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70334f03-e2cb-4554-af53-80962e9f00bb-kube-api-access-4mjzb" (OuterVolumeSpecName: "kube-api-access-4mjzb") pod "70334f03-e2cb-4554-af53-80962e9f00bb" (UID: "70334f03-e2cb-4554-af53-80962e9f00bb"). InnerVolumeSpecName "kube-api-access-4mjzb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:19:57 crc kubenswrapper[5184]: I0312 17:19:57.877883 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70334f03-e2cb-4554-af53-80962e9f00bb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "70334f03-e2cb-4554-af53-80962e9f00bb" (UID: "70334f03-e2cb-4554-af53-80962e9f00bb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:19:57 crc kubenswrapper[5184]: I0312 17:19:57.931325 5184 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70334f03-e2cb-4554-af53-80962e9f00bb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 17:19:57 crc kubenswrapper[5184]: I0312 17:19:57.931457 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4mjzb\" (UniqueName: \"kubernetes.io/projected/70334f03-e2cb-4554-af53-80962e9f00bb-kube-api-access-4mjzb\") on node \"crc\" DevicePath \"\"" Mar 12 17:19:57 crc kubenswrapper[5184]: I0312 17:19:57.931492 5184 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70334f03-e2cb-4554-af53-80962e9f00bb-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 17:19:58 crc kubenswrapper[5184]: I0312 17:19:58.440318 5184 generic.go:358] "Generic (PLEG): container finished" podID="70334f03-e2cb-4554-af53-80962e9f00bb" containerID="29ddc23119157b88ae1902b567432957651dffe285f6afcd8bb5b4865f3bf752" exitCode=0 Mar 12 17:19:58 crc kubenswrapper[5184]: I0312 17:19:58.440415 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d7khg" event={"ID":"70334f03-e2cb-4554-af53-80962e9f00bb","Type":"ContainerDied","Data":"29ddc23119157b88ae1902b567432957651dffe285f6afcd8bb5b4865f3bf752"} Mar 12 17:19:58 crc kubenswrapper[5184]: I0312 17:19:58.440542 5184 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-d7khg" event={"ID":"70334f03-e2cb-4554-af53-80962e9f00bb","Type":"ContainerDied","Data":"cecd7262c2bfaba4c1ad4324f9153e22d51d5633275e209e092ccd39d76662ee"} Mar 12 17:19:58 crc kubenswrapper[5184]: I0312 17:19:58.440583 5184 scope.go:117] "RemoveContainer" containerID="29ddc23119157b88ae1902b567432957651dffe285f6afcd8bb5b4865f3bf752" Mar 12 17:19:58 crc kubenswrapper[5184]: I0312 17:19:58.441556 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d7khg" Mar 12 17:19:58 crc kubenswrapper[5184]: I0312 17:19:58.497734 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d7khg"] Mar 12 17:19:58 crc kubenswrapper[5184]: I0312 17:19:58.498424 5184 scope.go:117] "RemoveContainer" containerID="cdb76f5adef2f2001879977979b6fbdbb794f3e112c7f722e8eb17087e80766e" Mar 12 17:19:58 crc kubenswrapper[5184]: I0312 17:19:58.509005 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-d7khg"] Mar 12 17:19:58 crc kubenswrapper[5184]: I0312 17:19:58.521226 5184 scope.go:117] "RemoveContainer" containerID="d49c520cb3b15bbe1c372b0dd2b082722dbe60505dd2ef57c55c95f1f1c9d3b1" Mar 12 17:19:58 crc kubenswrapper[5184]: I0312 17:19:58.562555 5184 scope.go:117] "RemoveContainer" containerID="29ddc23119157b88ae1902b567432957651dffe285f6afcd8bb5b4865f3bf752" Mar 12 17:19:58 crc kubenswrapper[5184]: E0312 17:19:58.562909 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29ddc23119157b88ae1902b567432957651dffe285f6afcd8bb5b4865f3bf752\": container with ID starting with 29ddc23119157b88ae1902b567432957651dffe285f6afcd8bb5b4865f3bf752 not found: ID does not exist" containerID="29ddc23119157b88ae1902b567432957651dffe285f6afcd8bb5b4865f3bf752" Mar 12 17:19:58 crc kubenswrapper[5184]: I0312 
17:19:58.562934 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29ddc23119157b88ae1902b567432957651dffe285f6afcd8bb5b4865f3bf752"} err="failed to get container status \"29ddc23119157b88ae1902b567432957651dffe285f6afcd8bb5b4865f3bf752\": rpc error: code = NotFound desc = could not find container \"29ddc23119157b88ae1902b567432957651dffe285f6afcd8bb5b4865f3bf752\": container with ID starting with 29ddc23119157b88ae1902b567432957651dffe285f6afcd8bb5b4865f3bf752 not found: ID does not exist" Mar 12 17:19:58 crc kubenswrapper[5184]: I0312 17:19:58.562951 5184 scope.go:117] "RemoveContainer" containerID="cdb76f5adef2f2001879977979b6fbdbb794f3e112c7f722e8eb17087e80766e" Mar 12 17:19:58 crc kubenswrapper[5184]: E0312 17:19:58.563127 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdb76f5adef2f2001879977979b6fbdbb794f3e112c7f722e8eb17087e80766e\": container with ID starting with cdb76f5adef2f2001879977979b6fbdbb794f3e112c7f722e8eb17087e80766e not found: ID does not exist" containerID="cdb76f5adef2f2001879977979b6fbdbb794f3e112c7f722e8eb17087e80766e" Mar 12 17:19:58 crc kubenswrapper[5184]: I0312 17:19:58.563147 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdb76f5adef2f2001879977979b6fbdbb794f3e112c7f722e8eb17087e80766e"} err="failed to get container status \"cdb76f5adef2f2001879977979b6fbdbb794f3e112c7f722e8eb17087e80766e\": rpc error: code = NotFound desc = could not find container \"cdb76f5adef2f2001879977979b6fbdbb794f3e112c7f722e8eb17087e80766e\": container with ID starting with cdb76f5adef2f2001879977979b6fbdbb794f3e112c7f722e8eb17087e80766e not found: ID does not exist" Mar 12 17:19:58 crc kubenswrapper[5184]: I0312 17:19:58.563160 5184 scope.go:117] "RemoveContainer" containerID="d49c520cb3b15bbe1c372b0dd2b082722dbe60505dd2ef57c55c95f1f1c9d3b1" Mar 12 17:19:58 crc 
kubenswrapper[5184]: E0312 17:19:58.563315 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d49c520cb3b15bbe1c372b0dd2b082722dbe60505dd2ef57c55c95f1f1c9d3b1\": container with ID starting with d49c520cb3b15bbe1c372b0dd2b082722dbe60505dd2ef57c55c95f1f1c9d3b1 not found: ID does not exist" containerID="d49c520cb3b15bbe1c372b0dd2b082722dbe60505dd2ef57c55c95f1f1c9d3b1" Mar 12 17:19:58 crc kubenswrapper[5184]: I0312 17:19:58.563335 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d49c520cb3b15bbe1c372b0dd2b082722dbe60505dd2ef57c55c95f1f1c9d3b1"} err="failed to get container status \"d49c520cb3b15bbe1c372b0dd2b082722dbe60505dd2ef57c55c95f1f1c9d3b1\": rpc error: code = NotFound desc = could not find container \"d49c520cb3b15bbe1c372b0dd2b082722dbe60505dd2ef57c55c95f1f1c9d3b1\": container with ID starting with d49c520cb3b15bbe1c372b0dd2b082722dbe60505dd2ef57c55c95f1f1c9d3b1 not found: ID does not exist" Mar 12 17:20:00 crc kubenswrapper[5184]: I0312 17:20:00.130598 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555600-tct96"] Mar 12 17:20:00 crc kubenswrapper[5184]: I0312 17:20:00.131744 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="70334f03-e2cb-4554-af53-80962e9f00bb" containerName="extract-utilities" Mar 12 17:20:00 crc kubenswrapper[5184]: I0312 17:20:00.131757 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="70334f03-e2cb-4554-af53-80962e9f00bb" containerName="extract-utilities" Mar 12 17:20:00 crc kubenswrapper[5184]: I0312 17:20:00.131766 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="70334f03-e2cb-4554-af53-80962e9f00bb" containerName="extract-content" Mar 12 17:20:00 crc kubenswrapper[5184]: I0312 17:20:00.131772 5184 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="70334f03-e2cb-4554-af53-80962e9f00bb" containerName="extract-content" Mar 12 17:20:00 crc kubenswrapper[5184]: I0312 17:20:00.131796 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6894383b-a7cd-44b4-9a4d-9993eeccc10b" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 12 17:20:00 crc kubenswrapper[5184]: I0312 17:20:00.131802 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="6894383b-a7cd-44b4-9a4d-9993eeccc10b" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 12 17:20:00 crc kubenswrapper[5184]: I0312 17:20:00.131812 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="70334f03-e2cb-4554-af53-80962e9f00bb" containerName="registry-server" Mar 12 17:20:00 crc kubenswrapper[5184]: I0312 17:20:00.131818 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="70334f03-e2cb-4554-af53-80962e9f00bb" containerName="registry-server" Mar 12 17:20:00 crc kubenswrapper[5184]: I0312 17:20:00.132050 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="70334f03-e2cb-4554-af53-80962e9f00bb" containerName="registry-server" Mar 12 17:20:00 crc kubenswrapper[5184]: I0312 17:20:00.132073 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="6894383b-a7cd-44b4-9a4d-9993eeccc10b" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 12 17:20:00 crc kubenswrapper[5184]: I0312 17:20:00.138582 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555600-tct96" Mar 12 17:20:00 crc kubenswrapper[5184]: I0312 17:20:00.142290 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-f4gpz\"" Mar 12 17:20:00 crc kubenswrapper[5184]: I0312 17:20:00.142722 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\"" Mar 12 17:20:00 crc kubenswrapper[5184]: I0312 17:20:00.143927 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\"" Mar 12 17:20:00 crc kubenswrapper[5184]: I0312 17:20:00.158090 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555600-tct96"] Mar 12 17:20:00 crc kubenswrapper[5184]: I0312 17:20:00.179803 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr2hm\" (UniqueName: \"kubernetes.io/projected/811de9fa-562c-4bd2-83bd-70d15937198b-kube-api-access-kr2hm\") pod \"auto-csr-approver-29555600-tct96\" (UID: \"811de9fa-562c-4bd2-83bd-70d15937198b\") " pod="openshift-infra/auto-csr-approver-29555600-tct96" Mar 12 17:20:00 crc kubenswrapper[5184]: I0312 17:20:00.283284 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kr2hm\" (UniqueName: \"kubernetes.io/projected/811de9fa-562c-4bd2-83bd-70d15937198b-kube-api-access-kr2hm\") pod \"auto-csr-approver-29555600-tct96\" (UID: \"811de9fa-562c-4bd2-83bd-70d15937198b\") " pod="openshift-infra/auto-csr-approver-29555600-tct96" Mar 12 17:20:00 crc kubenswrapper[5184]: I0312 17:20:00.311791 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr2hm\" (UniqueName: \"kubernetes.io/projected/811de9fa-562c-4bd2-83bd-70d15937198b-kube-api-access-kr2hm\") pod \"auto-csr-approver-29555600-tct96\" (UID: 
\"811de9fa-562c-4bd2-83bd-70d15937198b\") " pod="openshift-infra/auto-csr-approver-29555600-tct96" Mar 12 17:20:00 crc kubenswrapper[5184]: I0312 17:20:00.409853 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70334f03-e2cb-4554-af53-80962e9f00bb" path="/var/lib/kubelet/pods/70334f03-e2cb-4554-af53-80962e9f00bb/volumes" Mar 12 17:20:00 crc kubenswrapper[5184]: I0312 17:20:00.456753 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555600-tct96" Mar 12 17:20:00 crc kubenswrapper[5184]: I0312 17:20:00.968483 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555600-tct96"] Mar 12 17:20:01 crc kubenswrapper[5184]: I0312 17:20:01.475473 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555600-tct96" event={"ID":"811de9fa-562c-4bd2-83bd-70d15937198b","Type":"ContainerStarted","Data":"90c4f28a41b10bb7038630134f4cff27d95f3c85249974160cadf5e306297bfc"} Mar 12 17:20:03 crc kubenswrapper[5184]: I0312 17:20:03.493581 5184 generic.go:358] "Generic (PLEG): container finished" podID="811de9fa-562c-4bd2-83bd-70d15937198b" containerID="d623a21c615df6f0df347c0edea54904aa5387f635eb051e05efe1885f923cf4" exitCode=0 Mar 12 17:20:03 crc kubenswrapper[5184]: I0312 17:20:03.493657 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555600-tct96" event={"ID":"811de9fa-562c-4bd2-83bd-70d15937198b","Type":"ContainerDied","Data":"d623a21c615df6f0df347c0edea54904aa5387f635eb051e05efe1885f923cf4"} Mar 12 17:20:04 crc kubenswrapper[5184]: I0312 17:20:04.913744 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555600-tct96" Mar 12 17:20:05 crc kubenswrapper[5184]: I0312 17:20:05.038830 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kr2hm\" (UniqueName: \"kubernetes.io/projected/811de9fa-562c-4bd2-83bd-70d15937198b-kube-api-access-kr2hm\") pod \"811de9fa-562c-4bd2-83bd-70d15937198b\" (UID: \"811de9fa-562c-4bd2-83bd-70d15937198b\") " Mar 12 17:20:05 crc kubenswrapper[5184]: I0312 17:20:05.048157 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/811de9fa-562c-4bd2-83bd-70d15937198b-kube-api-access-kr2hm" (OuterVolumeSpecName: "kube-api-access-kr2hm") pod "811de9fa-562c-4bd2-83bd-70d15937198b" (UID: "811de9fa-562c-4bd2-83bd-70d15937198b"). InnerVolumeSpecName "kube-api-access-kr2hm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:20:05 crc kubenswrapper[5184]: I0312 17:20:05.143764 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kr2hm\" (UniqueName: \"kubernetes.io/projected/811de9fa-562c-4bd2-83bd-70d15937198b-kube-api-access-kr2hm\") on node \"crc\" DevicePath \"\"" Mar 12 17:20:05 crc kubenswrapper[5184]: I0312 17:20:05.519417 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555600-tct96" event={"ID":"811de9fa-562c-4bd2-83bd-70d15937198b","Type":"ContainerDied","Data":"90c4f28a41b10bb7038630134f4cff27d95f3c85249974160cadf5e306297bfc"} Mar 12 17:20:05 crc kubenswrapper[5184]: I0312 17:20:05.519786 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90c4f28a41b10bb7038630134f4cff27d95f3c85249974160cadf5e306297bfc" Mar 12 17:20:05 crc kubenswrapper[5184]: I0312 17:20:05.519484 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555600-tct96" Mar 12 17:20:06 crc kubenswrapper[5184]: I0312 17:20:06.009059 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555594-6csj4"] Mar 12 17:20:06 crc kubenswrapper[5184]: I0312 17:20:06.022494 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555594-6csj4"] Mar 12 17:20:06 crc kubenswrapper[5184]: I0312 17:20:06.412768 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02d00a19-eeb1-4bc6-a42c-b2bd721cdf6f" path="/var/lib/kubelet/pods/02d00a19-eeb1-4bc6-a42c-b2bd721cdf6f/volumes" Mar 12 17:20:19 crc kubenswrapper[5184]: I0312 17:20:19.945504 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-t9x9b/must-gather-brf99"] Mar 12 17:20:19 crc kubenswrapper[5184]: I0312 17:20:19.947551 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="811de9fa-562c-4bd2-83bd-70d15937198b" containerName="oc" Mar 12 17:20:19 crc kubenswrapper[5184]: I0312 17:20:19.947575 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="811de9fa-562c-4bd2-83bd-70d15937198b" containerName="oc" Mar 12 17:20:19 crc kubenswrapper[5184]: I0312 17:20:19.947802 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="811de9fa-562c-4bd2-83bd-70d15937198b" containerName="oc" Mar 12 17:20:19 crc kubenswrapper[5184]: I0312 17:20:19.965496 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t9x9b/must-gather-brf99" Mar 12 17:20:19 crc kubenswrapper[5184]: I0312 17:20:19.969184 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-t9x9b/must-gather-brf99"] Mar 12 17:20:19 crc kubenswrapper[5184]: I0312 17:20:19.969661 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-t9x9b\"/\"kube-root-ca.crt\"" Mar 12 17:20:19 crc kubenswrapper[5184]: I0312 17:20:19.970234 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-t9x9b\"/\"openshift-service-ca.crt\"" Mar 12 17:20:19 crc kubenswrapper[5184]: I0312 17:20:19.970484 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-t9x9b\"/\"default-dockercfg-k2t5f\"" Mar 12 17:20:20 crc kubenswrapper[5184]: I0312 17:20:20.033172 5184 scope.go:117] "RemoveContainer" containerID="230d9d524a767785af5ad2d095b8dff4b409d7d0584498d2fadf7203b751c2d9" Mar 12 17:20:20 crc kubenswrapper[5184]: I0312 17:20:20.063398 5184 scope.go:117] "RemoveContainer" containerID="0b3952293217e4718a41a9999e52a1e27c9d2143a6b3ebc866a5030a19190c7a" Mar 12 17:20:20 crc kubenswrapper[5184]: I0312 17:20:20.103128 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5-must-gather-output\") pod \"must-gather-brf99\" (UID: \"2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5\") " pod="openshift-must-gather-t9x9b/must-gather-brf99" Mar 12 17:20:20 crc kubenswrapper[5184]: I0312 17:20:20.103267 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfz4h\" (UniqueName: \"kubernetes.io/projected/2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5-kube-api-access-xfz4h\") pod \"must-gather-brf99\" (UID: \"2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5\") " 
pod="openshift-must-gather-t9x9b/must-gather-brf99" Mar 12 17:20:20 crc kubenswrapper[5184]: I0312 17:20:20.166959 5184 scope.go:117] "RemoveContainer" containerID="c73147534990a5f5924559fb3d15c0624e0be82b7c3fa440913e8452f0cabac3" Mar 12 17:20:20 crc kubenswrapper[5184]: I0312 17:20:20.187331 5184 scope.go:117] "RemoveContainer" containerID="60dd861274f121fd40ead82085684d0b9356e4bf2b7d30cf355b408cfb655b9b" Mar 12 17:20:20 crc kubenswrapper[5184]: I0312 17:20:20.205032 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xfz4h\" (UniqueName: \"kubernetes.io/projected/2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5-kube-api-access-xfz4h\") pod \"must-gather-brf99\" (UID: \"2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5\") " pod="openshift-must-gather-t9x9b/must-gather-brf99" Mar 12 17:20:20 crc kubenswrapper[5184]: I0312 17:20:20.205129 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5-must-gather-output\") pod \"must-gather-brf99\" (UID: \"2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5\") " pod="openshift-must-gather-t9x9b/must-gather-brf99" Mar 12 17:20:20 crc kubenswrapper[5184]: I0312 17:20:20.205561 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5-must-gather-output\") pod \"must-gather-brf99\" (UID: \"2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5\") " pod="openshift-must-gather-t9x9b/must-gather-brf99" Mar 12 17:20:20 crc kubenswrapper[5184]: I0312 17:20:20.226214 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfz4h\" (UniqueName: \"kubernetes.io/projected/2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5-kube-api-access-xfz4h\") pod \"must-gather-brf99\" (UID: \"2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5\") " pod="openshift-must-gather-t9x9b/must-gather-brf99" Mar 12 17:20:20 crc 
kubenswrapper[5184]: I0312 17:20:20.232168 5184 scope.go:117] "RemoveContainer" containerID="7a5d9d7da184cc41d907b47d6a4ff6d494e923c3052137edfdca564769ae4879" Mar 12 17:20:20 crc kubenswrapper[5184]: I0312 17:20:20.292894 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t9x9b/must-gather-brf99" Mar 12 17:20:20 crc kubenswrapper[5184]: I0312 17:20:20.320007 5184 scope.go:117] "RemoveContainer" containerID="1d6d4a170aec1af79b1e21bb010771337c81d445023e4689a46e58b41e30d707" Mar 12 17:20:20 crc kubenswrapper[5184]: I0312 17:20:20.350264 5184 scope.go:117] "RemoveContainer" containerID="e688ebc45cb33ef5815d84d4e5465b82de772dcd29933526acfdf86706fc333e" Mar 12 17:20:20 crc kubenswrapper[5184]: I0312 17:20:20.742056 5184 patch_prober.go:28] interesting pod/machine-config-daemon-cp7pt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 17:20:20 crc kubenswrapper[5184]: I0312 17:20:20.742567 5184 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 17:20:20 crc kubenswrapper[5184]: I0312 17:20:20.742647 5184 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" Mar 12 17:20:20 crc kubenswrapper[5184]: I0312 17:20:20.743731 5184 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"003716e1434d36a7e89bad17d4bfd64463f69f9907a5c9319c56e5b94d17d924"} 
pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 17:20:20 crc kubenswrapper[5184]: I0312 17:20:20.743849 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerName="machine-config-daemon" containerID="cri-o://003716e1434d36a7e89bad17d4bfd64463f69f9907a5c9319c56e5b94d17d924" gracePeriod=600 Mar 12 17:20:20 crc kubenswrapper[5184]: I0312 17:20:20.772415 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-t9x9b/must-gather-brf99"] Mar 12 17:20:20 crc kubenswrapper[5184]: W0312 17:20:20.786323 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b8dacc7_9e4a_4fd4_98f7_72a7c5ad78d5.slice/crio-1bc22adadcdd32456b9726ff7951c92122bea5d8c2ca1e4d354723a1b9dc617a WatchSource:0}: Error finding container 1bc22adadcdd32456b9726ff7951c92122bea5d8c2ca1e4d354723a1b9dc617a: Status 404 returned error can't find the container with id 1bc22adadcdd32456b9726ff7951c92122bea5d8c2ca1e4d354723a1b9dc617a Mar 12 17:20:20 crc kubenswrapper[5184]: E0312 17:20:20.870860 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp7pt_openshift-machine-config-operator(7b45c859-3d05-4214-9bd3-2952546f5dea)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" Mar 12 17:20:21 crc kubenswrapper[5184]: I0312 17:20:21.713986 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t9x9b/must-gather-brf99" 
event={"ID":"2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5","Type":"ContainerStarted","Data":"1bc22adadcdd32456b9726ff7951c92122bea5d8c2ca1e4d354723a1b9dc617a"} Mar 12 17:20:21 crc kubenswrapper[5184]: I0312 17:20:21.716784 5184 generic.go:358] "Generic (PLEG): container finished" podID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerID="003716e1434d36a7e89bad17d4bfd64463f69f9907a5c9319c56e5b94d17d924" exitCode=0 Mar 12 17:20:21 crc kubenswrapper[5184]: I0312 17:20:21.716858 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" event={"ID":"7b45c859-3d05-4214-9bd3-2952546f5dea","Type":"ContainerDied","Data":"003716e1434d36a7e89bad17d4bfd64463f69f9907a5c9319c56e5b94d17d924"} Mar 12 17:20:21 crc kubenswrapper[5184]: I0312 17:20:21.716906 5184 scope.go:117] "RemoveContainer" containerID="7539a33836cf02bc296a008cedc3ee58f1ef87b38ca5b5f9414731708b87618f" Mar 12 17:20:21 crc kubenswrapper[5184]: I0312 17:20:21.717530 5184 scope.go:117] "RemoveContainer" containerID="003716e1434d36a7e89bad17d4bfd64463f69f9907a5c9319c56e5b94d17d924" Mar 12 17:20:21 crc kubenswrapper[5184]: E0312 17:20:21.717901 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp7pt_openshift-machine-config-operator(7b45c859-3d05-4214-9bd3-2952546f5dea)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" Mar 12 17:20:25 crc kubenswrapper[5184]: I0312 17:20:25.946821 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sdmlv"] Mar 12 17:20:25 crc kubenswrapper[5184]: I0312 17:20:25.957258 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sdmlv"] Mar 12 17:20:25 crc kubenswrapper[5184]: I0312 17:20:25.957416 
5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sdmlv" Mar 12 17:20:26 crc kubenswrapper[5184]: I0312 17:20:26.120069 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49154c9c-ad18-415b-af29-690a06ef7fdb-catalog-content\") pod \"certified-operators-sdmlv\" (UID: \"49154c9c-ad18-415b-af29-690a06ef7fdb\") " pod="openshift-marketplace/certified-operators-sdmlv" Mar 12 17:20:26 crc kubenswrapper[5184]: I0312 17:20:26.120189 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bphhz\" (UniqueName: \"kubernetes.io/projected/49154c9c-ad18-415b-af29-690a06ef7fdb-kube-api-access-bphhz\") pod \"certified-operators-sdmlv\" (UID: \"49154c9c-ad18-415b-af29-690a06ef7fdb\") " pod="openshift-marketplace/certified-operators-sdmlv" Mar 12 17:20:26 crc kubenswrapper[5184]: I0312 17:20:26.120250 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49154c9c-ad18-415b-af29-690a06ef7fdb-utilities\") pod \"certified-operators-sdmlv\" (UID: \"49154c9c-ad18-415b-af29-690a06ef7fdb\") " pod="openshift-marketplace/certified-operators-sdmlv" Mar 12 17:20:26 crc kubenswrapper[5184]: I0312 17:20:26.222115 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49154c9c-ad18-415b-af29-690a06ef7fdb-catalog-content\") pod \"certified-operators-sdmlv\" (UID: \"49154c9c-ad18-415b-af29-690a06ef7fdb\") " pod="openshift-marketplace/certified-operators-sdmlv" Mar 12 17:20:26 crc kubenswrapper[5184]: I0312 17:20:26.222207 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bphhz\" (UniqueName: 
\"kubernetes.io/projected/49154c9c-ad18-415b-af29-690a06ef7fdb-kube-api-access-bphhz\") pod \"certified-operators-sdmlv\" (UID: \"49154c9c-ad18-415b-af29-690a06ef7fdb\") " pod="openshift-marketplace/certified-operators-sdmlv" Mar 12 17:20:26 crc kubenswrapper[5184]: I0312 17:20:26.222227 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49154c9c-ad18-415b-af29-690a06ef7fdb-utilities\") pod \"certified-operators-sdmlv\" (UID: \"49154c9c-ad18-415b-af29-690a06ef7fdb\") " pod="openshift-marketplace/certified-operators-sdmlv" Mar 12 17:20:26 crc kubenswrapper[5184]: I0312 17:20:26.222770 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49154c9c-ad18-415b-af29-690a06ef7fdb-utilities\") pod \"certified-operators-sdmlv\" (UID: \"49154c9c-ad18-415b-af29-690a06ef7fdb\") " pod="openshift-marketplace/certified-operators-sdmlv" Mar 12 17:20:26 crc kubenswrapper[5184]: I0312 17:20:26.222981 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49154c9c-ad18-415b-af29-690a06ef7fdb-catalog-content\") pod \"certified-operators-sdmlv\" (UID: \"49154c9c-ad18-415b-af29-690a06ef7fdb\") " pod="openshift-marketplace/certified-operators-sdmlv" Mar 12 17:20:26 crc kubenswrapper[5184]: I0312 17:20:26.242867 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bphhz\" (UniqueName: \"kubernetes.io/projected/49154c9c-ad18-415b-af29-690a06ef7fdb-kube-api-access-bphhz\") pod \"certified-operators-sdmlv\" (UID: \"49154c9c-ad18-415b-af29-690a06ef7fdb\") " pod="openshift-marketplace/certified-operators-sdmlv" Mar 12 17:20:26 crc kubenswrapper[5184]: I0312 17:20:26.286961 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sdmlv" Mar 12 17:20:27 crc kubenswrapper[5184]: I0312 17:20:27.067635 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sdmlv"] Mar 12 17:20:27 crc kubenswrapper[5184]: I0312 17:20:27.794778 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t9x9b/must-gather-brf99" event={"ID":"2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5","Type":"ContainerStarted","Data":"766a76363474788e69eb48b04a9a3c9ad0e62300ec9fbb8de4365c8633c7b31f"} Mar 12 17:20:27 crc kubenswrapper[5184]: I0312 17:20:27.795536 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t9x9b/must-gather-brf99" event={"ID":"2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5","Type":"ContainerStarted","Data":"e1f3f8d06e8ce1cb196e7bb209c42ca8445b6c107295754b1eedddb094ce9288"} Mar 12 17:20:27 crc kubenswrapper[5184]: I0312 17:20:27.800812 5184 generic.go:358] "Generic (PLEG): container finished" podID="49154c9c-ad18-415b-af29-690a06ef7fdb" containerID="d4cb0856ef823e1e6037b52bff58b452ff8a9cdc4425e2356b0e95bac300e494" exitCode=0 Mar 12 17:20:27 crc kubenswrapper[5184]: I0312 17:20:27.801038 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sdmlv" event={"ID":"49154c9c-ad18-415b-af29-690a06ef7fdb","Type":"ContainerDied","Data":"d4cb0856ef823e1e6037b52bff58b452ff8a9cdc4425e2356b0e95bac300e494"} Mar 12 17:20:27 crc kubenswrapper[5184]: I0312 17:20:27.801083 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sdmlv" event={"ID":"49154c9c-ad18-415b-af29-690a06ef7fdb","Type":"ContainerStarted","Data":"05ef0f138583c36f3e0ce40bae46c2f91ea723f94e829d798c68969f85b425dd"} Mar 12 17:20:27 crc kubenswrapper[5184]: I0312 17:20:27.816289 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-t9x9b/must-gather-brf99" 
podStartSLOduration=2.900823043 podStartE2EDuration="8.816270444s" podCreationTimestamp="2026-03-12 17:20:19 +0000 UTC" firstStartedPulling="2026-03-12 17:20:20.78974596 +0000 UTC m=+1763.331057339" lastFinishedPulling="2026-03-12 17:20:26.705193401 +0000 UTC m=+1769.246504740" observedRunningTime="2026-03-12 17:20:27.815366026 +0000 UTC m=+1770.356677375" watchObservedRunningTime="2026-03-12 17:20:27.816270444 +0000 UTC m=+1770.357581793" Mar 12 17:20:28 crc kubenswrapper[5184]: I0312 17:20:28.060366 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-n22b8"] Mar 12 17:20:28 crc kubenswrapper[5184]: I0312 17:20:28.078835 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-n22b8"] Mar 12 17:20:28 crc kubenswrapper[5184]: I0312 17:20:28.414518 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a7a82d0-151a-40b3-86b4-79aff3a3b0be" path="/var/lib/kubelet/pods/2a7a82d0-151a-40b3-86b4-79aff3a3b0be/volumes" Mar 12 17:20:29 crc kubenswrapper[5184]: I0312 17:20:29.827317 5184 generic.go:358] "Generic (PLEG): container finished" podID="49154c9c-ad18-415b-af29-690a06ef7fdb" containerID="5c04a57161ff813a96e1e8c38e9575adcfd1ca868959b07a607a8ecfcb3349c1" exitCode=0 Mar 12 17:20:29 crc kubenswrapper[5184]: I0312 17:20:29.827430 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sdmlv" event={"ID":"49154c9c-ad18-415b-af29-690a06ef7fdb","Type":"ContainerDied","Data":"5c04a57161ff813a96e1e8c38e9575adcfd1ca868959b07a607a8ecfcb3349c1"} Mar 12 17:20:30 crc kubenswrapper[5184]: I0312 17:20:30.731392 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-t9x9b/crc-debug-m9l2f"] Mar 12 17:20:30 crc kubenswrapper[5184]: I0312 17:20:30.872975 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sdmlv" 
event={"ID":"49154c9c-ad18-415b-af29-690a06ef7fdb","Type":"ContainerStarted","Data":"0ffad3bd26acad10481405162995acae4575f4090b02ede5c7b6b799c8a229fd"} Mar 12 17:20:30 crc kubenswrapper[5184]: I0312 17:20:30.873096 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t9x9b/crc-debug-m9l2f" Mar 12 17:20:30 crc kubenswrapper[5184]: I0312 17:20:30.903304 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sdmlv" podStartSLOduration=4.967728292 podStartE2EDuration="5.903280206s" podCreationTimestamp="2026-03-12 17:20:25 +0000 UTC" firstStartedPulling="2026-03-12 17:20:27.802462125 +0000 UTC m=+1770.343773484" lastFinishedPulling="2026-03-12 17:20:28.738014049 +0000 UTC m=+1771.279325398" observedRunningTime="2026-03-12 17:20:30.899799829 +0000 UTC m=+1773.441111168" watchObservedRunningTime="2026-03-12 17:20:30.903280206 +0000 UTC m=+1773.444591545" Mar 12 17:20:30 crc kubenswrapper[5184]: I0312 17:20:30.928218 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/216978f1-3c9a-4a2b-aa02-66106ea49411-host\") pod \"crc-debug-m9l2f\" (UID: \"216978f1-3c9a-4a2b-aa02-66106ea49411\") " pod="openshift-must-gather-t9x9b/crc-debug-m9l2f" Mar 12 17:20:30 crc kubenswrapper[5184]: I0312 17:20:30.928591 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k824j\" (UniqueName: \"kubernetes.io/projected/216978f1-3c9a-4a2b-aa02-66106ea49411-kube-api-access-k824j\") pod \"crc-debug-m9l2f\" (UID: \"216978f1-3c9a-4a2b-aa02-66106ea49411\") " pod="openshift-must-gather-t9x9b/crc-debug-m9l2f" Mar 12 17:20:31 crc kubenswrapper[5184]: I0312 17:20:31.030997 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k824j\" (UniqueName: 
\"kubernetes.io/projected/216978f1-3c9a-4a2b-aa02-66106ea49411-kube-api-access-k824j\") pod \"crc-debug-m9l2f\" (UID: \"216978f1-3c9a-4a2b-aa02-66106ea49411\") " pod="openshift-must-gather-t9x9b/crc-debug-m9l2f" Mar 12 17:20:31 crc kubenswrapper[5184]: I0312 17:20:31.031130 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/216978f1-3c9a-4a2b-aa02-66106ea49411-host\") pod \"crc-debug-m9l2f\" (UID: \"216978f1-3c9a-4a2b-aa02-66106ea49411\") " pod="openshift-must-gather-t9x9b/crc-debug-m9l2f" Mar 12 17:20:31 crc kubenswrapper[5184]: I0312 17:20:31.031252 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/216978f1-3c9a-4a2b-aa02-66106ea49411-host\") pod \"crc-debug-m9l2f\" (UID: \"216978f1-3c9a-4a2b-aa02-66106ea49411\") " pod="openshift-must-gather-t9x9b/crc-debug-m9l2f" Mar 12 17:20:31 crc kubenswrapper[5184]: I0312 17:20:31.060226 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k824j\" (UniqueName: \"kubernetes.io/projected/216978f1-3c9a-4a2b-aa02-66106ea49411-kube-api-access-k824j\") pod \"crc-debug-m9l2f\" (UID: \"216978f1-3c9a-4a2b-aa02-66106ea49411\") " pod="openshift-must-gather-t9x9b/crc-debug-m9l2f" Mar 12 17:20:31 crc kubenswrapper[5184]: I0312 17:20:31.190093 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t9x9b/crc-debug-m9l2f"
Mar 12 17:20:31 crc kubenswrapper[5184]: W0312 17:20:31.230003 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod216978f1_3c9a_4a2b_aa02_66106ea49411.slice/crio-46da8279234db2831c9153cea93fd5a42583728303f36557b53ce1e812c00722 WatchSource:0}: Error finding container 46da8279234db2831c9153cea93fd5a42583728303f36557b53ce1e812c00722: Status 404 returned error can't find the container with id 46da8279234db2831c9153cea93fd5a42583728303f36557b53ce1e812c00722
Mar 12 17:20:31 crc kubenswrapper[5184]: I0312 17:20:31.849257 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t9x9b/crc-debug-m9l2f" event={"ID":"216978f1-3c9a-4a2b-aa02-66106ea49411","Type":"ContainerStarted","Data":"46da8279234db2831c9153cea93fd5a42583728303f36557b53ce1e812c00722"}
Mar 12 17:20:35 crc kubenswrapper[5184]: I0312 17:20:35.399974 5184 scope.go:117] "RemoveContainer" containerID="003716e1434d36a7e89bad17d4bfd64463f69f9907a5c9319c56e5b94d17d924"
Mar 12 17:20:35 crc kubenswrapper[5184]: E0312 17:20:35.400662 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp7pt_openshift-machine-config-operator(7b45c859-3d05-4214-9bd3-2952546f5dea)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea"
Mar 12 17:20:36 crc kubenswrapper[5184]: I0312 17:20:36.287496 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/certified-operators-sdmlv"
Mar 12 17:20:36 crc kubenswrapper[5184]: I0312 17:20:36.287847 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sdmlv"
Mar 12 17:20:36 crc kubenswrapper[5184]: I0312 17:20:36.341514 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sdmlv"
Mar 12 17:20:36 crc kubenswrapper[5184]: I0312 17:20:36.956928 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sdmlv"
Mar 12 17:20:38 crc kubenswrapper[5184]: I0312 17:20:38.009580 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sdmlv"]
Mar 12 17:20:38 crc kubenswrapper[5184]: I0312 17:20:38.220853 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5ggmx"]
Mar 12 17:20:38 crc kubenswrapper[5184]: I0312 17:20:38.238634 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5ggmx"
Mar 12 17:20:38 crc kubenswrapper[5184]: I0312 17:20:38.240491 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5ggmx"]
Mar 12 17:20:38 crc kubenswrapper[5184]: I0312 17:20:38.404330 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c67705f8-1bcd-412c-afa7-94298483ac87-catalog-content\") pod \"redhat-marketplace-5ggmx\" (UID: \"c67705f8-1bcd-412c-afa7-94298483ac87\") " pod="openshift-marketplace/redhat-marketplace-5ggmx"
Mar 12 17:20:38 crc kubenswrapper[5184]: I0312 17:20:38.404436 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvz47\" (UniqueName: \"kubernetes.io/projected/c67705f8-1bcd-412c-afa7-94298483ac87-kube-api-access-tvz47\") pod \"redhat-marketplace-5ggmx\" (UID: \"c67705f8-1bcd-412c-afa7-94298483ac87\") " pod="openshift-marketplace/redhat-marketplace-5ggmx"
Mar 12 17:20:38 crc kubenswrapper[5184]: I0312 17:20:38.404461 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c67705f8-1bcd-412c-afa7-94298483ac87-utilities\") pod \"redhat-marketplace-5ggmx\" (UID: \"c67705f8-1bcd-412c-afa7-94298483ac87\") " pod="openshift-marketplace/redhat-marketplace-5ggmx"
Mar 12 17:20:38 crc kubenswrapper[5184]: I0312 17:20:38.506087 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tvz47\" (UniqueName: \"kubernetes.io/projected/c67705f8-1bcd-412c-afa7-94298483ac87-kube-api-access-tvz47\") pod \"redhat-marketplace-5ggmx\" (UID: \"c67705f8-1bcd-412c-afa7-94298483ac87\") " pod="openshift-marketplace/redhat-marketplace-5ggmx"
Mar 12 17:20:38 crc kubenswrapper[5184]: I0312 17:20:38.506138 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c67705f8-1bcd-412c-afa7-94298483ac87-utilities\") pod \"redhat-marketplace-5ggmx\" (UID: \"c67705f8-1bcd-412c-afa7-94298483ac87\") " pod="openshift-marketplace/redhat-marketplace-5ggmx"
Mar 12 17:20:38 crc kubenswrapper[5184]: I0312 17:20:38.506264 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c67705f8-1bcd-412c-afa7-94298483ac87-catalog-content\") pod \"redhat-marketplace-5ggmx\" (UID: \"c67705f8-1bcd-412c-afa7-94298483ac87\") " pod="openshift-marketplace/redhat-marketplace-5ggmx"
Mar 12 17:20:38 crc kubenswrapper[5184]: I0312 17:20:38.506755 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c67705f8-1bcd-412c-afa7-94298483ac87-catalog-content\") pod \"redhat-marketplace-5ggmx\" (UID: \"c67705f8-1bcd-412c-afa7-94298483ac87\") " pod="openshift-marketplace/redhat-marketplace-5ggmx"
Mar 12 17:20:38 crc kubenswrapper[5184]: I0312 17:20:38.506838 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c67705f8-1bcd-412c-afa7-94298483ac87-utilities\") pod \"redhat-marketplace-5ggmx\" (UID: \"c67705f8-1bcd-412c-afa7-94298483ac87\") " pod="openshift-marketplace/redhat-marketplace-5ggmx"
Mar 12 17:20:38 crc kubenswrapper[5184]: I0312 17:20:38.528839 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvz47\" (UniqueName: \"kubernetes.io/projected/c67705f8-1bcd-412c-afa7-94298483ac87-kube-api-access-tvz47\") pod \"redhat-marketplace-5ggmx\" (UID: \"c67705f8-1bcd-412c-afa7-94298483ac87\") " pod="openshift-marketplace/redhat-marketplace-5ggmx"
Mar 12 17:20:38 crc kubenswrapper[5184]: I0312 17:20:38.585766 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5ggmx"
Mar 12 17:20:38 crc kubenswrapper[5184]: I0312 17:20:38.925575 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sdmlv" podUID="49154c9c-ad18-415b-af29-690a06ef7fdb" containerName="registry-server" containerID="cri-o://0ffad3bd26acad10481405162995acae4575f4090b02ede5c7b6b799c8a229fd" gracePeriod=2
Mar 12 17:20:39 crc kubenswrapper[5184]: I0312 17:20:39.935128 5184 generic.go:358] "Generic (PLEG): container finished" podID="49154c9c-ad18-415b-af29-690a06ef7fdb" containerID="0ffad3bd26acad10481405162995acae4575f4090b02ede5c7b6b799c8a229fd" exitCode=0
Mar 12 17:20:39 crc kubenswrapper[5184]: I0312 17:20:39.935184 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sdmlv" event={"ID":"49154c9c-ad18-415b-af29-690a06ef7fdb","Type":"ContainerDied","Data":"0ffad3bd26acad10481405162995acae4575f4090b02ede5c7b6b799c8a229fd"}
Mar 12 17:20:42 crc kubenswrapper[5184]: I0312 17:20:42.411434 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sdmlv"
Mar 12 17:20:42 crc kubenswrapper[5184]: I0312 17:20:42.475667 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49154c9c-ad18-415b-af29-690a06ef7fdb-catalog-content\") pod \"49154c9c-ad18-415b-af29-690a06ef7fdb\" (UID: \"49154c9c-ad18-415b-af29-690a06ef7fdb\") "
Mar 12 17:20:42 crc kubenswrapper[5184]: I0312 17:20:42.476001 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49154c9c-ad18-415b-af29-690a06ef7fdb-utilities\") pod \"49154c9c-ad18-415b-af29-690a06ef7fdb\" (UID: \"49154c9c-ad18-415b-af29-690a06ef7fdb\") "
Mar 12 17:20:42 crc kubenswrapper[5184]: I0312 17:20:42.476028 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bphhz\" (UniqueName: \"kubernetes.io/projected/49154c9c-ad18-415b-af29-690a06ef7fdb-kube-api-access-bphhz\") pod \"49154c9c-ad18-415b-af29-690a06ef7fdb\" (UID: \"49154c9c-ad18-415b-af29-690a06ef7fdb\") "
Mar 12 17:20:42 crc kubenswrapper[5184]: I0312 17:20:42.476435 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49154c9c-ad18-415b-af29-690a06ef7fdb-utilities" (OuterVolumeSpecName: "utilities") pod "49154c9c-ad18-415b-af29-690a06ef7fdb" (UID: "49154c9c-ad18-415b-af29-690a06ef7fdb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 12 17:20:42 crc kubenswrapper[5184]: I0312 17:20:42.482629 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49154c9c-ad18-415b-af29-690a06ef7fdb-kube-api-access-bphhz" (OuterVolumeSpecName: "kube-api-access-bphhz") pod "49154c9c-ad18-415b-af29-690a06ef7fdb" (UID: "49154c9c-ad18-415b-af29-690a06ef7fdb"). InnerVolumeSpecName "kube-api-access-bphhz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:20:42 crc kubenswrapper[5184]: I0312 17:20:42.521200 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49154c9c-ad18-415b-af29-690a06ef7fdb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49154c9c-ad18-415b-af29-690a06ef7fdb" (UID: "49154c9c-ad18-415b-af29-690a06ef7fdb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 12 17:20:42 crc kubenswrapper[5184]: I0312 17:20:42.578136 5184 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49154c9c-ad18-415b-af29-690a06ef7fdb-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 12 17:20:42 crc kubenswrapper[5184]: I0312 17:20:42.578205 5184 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49154c9c-ad18-415b-af29-690a06ef7fdb-utilities\") on node \"crc\" DevicePath \"\""
Mar 12 17:20:42 crc kubenswrapper[5184]: I0312 17:20:42.578225 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bphhz\" (UniqueName: \"kubernetes.io/projected/49154c9c-ad18-415b-af29-690a06ef7fdb-kube-api-access-bphhz\") on node \"crc\" DevicePath \"\""
Mar 12 17:20:42 crc kubenswrapper[5184]: I0312 17:20:42.610445 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5ggmx"]
Mar 12 17:20:42 crc kubenswrapper[5184]: I0312 17:20:42.961007 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sdmlv"
Mar 12 17:20:42 crc kubenswrapper[5184]: I0312 17:20:42.961033 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sdmlv" event={"ID":"49154c9c-ad18-415b-af29-690a06ef7fdb","Type":"ContainerDied","Data":"05ef0f138583c36f3e0ce40bae46c2f91ea723f94e829d798c68969f85b425dd"}
Mar 12 17:20:42 crc kubenswrapper[5184]: I0312 17:20:42.961366 5184 scope.go:117] "RemoveContainer" containerID="0ffad3bd26acad10481405162995acae4575f4090b02ede5c7b6b799c8a229fd"
Mar 12 17:20:42 crc kubenswrapper[5184]: I0312 17:20:42.964143 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t9x9b/crc-debug-m9l2f" event={"ID":"216978f1-3c9a-4a2b-aa02-66106ea49411","Type":"ContainerStarted","Data":"ff8c9d0104d5f925dfe6854fddac0a4dccbf66ca5a0e13bace1d0fc580643e18"}
Mar 12 17:20:42 crc kubenswrapper[5184]: I0312 17:20:42.970650 5184 generic.go:358] "Generic (PLEG): container finished" podID="c67705f8-1bcd-412c-afa7-94298483ac87" containerID="b30d6ee4dcbfd3855e6e780bfc7911568cc909fd8f28be8357a63cce1e9dcb2b" exitCode=0
Mar 12 17:20:42 crc kubenswrapper[5184]: I0312 17:20:42.970829 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5ggmx" event={"ID":"c67705f8-1bcd-412c-afa7-94298483ac87","Type":"ContainerDied","Data":"b30d6ee4dcbfd3855e6e780bfc7911568cc909fd8f28be8357a63cce1e9dcb2b"}
Mar 12 17:20:42 crc kubenswrapper[5184]: I0312 17:20:42.970861 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5ggmx" event={"ID":"c67705f8-1bcd-412c-afa7-94298483ac87","Type":"ContainerStarted","Data":"f9673575f9abe3c2891fc47dadc8fb6991884486d90b94264c77cba0abe97f17"}
Mar 12 17:20:42 crc kubenswrapper[5184]: I0312 17:20:42.995043 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-t9x9b/crc-debug-m9l2f" podStartSLOduration=2.035529819 podStartE2EDuration="12.99502501s" podCreationTimestamp="2026-03-12 17:20:30 +0000 UTC" firstStartedPulling="2026-03-12 17:20:31.234149019 +0000 UTC m=+1773.775460358" lastFinishedPulling="2026-03-12 17:20:42.19364421 +0000 UTC m=+1784.734955549" observedRunningTime="2026-03-12 17:20:42.986850656 +0000 UTC m=+1785.528162005" watchObservedRunningTime="2026-03-12 17:20:42.99502501 +0000 UTC m=+1785.536336339"
Mar 12 17:20:42 crc kubenswrapper[5184]: I0312 17:20:42.995185 5184 scope.go:117] "RemoveContainer" containerID="5c04a57161ff813a96e1e8c38e9575adcfd1ca868959b07a607a8ecfcb3349c1"
Mar 12 17:20:43 crc kubenswrapper[5184]: I0312 17:20:43.026637 5184 scope.go:117] "RemoveContainer" containerID="d4cb0856ef823e1e6037b52bff58b452ff8a9cdc4425e2356b0e95bac300e494"
Mar 12 17:20:43 crc kubenswrapper[5184]: I0312 17:20:43.053473 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sdmlv"]
Mar 12 17:20:43 crc kubenswrapper[5184]: I0312 17:20:43.065774 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sdmlv"]
Mar 12 17:20:43 crc kubenswrapper[5184]: I0312 17:20:43.988647 5184 generic.go:358] "Generic (PLEG): container finished" podID="c67705f8-1bcd-412c-afa7-94298483ac87" containerID="5169c2828334dd5ca5c59017d9c25bb2280dc0f33286630479b7937f2a4eb5f1" exitCode=0
Mar 12 17:20:43 crc kubenswrapper[5184]: I0312 17:20:43.988751 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5ggmx" event={"ID":"c67705f8-1bcd-412c-afa7-94298483ac87","Type":"ContainerDied","Data":"5169c2828334dd5ca5c59017d9c25bb2280dc0f33286630479b7937f2a4eb5f1"}
Mar 12 17:20:44 crc kubenswrapper[5184]: I0312 17:20:44.411236 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49154c9c-ad18-415b-af29-690a06ef7fdb" path="/var/lib/kubelet/pods/49154c9c-ad18-415b-af29-690a06ef7fdb/volumes"
Mar 12 17:20:45 crc kubenswrapper[5184]: I0312 17:20:45.017404 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5ggmx" event={"ID":"c67705f8-1bcd-412c-afa7-94298483ac87","Type":"ContainerStarted","Data":"84093e8005a595bc541bcddf5a576e62f040b59070f7b7da563e97645951a680"}
Mar 12 17:20:45 crc kubenswrapper[5184]: I0312 17:20:45.041243 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5ggmx" podStartSLOduration=6.369183088 podStartE2EDuration="7.041225184s" podCreationTimestamp="2026-03-12 17:20:38 +0000 UTC" firstStartedPulling="2026-03-12 17:20:42.971732125 +0000 UTC m=+1785.513043464" lastFinishedPulling="2026-03-12 17:20:43.643774221 +0000 UTC m=+1786.185085560" observedRunningTime="2026-03-12 17:20:45.03625846 +0000 UTC m=+1787.577569799" watchObservedRunningTime="2026-03-12 17:20:45.041225184 +0000 UTC m=+1787.582536523"
Mar 12 17:20:47 crc kubenswrapper[5184]: I0312 17:20:47.032135 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-z94sz"]
Mar 12 17:20:47 crc kubenswrapper[5184]: I0312 17:20:47.041149 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-z94sz"]
Mar 12 17:20:47 crc kubenswrapper[5184]: I0312 17:20:47.400299 5184 scope.go:117] "RemoveContainer" containerID="003716e1434d36a7e89bad17d4bfd64463f69f9907a5c9319c56e5b94d17d924"
Mar 12 17:20:47 crc kubenswrapper[5184]: E0312 17:20:47.400689 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp7pt_openshift-machine-config-operator(7b45c859-3d05-4214-9bd3-2952546f5dea)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea"
Mar 12 17:20:48 crc kubenswrapper[5184]: I0312 17:20:48.413680 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9e6df0a-f516-4f34-bd84-d32cea82a0ed" path="/var/lib/kubelet/pods/c9e6df0a-f516-4f34-bd84-d32cea82a0ed/volumes"
Mar 12 17:20:48 crc kubenswrapper[5184]: I0312 17:20:48.586201 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5ggmx"
Mar 12 17:20:48 crc kubenswrapper[5184]: I0312 17:20:48.586247 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-marketplace-5ggmx"
Mar 12 17:20:48 crc kubenswrapper[5184]: I0312 17:20:48.634456 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5ggmx"
Mar 12 17:20:48 crc kubenswrapper[5184]: E0312 17:20:48.948905 5184 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1868914 actualBytes=10240
Mar 12 17:20:49 crc kubenswrapper[5184]: I0312 17:20:49.043777 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zgwz8"]
Mar 12 17:20:49 crc kubenswrapper[5184]: I0312 17:20:49.057398 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zgwz8"]
Mar 12 17:20:49 crc kubenswrapper[5184]: I0312 17:20:49.110106 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5ggmx"
Mar 12 17:20:50 crc kubenswrapper[5184]: I0312 17:20:50.010424 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5ggmx"]
Mar 12 17:20:50 crc kubenswrapper[5184]: I0312 17:20:50.415472 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad" path="/var/lib/kubelet/pods/95c57d5a-7431-4ea7-b69b-f5f7ee50b3ad/volumes"
Mar 12 17:20:51 crc kubenswrapper[5184]: I0312 17:20:51.086138 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5ggmx" podUID="c67705f8-1bcd-412c-afa7-94298483ac87" containerName="registry-server" containerID="cri-o://84093e8005a595bc541bcddf5a576e62f040b59070f7b7da563e97645951a680" gracePeriod=2
Mar 12 17:20:51 crc kubenswrapper[5184]: I0312 17:20:51.575338 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5ggmx"
Mar 12 17:20:51 crc kubenswrapper[5184]: I0312 17:20:51.672321 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c67705f8-1bcd-412c-afa7-94298483ac87-utilities\") pod \"c67705f8-1bcd-412c-afa7-94298483ac87\" (UID: \"c67705f8-1bcd-412c-afa7-94298483ac87\") "
Mar 12 17:20:51 crc kubenswrapper[5184]: I0312 17:20:51.672569 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c67705f8-1bcd-412c-afa7-94298483ac87-catalog-content\") pod \"c67705f8-1bcd-412c-afa7-94298483ac87\" (UID: \"c67705f8-1bcd-412c-afa7-94298483ac87\") "
Mar 12 17:20:51 crc kubenswrapper[5184]: I0312 17:20:51.672790 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvz47\" (UniqueName: \"kubernetes.io/projected/c67705f8-1bcd-412c-afa7-94298483ac87-kube-api-access-tvz47\") pod \"c67705f8-1bcd-412c-afa7-94298483ac87\" (UID: \"c67705f8-1bcd-412c-afa7-94298483ac87\") "
Mar 12 17:20:51 crc kubenswrapper[5184]: I0312 17:20:51.673409 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c67705f8-1bcd-412c-afa7-94298483ac87-utilities" (OuterVolumeSpecName: "utilities") pod "c67705f8-1bcd-412c-afa7-94298483ac87" (UID: "c67705f8-1bcd-412c-afa7-94298483ac87"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 12 17:20:51 crc kubenswrapper[5184]: I0312 17:20:51.679992 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c67705f8-1bcd-412c-afa7-94298483ac87-kube-api-access-tvz47" (OuterVolumeSpecName: "kube-api-access-tvz47") pod "c67705f8-1bcd-412c-afa7-94298483ac87" (UID: "c67705f8-1bcd-412c-afa7-94298483ac87"). InnerVolumeSpecName "kube-api-access-tvz47". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:20:51 crc kubenswrapper[5184]: I0312 17:20:51.705089 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c67705f8-1bcd-412c-afa7-94298483ac87-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c67705f8-1bcd-412c-afa7-94298483ac87" (UID: "c67705f8-1bcd-412c-afa7-94298483ac87"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 12 17:20:51 crc kubenswrapper[5184]: I0312 17:20:51.776093 5184 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c67705f8-1bcd-412c-afa7-94298483ac87-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 12 17:20:51 crc kubenswrapper[5184]: I0312 17:20:51.776141 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tvz47\" (UniqueName: \"kubernetes.io/projected/c67705f8-1bcd-412c-afa7-94298483ac87-kube-api-access-tvz47\") on node \"crc\" DevicePath \"\""
Mar 12 17:20:51 crc kubenswrapper[5184]: I0312 17:20:51.776157 5184 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c67705f8-1bcd-412c-afa7-94298483ac87-utilities\") on node \"crc\" DevicePath \"\""
Mar 12 17:20:52 crc kubenswrapper[5184]: I0312 17:20:52.097167 5184 generic.go:358] "Generic (PLEG): container finished" podID="c67705f8-1bcd-412c-afa7-94298483ac87" containerID="84093e8005a595bc541bcddf5a576e62f040b59070f7b7da563e97645951a680" exitCode=0
Mar 12 17:20:52 crc kubenswrapper[5184]: I0312 17:20:52.097300 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5ggmx"
Mar 12 17:20:52 crc kubenswrapper[5184]: I0312 17:20:52.097305 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5ggmx" event={"ID":"c67705f8-1bcd-412c-afa7-94298483ac87","Type":"ContainerDied","Data":"84093e8005a595bc541bcddf5a576e62f040b59070f7b7da563e97645951a680"}
Mar 12 17:20:52 crc kubenswrapper[5184]: I0312 17:20:52.097388 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5ggmx" event={"ID":"c67705f8-1bcd-412c-afa7-94298483ac87","Type":"ContainerDied","Data":"f9673575f9abe3c2891fc47dadc8fb6991884486d90b94264c77cba0abe97f17"}
Mar 12 17:20:52 crc kubenswrapper[5184]: I0312 17:20:52.097426 5184 scope.go:117] "RemoveContainer" containerID="84093e8005a595bc541bcddf5a576e62f040b59070f7b7da563e97645951a680"
Mar 12 17:20:52 crc kubenswrapper[5184]: I0312 17:20:52.130611 5184 scope.go:117] "RemoveContainer" containerID="5169c2828334dd5ca5c59017d9c25bb2280dc0f33286630479b7937f2a4eb5f1"
Mar 12 17:20:52 crc kubenswrapper[5184]: I0312 17:20:52.138690 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5ggmx"]
Mar 12 17:20:52 crc kubenswrapper[5184]: I0312 17:20:52.147515 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5ggmx"]
Mar 12 17:20:52 crc kubenswrapper[5184]: I0312 17:20:52.152567 5184 scope.go:117] "RemoveContainer" containerID="b30d6ee4dcbfd3855e6e780bfc7911568cc909fd8f28be8357a63cce1e9dcb2b"
Mar 12 17:20:52 crc kubenswrapper[5184]: I0312 17:20:52.194976 5184 scope.go:117] "RemoveContainer" containerID="84093e8005a595bc541bcddf5a576e62f040b59070f7b7da563e97645951a680"
Mar 12 17:20:52 crc kubenswrapper[5184]: E0312 17:20:52.197477 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84093e8005a595bc541bcddf5a576e62f040b59070f7b7da563e97645951a680\": container with ID starting with 84093e8005a595bc541bcddf5a576e62f040b59070f7b7da563e97645951a680 not found: ID does not exist" containerID="84093e8005a595bc541bcddf5a576e62f040b59070f7b7da563e97645951a680"
Mar 12 17:20:52 crc kubenswrapper[5184]: I0312 17:20:52.197526 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84093e8005a595bc541bcddf5a576e62f040b59070f7b7da563e97645951a680"} err="failed to get container status \"84093e8005a595bc541bcddf5a576e62f040b59070f7b7da563e97645951a680\": rpc error: code = NotFound desc = could not find container \"84093e8005a595bc541bcddf5a576e62f040b59070f7b7da563e97645951a680\": container with ID starting with 84093e8005a595bc541bcddf5a576e62f040b59070f7b7da563e97645951a680 not found: ID does not exist"
Mar 12 17:20:52 crc kubenswrapper[5184]: I0312 17:20:52.197556 5184 scope.go:117] "RemoveContainer" containerID="5169c2828334dd5ca5c59017d9c25bb2280dc0f33286630479b7937f2a4eb5f1"
Mar 12 17:20:52 crc kubenswrapper[5184]: E0312 17:20:52.197899 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5169c2828334dd5ca5c59017d9c25bb2280dc0f33286630479b7937f2a4eb5f1\": container with ID starting with 5169c2828334dd5ca5c59017d9c25bb2280dc0f33286630479b7937f2a4eb5f1 not found: ID does not exist" containerID="5169c2828334dd5ca5c59017d9c25bb2280dc0f33286630479b7937f2a4eb5f1"
Mar 12 17:20:52 crc kubenswrapper[5184]: I0312 17:20:52.197922 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5169c2828334dd5ca5c59017d9c25bb2280dc0f33286630479b7937f2a4eb5f1"} err="failed to get container status \"5169c2828334dd5ca5c59017d9c25bb2280dc0f33286630479b7937f2a4eb5f1\": rpc error: code = NotFound desc = could not find container \"5169c2828334dd5ca5c59017d9c25bb2280dc0f33286630479b7937f2a4eb5f1\": container with ID starting with 5169c2828334dd5ca5c59017d9c25bb2280dc0f33286630479b7937f2a4eb5f1 not found: ID does not exist"
Mar 12 17:20:52 crc kubenswrapper[5184]: I0312 17:20:52.197938 5184 scope.go:117] "RemoveContainer" containerID="b30d6ee4dcbfd3855e6e780bfc7911568cc909fd8f28be8357a63cce1e9dcb2b"
Mar 12 17:20:52 crc kubenswrapper[5184]: E0312 17:20:52.198162 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b30d6ee4dcbfd3855e6e780bfc7911568cc909fd8f28be8357a63cce1e9dcb2b\": container with ID starting with b30d6ee4dcbfd3855e6e780bfc7911568cc909fd8f28be8357a63cce1e9dcb2b not found: ID does not exist" containerID="b30d6ee4dcbfd3855e6e780bfc7911568cc909fd8f28be8357a63cce1e9dcb2b"
Mar 12 17:20:52 crc kubenswrapper[5184]: I0312 17:20:52.198182 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b30d6ee4dcbfd3855e6e780bfc7911568cc909fd8f28be8357a63cce1e9dcb2b"} err="failed to get container status \"b30d6ee4dcbfd3855e6e780bfc7911568cc909fd8f28be8357a63cce1e9dcb2b\": rpc error: code = NotFound desc = could not find container \"b30d6ee4dcbfd3855e6e780bfc7911568cc909fd8f28be8357a63cce1e9dcb2b\": container with ID starting with b30d6ee4dcbfd3855e6e780bfc7911568cc909fd8f28be8357a63cce1e9dcb2b not found: ID does not exist"
Mar 12 17:20:52 crc kubenswrapper[5184]: I0312 17:20:52.414825 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c67705f8-1bcd-412c-afa7-94298483ac87" path="/var/lib/kubelet/pods/c67705f8-1bcd-412c-afa7-94298483ac87/volumes"
Mar 12 17:20:59 crc kubenswrapper[5184]: I0312 17:20:59.185676 5184 generic.go:358] "Generic (PLEG): container finished" podID="216978f1-3c9a-4a2b-aa02-66106ea49411" containerID="ff8c9d0104d5f925dfe6854fddac0a4dccbf66ca5a0e13bace1d0fc580643e18" exitCode=0
Mar 12 17:20:59 crc kubenswrapper[5184]: I0312 17:20:59.185860 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t9x9b/crc-debug-m9l2f" event={"ID":"216978f1-3c9a-4a2b-aa02-66106ea49411","Type":"ContainerDied","Data":"ff8c9d0104d5f925dfe6854fddac0a4dccbf66ca5a0e13bace1d0fc580643e18"}
Mar 12 17:20:59 crc kubenswrapper[5184]: I0312 17:20:59.307132 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-99gtj_542903c2-fc88-4085-979a-db3766958392/kube-multus/0.log"
Mar 12 17:20:59 crc kubenswrapper[5184]: I0312 17:20:59.307574 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-99gtj_542903c2-fc88-4085-979a-db3766958392/kube-multus/0.log"
Mar 12 17:20:59 crc kubenswrapper[5184]: I0312 17:20:59.315566 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log"
Mar 12 17:20:59 crc kubenswrapper[5184]: I0312 17:20:59.315858 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log"
Mar 12 17:21:00 crc kubenswrapper[5184]: I0312 17:21:00.341154 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t9x9b/crc-debug-m9l2f"
Mar 12 17:21:00 crc kubenswrapper[5184]: I0312 17:21:00.374501 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-t9x9b/crc-debug-m9l2f"]
Mar 12 17:21:00 crc kubenswrapper[5184]: I0312 17:21:00.383029 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-t9x9b/crc-debug-m9l2f"]
Mar 12 17:21:00 crc kubenswrapper[5184]: I0312 17:21:00.449539 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/216978f1-3c9a-4a2b-aa02-66106ea49411-host\") pod \"216978f1-3c9a-4a2b-aa02-66106ea49411\" (UID: \"216978f1-3c9a-4a2b-aa02-66106ea49411\") "
Mar 12 17:21:00 crc kubenswrapper[5184]: I0312 17:21:00.449651 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/216978f1-3c9a-4a2b-aa02-66106ea49411-host" (OuterVolumeSpecName: "host") pod "216978f1-3c9a-4a2b-aa02-66106ea49411" (UID: "216978f1-3c9a-4a2b-aa02-66106ea49411"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 12 17:21:00 crc kubenswrapper[5184]: I0312 17:21:00.449799 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k824j\" (UniqueName: \"kubernetes.io/projected/216978f1-3c9a-4a2b-aa02-66106ea49411-kube-api-access-k824j\") pod \"216978f1-3c9a-4a2b-aa02-66106ea49411\" (UID: \"216978f1-3c9a-4a2b-aa02-66106ea49411\") "
Mar 12 17:21:00 crc kubenswrapper[5184]: I0312 17:21:00.450253 5184 reconciler_common.go:299] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/216978f1-3c9a-4a2b-aa02-66106ea49411-host\") on node \"crc\" DevicePath \"\""
Mar 12 17:21:00 crc kubenswrapper[5184]: I0312 17:21:00.459449 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/216978f1-3c9a-4a2b-aa02-66106ea49411-kube-api-access-k824j" (OuterVolumeSpecName: "kube-api-access-k824j") pod "216978f1-3c9a-4a2b-aa02-66106ea49411" (UID: "216978f1-3c9a-4a2b-aa02-66106ea49411"). InnerVolumeSpecName "kube-api-access-k824j". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:21:00 crc kubenswrapper[5184]: I0312 17:21:00.551630 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k824j\" (UniqueName: \"kubernetes.io/projected/216978f1-3c9a-4a2b-aa02-66106ea49411-kube-api-access-k824j\") on node \"crc\" DevicePath \"\""
Mar 12 17:21:01 crc kubenswrapper[5184]: I0312 17:21:01.205899 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t9x9b/crc-debug-m9l2f"
Mar 12 17:21:01 crc kubenswrapper[5184]: I0312 17:21:01.205969 5184 scope.go:117] "RemoveContainer" containerID="ff8c9d0104d5f925dfe6854fddac0a4dccbf66ca5a0e13bace1d0fc580643e18"
Mar 12 17:21:01 crc kubenswrapper[5184]: I0312 17:21:01.400038 5184 scope.go:117] "RemoveContainer" containerID="003716e1434d36a7e89bad17d4bfd64463f69f9907a5c9319c56e5b94d17d924"
Mar 12 17:21:01 crc kubenswrapper[5184]: E0312 17:21:01.400571 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp7pt_openshift-machine-config-operator(7b45c859-3d05-4214-9bd3-2952546f5dea)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea"
Mar 12 17:21:01 crc kubenswrapper[5184]: I0312 17:21:01.626227 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-t9x9b/crc-debug-nndpr"]
Mar 12 17:21:01 crc kubenswrapper[5184]: I0312 17:21:01.627225 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c67705f8-1bcd-412c-afa7-94298483ac87" containerName="extract-utilities"
Mar 12 17:21:01 crc kubenswrapper[5184]: I0312 17:21:01.627247 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="c67705f8-1bcd-412c-afa7-94298483ac87" containerName="extract-utilities"
Mar 12 17:21:01 crc kubenswrapper[5184]: I0312 17:21:01.627262 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49154c9c-ad18-415b-af29-690a06ef7fdb" containerName="extract-utilities"
Mar 12 17:21:01 crc kubenswrapper[5184]: I0312 17:21:01.627268 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="49154c9c-ad18-415b-af29-690a06ef7fdb" containerName="extract-utilities"
Mar 12 17:21:01 crc kubenswrapper[5184]: I0312 17:21:01.627288 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c67705f8-1bcd-412c-afa7-94298483ac87" containerName="extract-content"
Mar 12 17:21:01 crc kubenswrapper[5184]: I0312 17:21:01.627296 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="c67705f8-1bcd-412c-afa7-94298483ac87" containerName="extract-content"
Mar 12 17:21:01 crc kubenswrapper[5184]: I0312 17:21:01.627308 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="216978f1-3c9a-4a2b-aa02-66106ea49411" containerName="container-00"
Mar 12 17:21:01 crc kubenswrapper[5184]: I0312 17:21:01.627313 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="216978f1-3c9a-4a2b-aa02-66106ea49411" containerName="container-00"
Mar 12 17:21:01 crc kubenswrapper[5184]: I0312 17:21:01.627339 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49154c9c-ad18-415b-af29-690a06ef7fdb" containerName="extract-content"
Mar 12 17:21:01 crc kubenswrapper[5184]: I0312 17:21:01.627344 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="49154c9c-ad18-415b-af29-690a06ef7fdb" containerName="extract-content"
Mar 12 17:21:01 crc kubenswrapper[5184]: I0312 17:21:01.627354 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49154c9c-ad18-415b-af29-690a06ef7fdb" containerName="registry-server"
Mar 12 17:21:01 crc kubenswrapper[5184]: I0312 17:21:01.627359 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="49154c9c-ad18-415b-af29-690a06ef7fdb" containerName="registry-server"
Mar 12 17:21:01 crc kubenswrapper[5184]: I0312 17:21:01.627412 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c67705f8-1bcd-412c-afa7-94298483ac87" containerName="registry-server"
Mar 12 17:21:01 crc kubenswrapper[5184]: I0312 17:21:01.627417 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="c67705f8-1bcd-412c-afa7-94298483ac87" containerName="registry-server"
Mar 12 17:21:01 crc kubenswrapper[5184]: I0312 17:21:01.627581 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="49154c9c-ad18-415b-af29-690a06ef7fdb" containerName="registry-server"
Mar 12 17:21:01 crc kubenswrapper[5184]: I0312 17:21:01.627600 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="c67705f8-1bcd-412c-afa7-94298483ac87" containerName="registry-server"
Mar 12 17:21:01 crc kubenswrapper[5184]: I0312 17:21:01.627614 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="216978f1-3c9a-4a2b-aa02-66106ea49411" containerName="container-00"
Mar 12 17:21:01 crc kubenswrapper[5184]: I0312 17:21:01.636908 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t9x9b/crc-debug-nndpr"
Mar 12 17:21:01 crc kubenswrapper[5184]: I0312 17:21:01.775128 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbg2g\" (UniqueName: \"kubernetes.io/projected/0a8c4758-de8a-4315-a862-fd4d813534d5-kube-api-access-gbg2g\") pod \"crc-debug-nndpr\" (UID: \"0a8c4758-de8a-4315-a862-fd4d813534d5\") " pod="openshift-must-gather-t9x9b/crc-debug-nndpr"
Mar 12 17:21:01 crc kubenswrapper[5184]: I0312 17:21:01.775571 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0a8c4758-de8a-4315-a862-fd4d813534d5-host\") pod \"crc-debug-nndpr\" (UID: \"0a8c4758-de8a-4315-a862-fd4d813534d5\") " pod="openshift-must-gather-t9x9b/crc-debug-nndpr"
Mar 12 17:21:01 crc kubenswrapper[5184]: I0312 17:21:01.877746 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gbg2g\" (UniqueName: \"kubernetes.io/projected/0a8c4758-de8a-4315-a862-fd4d813534d5-kube-api-access-gbg2g\") pod \"crc-debug-nndpr\" (UID: \"0a8c4758-de8a-4315-a862-fd4d813534d5\") " pod="openshift-must-gather-t9x9b/crc-debug-nndpr"
Mar 12 17:21:01 crc
kubenswrapper[5184]: I0312 17:21:01.877892 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0a8c4758-de8a-4315-a862-fd4d813534d5-host\") pod \"crc-debug-nndpr\" (UID: \"0a8c4758-de8a-4315-a862-fd4d813534d5\") " pod="openshift-must-gather-t9x9b/crc-debug-nndpr" Mar 12 17:21:01 crc kubenswrapper[5184]: I0312 17:21:01.878043 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0a8c4758-de8a-4315-a862-fd4d813534d5-host\") pod \"crc-debug-nndpr\" (UID: \"0a8c4758-de8a-4315-a862-fd4d813534d5\") " pod="openshift-must-gather-t9x9b/crc-debug-nndpr" Mar 12 17:21:01 crc kubenswrapper[5184]: I0312 17:21:01.902832 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbg2g\" (UniqueName: \"kubernetes.io/projected/0a8c4758-de8a-4315-a862-fd4d813534d5-kube-api-access-gbg2g\") pod \"crc-debug-nndpr\" (UID: \"0a8c4758-de8a-4315-a862-fd4d813534d5\") " pod="openshift-must-gather-t9x9b/crc-debug-nndpr" Mar 12 17:21:01 crc kubenswrapper[5184]: I0312 17:21:01.959182 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t9x9b/crc-debug-nndpr" Mar 12 17:21:01 crc kubenswrapper[5184]: W0312 17:21:01.989146 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a8c4758_de8a_4315_a862_fd4d813534d5.slice/crio-c26ad11daebbf4cd2121dbab50f5d0bdb8224a23c31ba5296b67073e26549c3d WatchSource:0}: Error finding container c26ad11daebbf4cd2121dbab50f5d0bdb8224a23c31ba5296b67073e26549c3d: Status 404 returned error can't find the container with id c26ad11daebbf4cd2121dbab50f5d0bdb8224a23c31ba5296b67073e26549c3d Mar 12 17:21:02 crc kubenswrapper[5184]: I0312 17:21:02.217294 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t9x9b/crc-debug-nndpr" event={"ID":"0a8c4758-de8a-4315-a862-fd4d813534d5","Type":"ContainerStarted","Data":"c26ad11daebbf4cd2121dbab50f5d0bdb8224a23c31ba5296b67073e26549c3d"} Mar 12 17:21:02 crc kubenswrapper[5184]: I0312 17:21:02.409658 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="216978f1-3c9a-4a2b-aa02-66106ea49411" path="/var/lib/kubelet/pods/216978f1-3c9a-4a2b-aa02-66106ea49411/volumes" Mar 12 17:21:03 crc kubenswrapper[5184]: I0312 17:21:03.229622 5184 generic.go:358] "Generic (PLEG): container finished" podID="0a8c4758-de8a-4315-a862-fd4d813534d5" containerID="01be60f54ed2a15f1bf65f63553a343fce9c421ae655609f9835baade8c150f0" exitCode=1 Mar 12 17:21:03 crc kubenswrapper[5184]: I0312 17:21:03.229740 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t9x9b/crc-debug-nndpr" event={"ID":"0a8c4758-de8a-4315-a862-fd4d813534d5","Type":"ContainerDied","Data":"01be60f54ed2a15f1bf65f63553a343fce9c421ae655609f9835baade8c150f0"} Mar 12 17:21:03 crc kubenswrapper[5184]: I0312 17:21:03.274289 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-t9x9b/crc-debug-nndpr"] Mar 12 17:21:03 crc kubenswrapper[5184]: I0312 17:21:03.281808 5184 
kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-t9x9b/crc-debug-nndpr"] Mar 12 17:21:04 crc kubenswrapper[5184]: I0312 17:21:04.344167 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t9x9b/crc-debug-nndpr" Mar 12 17:21:04 crc kubenswrapper[5184]: I0312 17:21:04.345904 5184 status_manager.go:895] "Failed to get status for pod" podUID="0a8c4758-de8a-4315-a862-fd4d813534d5" pod="openshift-must-gather-t9x9b/crc-debug-nndpr" err="pods \"crc-debug-nndpr\" is forbidden: User \"system:node:crc\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-t9x9b\": no relationship found between node 'crc' and this object" Mar 12 17:21:04 crc kubenswrapper[5184]: I0312 17:21:04.433744 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbg2g\" (UniqueName: \"kubernetes.io/projected/0a8c4758-de8a-4315-a862-fd4d813534d5-kube-api-access-gbg2g\") pod \"0a8c4758-de8a-4315-a862-fd4d813534d5\" (UID: \"0a8c4758-de8a-4315-a862-fd4d813534d5\") " Mar 12 17:21:04 crc kubenswrapper[5184]: I0312 17:21:04.434020 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0a8c4758-de8a-4315-a862-fd4d813534d5-host\") pod \"0a8c4758-de8a-4315-a862-fd4d813534d5\" (UID: \"0a8c4758-de8a-4315-a862-fd4d813534d5\") " Mar 12 17:21:04 crc kubenswrapper[5184]: I0312 17:21:04.434583 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0a8c4758-de8a-4315-a862-fd4d813534d5-host" (OuterVolumeSpecName: "host") pod "0a8c4758-de8a-4315-a862-fd4d813534d5" (UID: "0a8c4758-de8a-4315-a862-fd4d813534d5"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 12 17:21:04 crc kubenswrapper[5184]: I0312 17:21:04.446395 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a8c4758-de8a-4315-a862-fd4d813534d5-kube-api-access-gbg2g" (OuterVolumeSpecName: "kube-api-access-gbg2g") pod "0a8c4758-de8a-4315-a862-fd4d813534d5" (UID: "0a8c4758-de8a-4315-a862-fd4d813534d5"). InnerVolumeSpecName "kube-api-access-gbg2g". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:21:04 crc kubenswrapper[5184]: I0312 17:21:04.536572 5184 reconciler_common.go:299] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0a8c4758-de8a-4315-a862-fd4d813534d5-host\") on node \"crc\" DevicePath \"\"" Mar 12 17:21:04 crc kubenswrapper[5184]: I0312 17:21:04.536617 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gbg2g\" (UniqueName: \"kubernetes.io/projected/0a8c4758-de8a-4315-a862-fd4d813534d5-kube-api-access-gbg2g\") on node \"crc\" DevicePath \"\"" Mar 12 17:21:05 crc kubenswrapper[5184]: I0312 17:21:05.252407 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t9x9b/crc-debug-nndpr" Mar 12 17:21:05 crc kubenswrapper[5184]: I0312 17:21:05.252444 5184 scope.go:117] "RemoveContainer" containerID="01be60f54ed2a15f1bf65f63553a343fce9c421ae655609f9835baade8c150f0" Mar 12 17:21:06 crc kubenswrapper[5184]: I0312 17:21:06.410352 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a8c4758-de8a-4315-a862-fd4d813534d5" path="/var/lib/kubelet/pods/0a8c4758-de8a-4315-a862-fd4d813534d5/volumes" Mar 12 17:21:15 crc kubenswrapper[5184]: I0312 17:21:15.400898 5184 scope.go:117] "RemoveContainer" containerID="003716e1434d36a7e89bad17d4bfd64463f69f9907a5c9319c56e5b94d17d924" Mar 12 17:21:15 crc kubenswrapper[5184]: E0312 17:21:15.402166 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp7pt_openshift-machine-config-operator(7b45c859-3d05-4214-9bd3-2952546f5dea)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" Mar 12 17:21:20 crc kubenswrapper[5184]: I0312 17:21:20.527755 5184 scope.go:117] "RemoveContainer" containerID="9b12b25da6650591dec1111324062f5144b3e13fa578769a3ae5102967141a09" Mar 12 17:21:20 crc kubenswrapper[5184]: I0312 17:21:20.587757 5184 scope.go:117] "RemoveContainer" containerID="6171b7e72d22708f031e33283a02382b94a60513db25d5c575c7fe34ea59191d" Mar 12 17:21:20 crc kubenswrapper[5184]: I0312 17:21:20.637148 5184 scope.go:117] "RemoveContainer" containerID="d436d59aca7fa024eb05f51de8407da845441cd17d53483239b41fef1e87b97a" Mar 12 17:21:26 crc kubenswrapper[5184]: I0312 17:21:26.399454 5184 scope.go:117] "RemoveContainer" containerID="003716e1434d36a7e89bad17d4bfd64463f69f9907a5c9319c56e5b94d17d924" Mar 12 17:21:26 crc kubenswrapper[5184]: E0312 17:21:26.400288 5184 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp7pt_openshift-machine-config-operator(7b45c859-3d05-4214-9bd3-2952546f5dea)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" Mar 12 17:21:33 crc kubenswrapper[5184]: I0312 17:21:33.052977 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-t8d4k"] Mar 12 17:21:33 crc kubenswrapper[5184]: I0312 17:21:33.063048 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-t8d4k"] Mar 12 17:21:34 crc kubenswrapper[5184]: I0312 17:21:34.414284 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf7acbc2-7b1e-4156-b9d8-c4172585c2e1" path="/var/lib/kubelet/pods/cf7acbc2-7b1e-4156-b9d8-c4172585c2e1/volumes" Mar 12 17:21:39 crc kubenswrapper[5184]: I0312 17:21:39.015257 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-66c64686b6-kwvcj_916be5af-0ed0-4d16-a8b8-2d01b7b81dab/barbican-api/0.log" Mar 12 17:21:39 crc kubenswrapper[5184]: I0312 17:21:39.162784 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-66c64686b6-kwvcj_916be5af-0ed0-4d16-a8b8-2d01b7b81dab/barbican-api-log/0.log" Mar 12 17:21:39 crc kubenswrapper[5184]: I0312 17:21:39.194259 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-84f8b689b8-ptbnb_dfe68657-8277-4927-9fef-88807e21461b/barbican-keystone-listener/0.log" Mar 12 17:21:39 crc kubenswrapper[5184]: I0312 17:21:39.273341 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-84f8b689b8-ptbnb_dfe68657-8277-4927-9fef-88807e21461b/barbican-keystone-listener-log/0.log" Mar 12 17:21:39 crc kubenswrapper[5184]: I0312 17:21:39.372433 5184 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-574685bb47-lxzsh_55402375-344d-4d12-b750-7a3e784b3886/barbican-worker/0.log" Mar 12 17:21:39 crc kubenswrapper[5184]: I0312 17:21:39.399495 5184 scope.go:117] "RemoveContainer" containerID="003716e1434d36a7e89bad17d4bfd64463f69f9907a5c9319c56e5b94d17d924" Mar 12 17:21:39 crc kubenswrapper[5184]: E0312 17:21:39.400082 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp7pt_openshift-machine-config-operator(7b45c859-3d05-4214-9bd3-2952546f5dea)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" Mar 12 17:21:39 crc kubenswrapper[5184]: I0312 17:21:39.420930 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-574685bb47-lxzsh_55402375-344d-4d12-b750-7a3e784b3886/barbican-worker-log/0.log" Mar 12 17:21:39 crc kubenswrapper[5184]: I0312 17:21:39.554509 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-czdtb_00325ca6-5bba-4ac7-8ef7-0b21163fe2af/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 17:21:39 crc kubenswrapper[5184]: I0312 17:21:39.609909 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_45b25746-82f1-4bdf-8246-2f8ef1514dba/ceilometer-central-agent/0.log" Mar 12 17:21:39 crc kubenswrapper[5184]: I0312 17:21:39.719252 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_45b25746-82f1-4bdf-8246-2f8ef1514dba/proxy-httpd/0.log" Mar 12 17:21:39 crc kubenswrapper[5184]: I0312 17:21:39.734534 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_45b25746-82f1-4bdf-8246-2f8ef1514dba/ceilometer-notification-agent/0.log" Mar 
12 17:21:39 crc kubenswrapper[5184]: I0312 17:21:39.803879 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_45b25746-82f1-4bdf-8246-2f8ef1514dba/sg-core/0.log" Mar 12 17:21:39 crc kubenswrapper[5184]: I0312 17:21:39.905531 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-6kllr_af3748c5-3b07-43f8-a444-fb48032538b0/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 17:21:39 crc kubenswrapper[5184]: I0312 17:21:39.996496 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb/cinder-api/0.log" Mar 12 17:21:40 crc kubenswrapper[5184]: I0312 17:21:40.040553 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c5271e23-2cf6-47fe-a7ce-df2cfc24cbbb/cinder-api-log/0.log" Mar 12 17:21:40 crc kubenswrapper[5184]: I0312 17:21:40.158447 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_3f28e94c-3d7e-4cfd-b230-f8939eb1e78f/cinder-scheduler/0.log" Mar 12 17:21:40 crc kubenswrapper[5184]: I0312 17:21:40.186691 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_3f28e94c-3d7e-4cfd-b230-f8939eb1e78f/probe/0.log" Mar 12 17:21:40 crc kubenswrapper[5184]: I0312 17:21:40.308560 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-5msp4_0d0d6421-c7d8-4f60-ad4c-e6049d1a4d78/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 17:21:40 crc kubenswrapper[5184]: I0312 17:21:40.416126 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-ks5mt_288f0750-6757-4078-b6b4-5283c4b54d41/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 17:21:40 crc kubenswrapper[5184]: I0312 17:21:40.511122 5184 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_dnsmasq-dns-579578d6d7-xfplt_772747ff-b4c7-4fda-a596-6aa41c6c46cd/init/0.log" Mar 12 17:21:40 crc kubenswrapper[5184]: I0312 17:21:40.695791 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-579578d6d7-xfplt_772747ff-b4c7-4fda-a596-6aa41c6c46cd/init/0.log" Mar 12 17:21:40 crc kubenswrapper[5184]: I0312 17:21:40.699445 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-579578d6d7-xfplt_772747ff-b4c7-4fda-a596-6aa41c6c46cd/dnsmasq-dns/0.log" Mar 12 17:21:40 crc kubenswrapper[5184]: I0312 17:21:40.730096 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_79cef421-d05c-4274-8c92-337f4b818bff/glance-httpd/0.log" Mar 12 17:21:40 crc kubenswrapper[5184]: I0312 17:21:40.871651 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_79cef421-d05c-4274-8c92-337f4b818bff/glance-log/0.log" Mar 12 17:21:40 crc kubenswrapper[5184]: I0312 17:21:40.918080 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_6f0e3724-623c-4453-8b13-623e9daf508d/glance-httpd/0.log" Mar 12 17:21:40 crc kubenswrapper[5184]: I0312 17:21:40.976464 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_6f0e3724-623c-4453-8b13-623e9daf508d/glance-log/0.log" Mar 12 17:21:41 crc kubenswrapper[5184]: I0312 17:21:41.229930 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7cd5c99b94-hgvbf_a97bd24c-a292-45a7-af77-526fb65b807d/horizon/0.log" Mar 12 17:21:41 crc kubenswrapper[5184]: I0312 17:21:41.238836 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7cd5c99b94-hgvbf_a97bd24c-a292-45a7-af77-526fb65b807d/horizon-log/0.log" Mar 12 17:21:41 crc kubenswrapper[5184]: I0312 17:21:41.345875 5184 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-rlmmf_d60cd34b-9fd7-49e9-a3e0-8c4d2794fb1f/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 17:21:41 crc kubenswrapper[5184]: I0312 17:21:41.543755 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-698bbfd847-dsp75_3f2c8323-78ff-4f44-9741-b5564424b6c2/keystone-api/0.log" Mar 12 17:21:41 crc kubenswrapper[5184]: I0312 17:21:41.567459 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_418fa381-87e7-47c8-9136-6c5d8d8028ce/kube-state-metrics/0.log" Mar 12 17:21:41 crc kubenswrapper[5184]: I0312 17:21:41.854708 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7949fc945c-bns65_dd31119b-8094-4cd5-ad64-6786cd8c7dbe/neutron-api/0.log" Mar 12 17:21:41 crc kubenswrapper[5184]: I0312 17:21:41.936589 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7949fc945c-bns65_dd31119b-8094-4cd5-ad64-6786cd8c7dbe/neutron-httpd/0.log" Mar 12 17:21:42 crc kubenswrapper[5184]: I0312 17:21:42.247253 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_56d8fe13-8a18-4a96-bf19-21711cd1d931/nova-api-log/0.log" Mar 12 17:21:42 crc kubenswrapper[5184]: I0312 17:21:42.269313 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_56d8fe13-8a18-4a96-bf19-21711cd1d931/nova-api-api/0.log" Mar 12 17:21:42 crc kubenswrapper[5184]: I0312 17:21:42.477933 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_4d274afd-6ab1-4652-8093-c3941b617a98/nova-cell0-conductor-conductor/0.log" Mar 12 17:21:42 crc kubenswrapper[5184]: I0312 17:21:42.567848 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_75247eae-d2ef-43a9-a5fc-66bc1f351feb/nova-cell1-conductor-conductor/0.log" Mar 12 17:21:42 crc kubenswrapper[5184]: I0312 17:21:42.680571 
5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_d0e5f35a-1622-4ec0-84e7-56af1f798978/nova-cell1-novncproxy-novncproxy/0.log" Mar 12 17:21:42 crc kubenswrapper[5184]: I0312 17:21:42.854520 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_e8950bf1-dd92-4d86-be29-29f807a65ee1/nova-metadata-log/0.log" Mar 12 17:21:43 crc kubenswrapper[5184]: I0312 17:21:43.048249 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_7dbb540e-524b-4e3c-b2b0-b7019085f4ae/nova-scheduler-scheduler/0.log" Mar 12 17:21:43 crc kubenswrapper[5184]: I0312 17:21:43.103700 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_e8950bf1-dd92-4d86-be29-29f807a65ee1/nova-metadata-metadata/0.log" Mar 12 17:21:43 crc kubenswrapper[5184]: I0312 17:21:43.156185 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9c743985-027b-46df-8a0d-5a246406a2d3/mysql-bootstrap/0.log" Mar 12 17:21:43 crc kubenswrapper[5184]: I0312 17:21:43.339526 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9c743985-027b-46df-8a0d-5a246406a2d3/galera/0.log" Mar 12 17:21:43 crc kubenswrapper[5184]: I0312 17:21:43.347608 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9c743985-027b-46df-8a0d-5a246406a2d3/mysql-bootstrap/0.log" Mar 12 17:21:43 crc kubenswrapper[5184]: I0312 17:21:43.393254 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_043ee884-91ea-43b8-8b26-c8e85e3df303/mysql-bootstrap/0.log" Mar 12 17:21:43 crc kubenswrapper[5184]: I0312 17:21:43.557281 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_043ee884-91ea-43b8-8b26-c8e85e3df303/mysql-bootstrap/0.log" Mar 12 17:21:43 crc kubenswrapper[5184]: I0312 17:21:43.627640 5184 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_043ee884-91ea-43b8-8b26-c8e85e3df303/galera/0.log" Mar 12 17:21:43 crc kubenswrapper[5184]: I0312 17:21:43.653411 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_3c0d941e-36d1-4112-8488-a27d08ec0a8b/openstackclient/0.log" Mar 12 17:21:43 crc kubenswrapper[5184]: I0312 17:21:43.818506 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-dq7bv_d5a0c031-5c42-4559-96f2-82b75e70b804/ovn-controller/0.log" Mar 12 17:21:43 crc kubenswrapper[5184]: I0312 17:21:43.879194 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-nx6zg_afd5d8ed-916e-4ba1-bbe8-bcc7989cdbe8/openstack-network-exporter/0.log" Mar 12 17:21:44 crc kubenswrapper[5184]: I0312 17:21:44.059472 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-vp7v2_303ab57c-305d-48c2-a789-7a124144968d/ovsdb-server-init/0.log" Mar 12 17:21:44 crc kubenswrapper[5184]: I0312 17:21:44.262882 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-vp7v2_303ab57c-305d-48c2-a789-7a124144968d/ovsdb-server/0.log" Mar 12 17:21:44 crc kubenswrapper[5184]: I0312 17:21:44.264301 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-vp7v2_303ab57c-305d-48c2-a789-7a124144968d/ovsdb-server-init/0.log" Mar 12 17:21:44 crc kubenswrapper[5184]: I0312 17:21:44.326541 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-vp7v2_303ab57c-305d-48c2-a789-7a124144968d/ovs-vswitchd/0.log" Mar 12 17:21:44 crc kubenswrapper[5184]: I0312 17:21:44.461746 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_a455b874-8c77-4293-be4f-4379f4fecf49/openstack-network-exporter/0.log" Mar 12 17:21:44 crc kubenswrapper[5184]: I0312 17:21:44.461857 5184 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_a455b874-8c77-4293-be4f-4379f4fecf49/ovn-northd/0.log" Mar 12 17:21:44 crc kubenswrapper[5184]: I0312 17:21:44.622710 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_7755768b-45c3-4cab-be56-d9be437d70d1/openstack-network-exporter/0.log" Mar 12 17:21:44 crc kubenswrapper[5184]: I0312 17:21:44.691066 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_7755768b-45c3-4cab-be56-d9be437d70d1/ovsdbserver-nb/0.log" Mar 12 17:21:44 crc kubenswrapper[5184]: I0312 17:21:44.801256 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_22558944-a035-4296-855e-53505b918f08/openstack-network-exporter/0.log" Mar 12 17:21:44 crc kubenswrapper[5184]: I0312 17:21:44.811934 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_22558944-a035-4296-855e-53505b918f08/ovsdbserver-sb/0.log" Mar 12 17:21:44 crc kubenswrapper[5184]: I0312 17:21:44.967277 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5b65568c5b-pr7s4_d77574d5-b3c7-434b-8499-c38b3e2886e8/placement-api/0.log" Mar 12 17:21:45 crc kubenswrapper[5184]: I0312 17:21:45.014614 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5b65568c5b-pr7s4_d77574d5-b3c7-434b-8499-c38b3e2886e8/placement-log/0.log" Mar 12 17:21:45 crc kubenswrapper[5184]: I0312 17:21:45.130670 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4f29a0e3-8a0d-42dd-b7f8-d9123e29035b/setup-container/0.log" Mar 12 17:21:45 crc kubenswrapper[5184]: I0312 17:21:45.335515 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4f29a0e3-8a0d-42dd-b7f8-d9123e29035b/rabbitmq/0.log" Mar 12 17:21:45 crc kubenswrapper[5184]: I0312 17:21:45.373579 5184 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4f29a0e3-8a0d-42dd-b7f8-d9123e29035b/setup-container/0.log" Mar 12 17:21:45 crc kubenswrapper[5184]: I0312 17:21:45.422217 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_474ecd3e-3438-4cf1-953e-115dcbc40119/setup-container/0.log" Mar 12 17:21:45 crc kubenswrapper[5184]: I0312 17:21:45.617437 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_474ecd3e-3438-4cf1-953e-115dcbc40119/setup-container/0.log" Mar 12 17:21:45 crc kubenswrapper[5184]: I0312 17:21:45.648782 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-hrp4d_6894383b-a7cd-44b4-9a4d-9993eeccc10b/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 17:21:45 crc kubenswrapper[5184]: I0312 17:21:45.730083 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_474ecd3e-3438-4cf1-953e-115dcbc40119/rabbitmq/0.log" Mar 12 17:21:45 crc kubenswrapper[5184]: I0312 17:21:45.876515 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-h59fv_37d0ff5d-5881-483f-ab1f-ff2385c623ad/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 17:21:45 crc kubenswrapper[5184]: I0312 17:21:45.953768 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-7xxcc_cfe56c8f-5774-4342-8770-c872afb09c60/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 17:21:46 crc kubenswrapper[5184]: I0312 17:21:46.088938 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-xbm7s_3fc025dd-36df-47a5-9728-667e76180934/ssh-known-hosts-edpm-deployment/0.log" Mar 12 17:21:46 crc kubenswrapper[5184]: I0312 17:21:46.234940 5184 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-proxy-756587dd69-bfms9_4ac8b1de-edf1-4663-b45b-677f2cd049eb/proxy-httpd/0.log" Mar 12 17:21:46 crc kubenswrapper[5184]: I0312 17:21:46.305643 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-756587dd69-bfms9_4ac8b1de-edf1-4663-b45b-677f2cd049eb/proxy-server/0.log" Mar 12 17:21:46 crc kubenswrapper[5184]: I0312 17:21:46.384069 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-nwctk_0353bd4c-727d-4c46-8954-29b25872ba5a/swift-ring-rebalance/0.log" Mar 12 17:21:46 crc kubenswrapper[5184]: I0312 17:21:46.482002 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2ffae81b-589d-4502-a0a6-777b8d6f98b1/account-auditor/0.log" Mar 12 17:21:46 crc kubenswrapper[5184]: I0312 17:21:46.532047 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2ffae81b-589d-4502-a0a6-777b8d6f98b1/account-reaper/0.log" Mar 12 17:21:46 crc kubenswrapper[5184]: I0312 17:21:46.621656 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2ffae81b-589d-4502-a0a6-777b8d6f98b1/account-replicator/0.log" Mar 12 17:21:46 crc kubenswrapper[5184]: I0312 17:21:46.664763 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2ffae81b-589d-4502-a0a6-777b8d6f98b1/account-server/0.log" Mar 12 17:21:46 crc kubenswrapper[5184]: I0312 17:21:46.731686 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2ffae81b-589d-4502-a0a6-777b8d6f98b1/container-auditor/0.log" Mar 12 17:21:46 crc kubenswrapper[5184]: I0312 17:21:46.758649 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2ffae81b-589d-4502-a0a6-777b8d6f98b1/container-replicator/0.log" Mar 12 17:21:46 crc kubenswrapper[5184]: I0312 17:21:46.824996 5184 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_2ffae81b-589d-4502-a0a6-777b8d6f98b1/container-server/0.log" Mar 12 17:21:46 crc kubenswrapper[5184]: I0312 17:21:46.877420 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2ffae81b-589d-4502-a0a6-777b8d6f98b1/container-updater/0.log" Mar 12 17:21:46 crc kubenswrapper[5184]: I0312 17:21:46.943165 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2ffae81b-589d-4502-a0a6-777b8d6f98b1/object-auditor/0.log" Mar 12 17:21:46 crc kubenswrapper[5184]: I0312 17:21:46.962871 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2ffae81b-589d-4502-a0a6-777b8d6f98b1/object-expirer/0.log" Mar 12 17:21:47 crc kubenswrapper[5184]: I0312 17:21:47.044510 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2ffae81b-589d-4502-a0a6-777b8d6f98b1/object-replicator/0.log" Mar 12 17:21:47 crc kubenswrapper[5184]: I0312 17:21:47.087910 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2ffae81b-589d-4502-a0a6-777b8d6f98b1/object-server/0.log" Mar 12 17:21:47 crc kubenswrapper[5184]: I0312 17:21:47.137121 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2ffae81b-589d-4502-a0a6-777b8d6f98b1/object-updater/0.log" Mar 12 17:21:47 crc kubenswrapper[5184]: I0312 17:21:47.201584 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2ffae81b-589d-4502-a0a6-777b8d6f98b1/rsync/0.log" Mar 12 17:21:47 crc kubenswrapper[5184]: I0312 17:21:47.290917 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_2ffae81b-589d-4502-a0a6-777b8d6f98b1/swift-recon-cron/0.log" Mar 12 17:21:47 crc kubenswrapper[5184]: I0312 17:21:47.416989 5184 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-tszw2_7239abca-a6d9-4694-8cf0-36bd97160cf9/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 17:21:49 crc kubenswrapper[5184]: E0312 17:21:49.189621 5184 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1793861 actualBytes=10240 Mar 12 17:21:50 crc kubenswrapper[5184]: I0312 17:21:50.400560 5184 scope.go:117] "RemoveContainer" containerID="003716e1434d36a7e89bad17d4bfd64463f69f9907a5c9319c56e5b94d17d924" Mar 12 17:21:50 crc kubenswrapper[5184]: E0312 17:21:50.403294 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp7pt_openshift-machine-config-operator(7b45c859-3d05-4214-9bd3-2952546f5dea)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" Mar 12 17:21:50 crc kubenswrapper[5184]: I0312 17:21:50.792502 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_01a0600d-d61f-4822-a177-fbe86d075f38/memcached/0.log" Mar 12 17:22:00 crc kubenswrapper[5184]: I0312 17:22:00.148170 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555602-tsg2b"] Mar 12 17:22:00 crc kubenswrapper[5184]: I0312 17:22:00.150675 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a8c4758-de8a-4315-a862-fd4d813534d5" containerName="container-00" Mar 12 17:22:00 crc kubenswrapper[5184]: I0312 17:22:00.150703 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a8c4758-de8a-4315-a862-fd4d813534d5" containerName="container-00" Mar 12 17:22:00 crc kubenswrapper[5184]: I0312 17:22:00.151113 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a8c4758-de8a-4315-a862-fd4d813534d5" 
containerName="container-00" Mar 12 17:22:00 crc kubenswrapper[5184]: I0312 17:22:00.166247 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555602-tsg2b"] Mar 12 17:22:00 crc kubenswrapper[5184]: I0312 17:22:00.166505 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555602-tsg2b" Mar 12 17:22:00 crc kubenswrapper[5184]: I0312 17:22:00.169809 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\"" Mar 12 17:22:00 crc kubenswrapper[5184]: I0312 17:22:00.172688 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-f4gpz\"" Mar 12 17:22:00 crc kubenswrapper[5184]: I0312 17:22:00.172914 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\"" Mar 12 17:22:00 crc kubenswrapper[5184]: I0312 17:22:00.207591 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd2hq\" (UniqueName: \"kubernetes.io/projected/d0a50cab-4eef-4f28-a0b3-4df157d40732-kube-api-access-hd2hq\") pod \"auto-csr-approver-29555602-tsg2b\" (UID: \"d0a50cab-4eef-4f28-a0b3-4df157d40732\") " pod="openshift-infra/auto-csr-approver-29555602-tsg2b" Mar 12 17:22:00 crc kubenswrapper[5184]: I0312 17:22:00.310558 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hd2hq\" (UniqueName: \"kubernetes.io/projected/d0a50cab-4eef-4f28-a0b3-4df157d40732-kube-api-access-hd2hq\") pod \"auto-csr-approver-29555602-tsg2b\" (UID: \"d0a50cab-4eef-4f28-a0b3-4df157d40732\") " pod="openshift-infra/auto-csr-approver-29555602-tsg2b" Mar 12 17:22:00 crc kubenswrapper[5184]: I0312 17:22:00.331945 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd2hq\" 
(UniqueName: \"kubernetes.io/projected/d0a50cab-4eef-4f28-a0b3-4df157d40732-kube-api-access-hd2hq\") pod \"auto-csr-approver-29555602-tsg2b\" (UID: \"d0a50cab-4eef-4f28-a0b3-4df157d40732\") " pod="openshift-infra/auto-csr-approver-29555602-tsg2b" Mar 12 17:22:00 crc kubenswrapper[5184]: I0312 17:22:00.508236 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555602-tsg2b" Mar 12 17:22:00 crc kubenswrapper[5184]: I0312 17:22:00.943199 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555602-tsg2b"] Mar 12 17:22:00 crc kubenswrapper[5184]: W0312 17:22:00.945700 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0a50cab_4eef_4f28_a0b3_4df157d40732.slice/crio-6f46911288dc2f31f9cccd8056acd9dfc2b6e9976203c9ebd1d73df0625adc14 WatchSource:0}: Error finding container 6f46911288dc2f31f9cccd8056acd9dfc2b6e9976203c9ebd1d73df0625adc14: Status 404 returned error can't find the container with id 6f46911288dc2f31f9cccd8056acd9dfc2b6e9976203c9ebd1d73df0625adc14 Mar 12 17:22:01 crc kubenswrapper[5184]: I0312 17:22:01.772827 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555602-tsg2b" event={"ID":"d0a50cab-4eef-4f28-a0b3-4df157d40732","Type":"ContainerStarted","Data":"6f46911288dc2f31f9cccd8056acd9dfc2b6e9976203c9ebd1d73df0625adc14"} Mar 12 17:22:02 crc kubenswrapper[5184]: I0312 17:22:02.410602 5184 scope.go:117] "RemoveContainer" containerID="003716e1434d36a7e89bad17d4bfd64463f69f9907a5c9319c56e5b94d17d924" Mar 12 17:22:02 crc kubenswrapper[5184]: E0312 17:22:02.411536 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-cp7pt_openshift-machine-config-operator(7b45c859-3d05-4214-9bd3-2952546f5dea)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" Mar 12 17:22:02 crc kubenswrapper[5184]: I0312 17:22:02.786728 5184 generic.go:358] "Generic (PLEG): container finished" podID="d0a50cab-4eef-4f28-a0b3-4df157d40732" containerID="5e073420f15bb5280368407b650d7ff45e5c46a435a13acc813dbc396ce93ffd" exitCode=0 Mar 12 17:22:02 crc kubenswrapper[5184]: I0312 17:22:02.786857 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555602-tsg2b" event={"ID":"d0a50cab-4eef-4f28-a0b3-4df157d40732","Type":"ContainerDied","Data":"5e073420f15bb5280368407b650d7ff45e5c46a435a13acc813dbc396ce93ffd"} Mar 12 17:22:04 crc kubenswrapper[5184]: I0312 17:22:04.153709 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555602-tsg2b" Mar 12 17:22:04 crc kubenswrapper[5184]: I0312 17:22:04.183563 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hd2hq\" (UniqueName: \"kubernetes.io/projected/d0a50cab-4eef-4f28-a0b3-4df157d40732-kube-api-access-hd2hq\") pod \"d0a50cab-4eef-4f28-a0b3-4df157d40732\" (UID: \"d0a50cab-4eef-4f28-a0b3-4df157d40732\") " Mar 12 17:22:04 crc kubenswrapper[5184]: I0312 17:22:04.191982 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0a50cab-4eef-4f28-a0b3-4df157d40732-kube-api-access-hd2hq" (OuterVolumeSpecName: "kube-api-access-hd2hq") pod "d0a50cab-4eef-4f28-a0b3-4df157d40732" (UID: "d0a50cab-4eef-4f28-a0b3-4df157d40732"). InnerVolumeSpecName "kube-api-access-hd2hq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:22:04 crc kubenswrapper[5184]: I0312 17:22:04.285782 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hd2hq\" (UniqueName: \"kubernetes.io/projected/d0a50cab-4eef-4f28-a0b3-4df157d40732-kube-api-access-hd2hq\") on node \"crc\" DevicePath \"\"" Mar 12 17:22:04 crc kubenswrapper[5184]: I0312 17:22:04.810311 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555602-tsg2b" Mar 12 17:22:04 crc kubenswrapper[5184]: I0312 17:22:04.810518 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555602-tsg2b" event={"ID":"d0a50cab-4eef-4f28-a0b3-4df157d40732","Type":"ContainerDied","Data":"6f46911288dc2f31f9cccd8056acd9dfc2b6e9976203c9ebd1d73df0625adc14"} Mar 12 17:22:04 crc kubenswrapper[5184]: I0312 17:22:04.810877 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f46911288dc2f31f9cccd8056acd9dfc2b6e9976203c9ebd1d73df0625adc14" Mar 12 17:22:05 crc kubenswrapper[5184]: I0312 17:22:05.254615 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555596-t4wg8"] Mar 12 17:22:05 crc kubenswrapper[5184]: I0312 17:22:05.265013 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555596-t4wg8"] Mar 12 17:22:06 crc kubenswrapper[5184]: I0312 17:22:06.419585 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f69d7985-6165-4cb8-8e7a-8ffd819b0243" path="/var/lib/kubelet/pods/f69d7985-6165-4cb8-8e7a-8ffd819b0243/volumes" Mar 12 17:22:09 crc kubenswrapper[5184]: I0312 17:22:09.657554 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecebrb5h_01a6e05d-ea1c-47f7-a88c-073127e41f25/util/0.log" Mar 12 17:22:09 crc kubenswrapper[5184]: I0312 17:22:09.815756 
5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecebrb5h_01a6e05d-ea1c-47f7-a88c-073127e41f25/util/0.log" Mar 12 17:22:09 crc kubenswrapper[5184]: I0312 17:22:09.851786 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecebrb5h_01a6e05d-ea1c-47f7-a88c-073127e41f25/pull/0.log" Mar 12 17:22:09 crc kubenswrapper[5184]: I0312 17:22:09.865003 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecebrb5h_01a6e05d-ea1c-47f7-a88c-073127e41f25/pull/0.log" Mar 12 17:22:10 crc kubenswrapper[5184]: I0312 17:22:10.000643 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecebrb5h_01a6e05d-ea1c-47f7-a88c-073127e41f25/util/0.log" Mar 12 17:22:10 crc kubenswrapper[5184]: I0312 17:22:10.018334 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecebrb5h_01a6e05d-ea1c-47f7-a88c-073127e41f25/extract/0.log" Mar 12 17:22:10 crc kubenswrapper[5184]: I0312 17:22:10.038549 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecebrb5h_01a6e05d-ea1c-47f7-a88c-073127e41f25/pull/0.log" Mar 12 17:22:10 crc kubenswrapper[5184]: I0312 17:22:10.438477 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-c845c877d-sh7h4_71cd8922-2260-4302-b49f-8ebb0084bc3a/manager/0.log" Mar 12 17:22:10 crc kubenswrapper[5184]: I0312 17:22:10.746499 5184 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_glance-operator-controller-manager-6f84c59bb4-9vdqk_9aacc6d0-007b-4eff-95c0-1e6347226980/manager/0.log" Mar 12 17:22:10 crc kubenswrapper[5184]: I0312 17:22:10.863710 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69d9587945-trclj_d1faa83d-1fb7-4c0a-8358-6b02b46d6c9f/manager/0.log" Mar 12 17:22:11 crc kubenswrapper[5184]: I0312 17:22:11.077225 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-776f58c496-nxd4w_b264a369-29b1-4524-b1ef-ea0d61042e1b/manager/0.log" Mar 12 17:22:11 crc kubenswrapper[5184]: I0312 17:22:11.496444 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-579966755f-k6ws5_4efa9263-cab7-4221-b570-90c929ebf82b/manager/0.log" Mar 12 17:22:11 crc kubenswrapper[5184]: I0312 17:22:11.743150 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6564988d95-s8gjk_ba537fdf-14f9-47e1-a8c6-4732c4d9dfeb/manager/0.log" Mar 12 17:22:11 crc kubenswrapper[5184]: I0312 17:22:11.811511 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-54654cd4c7-x7588_a9fa8671-f968-45fd-a5bc-fe439e771792/manager/0.log" Mar 12 17:22:11 crc kubenswrapper[5184]: I0312 17:22:11.996167 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-849569668d-fm84v_325b0a39-7766-4c7a-a5b7-c29551f18550/manager/0.log" Mar 12 17:22:12 crc kubenswrapper[5184]: I0312 17:22:12.064460 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-847cdc49c9-7smvt_f6a852ac-a01c-467a-96c7-d65549b557ad/manager/0.log" Mar 12 17:22:12 crc kubenswrapper[5184]: I0312 17:22:12.191331 5184 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-698d4c86bf-qhf48_cab9970b-99b6-4c36-a816-4cbe9ca206f8/manager/0.log" Mar 12 17:22:12 crc kubenswrapper[5184]: I0312 17:22:12.425815 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-785ff4d9b5-jvgbj_ed4f1b6b-e4af-4a56-bb94-5c48640f67ce/manager/0.log" Mar 12 17:22:12 crc kubenswrapper[5184]: I0312 17:22:12.627133 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5f84d557f9-hvp27_6f38ee12-d676-43a1-9f44-d347f24dfbda/manager/0.log" Mar 12 17:22:12 crc kubenswrapper[5184]: I0312 17:22:12.700314 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-8664bfd6f-t7d7f_a149730f-e80a-4abf-8efa-29fb5820c9ae/manager/0.log" Mar 12 17:22:12 crc kubenswrapper[5184]: I0312 17:22:12.991840 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-dffd87f79-nrd8n_97ae1add-72a8-4cfb-8cb4-45b33d39a1b8/manager/0.log" Mar 12 17:22:13 crc kubenswrapper[5184]: I0312 17:22:13.085809 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-79c564bd4f-hcmts_757d5777-73a0-4653-8eba-303a5a8552ec/operator/0.log" Mar 12 17:22:13 crc kubenswrapper[5184]: I0312 17:22:13.332207 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-wwk4r_75472062-da1b-4ccc-9522-e4560eb84997/registry-server/0.log" Mar 12 17:22:13 crc kubenswrapper[5184]: I0312 17:22:13.399531 5184 scope.go:117] "RemoveContainer" containerID="003716e1434d36a7e89bad17d4bfd64463f69f9907a5c9319c56e5b94d17d924" Mar 12 17:22:13 crc kubenswrapper[5184]: E0312 17:22:13.399942 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp7pt_openshift-machine-config-operator(7b45c859-3d05-4214-9bd3-2952546f5dea)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" Mar 12 17:22:13 crc kubenswrapper[5184]: I0312 17:22:13.704875 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-8558b89bff-8v768_4fad3169-60c8-49f1-ad8c-ec6fb4282ddc/manager/0.log" Mar 12 17:22:13 crc kubenswrapper[5184]: I0312 17:22:13.770803 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-7dd8b74947-8jj2t_c5bf50f4-0265-46ba-98ad-c8c8245664d4/manager/0.log" Mar 12 17:22:13 crc kubenswrapper[5184]: I0312 17:22:13.937706 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-85d9b55b6-v89lf_5717f5e6-3e6e-4585-b9a6-bcc31f707080/operator/0.log" Mar 12 17:22:14 crc kubenswrapper[5184]: I0312 17:22:14.285602 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-865956cc65-vqfsg_fcbfcf8d-d998-4e58-8218-04476a811cf1/manager/0.log" Mar 12 17:22:14 crc kubenswrapper[5184]: I0312 17:22:14.332020 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6cd977d774-djcpf_94ee0f08-539e-4c3c-a54a-46e35cd20e10/manager/0.log" Mar 12 17:22:14 crc kubenswrapper[5184]: I0312 17:22:14.444771 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-74d567479f-nrs7s_be48e16f-42da-4910-81d4-1c10498247f7/manager/0.log" Mar 12 17:22:14 crc kubenswrapper[5184]: I0312 17:22:14.575535 5184 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-688f7d67f5-ggjlw_8e771b12-3698-427d-a93a-2293244e2171/manager/0.log" Mar 12 17:22:14 crc kubenswrapper[5184]: I0312 17:22:14.744548 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-58ddd4554c-m4npf_8edf3dec-1386-4665-8a1a-ac779905f180/manager/0.log" Mar 12 17:22:16 crc kubenswrapper[5184]: I0312 17:22:16.935017 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-68b4f9dfcc-twj9v_66eb4c90-9578-461e-aaae-6385546ed865/manager/0.log" Mar 12 17:22:20 crc kubenswrapper[5184]: I0312 17:22:20.796659 5184 scope.go:117] "RemoveContainer" containerID="3b86f8065281c734088f1750bdf4a5df82ac5321efe2e8b0e2f29801763c9c22" Mar 12 17:22:20 crc kubenswrapper[5184]: I0312 17:22:20.835961 5184 scope.go:117] "RemoveContainer" containerID="170d47a303bdf2f646b461a158ec79a92ff1a8894b234eae9945aeee2bfb99f0" Mar 12 17:22:25 crc kubenswrapper[5184]: I0312 17:22:25.399858 5184 scope.go:117] "RemoveContainer" containerID="003716e1434d36a7e89bad17d4bfd64463f69f9907a5c9319c56e5b94d17d924" Mar 12 17:22:25 crc kubenswrapper[5184]: E0312 17:22:25.400705 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp7pt_openshift-machine-config-operator(7b45c859-3d05-4214-9bd3-2952546f5dea)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" Mar 12 17:22:33 crc kubenswrapper[5184]: I0312 17:22:33.143160 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-75ffdb6fcd-m7sz7_b6b85787-d6d2-48df-9830-ca4532adee38/control-plane-machine-set-operator/0.log" Mar 12 17:22:33 crc 
kubenswrapper[5184]: I0312 17:22:33.754825 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-755bb95488-d29hz_36cf2a72-16f1-4b2c-9d20-9d1ad0af2ce6/kube-rbac-proxy/0.log" Mar 12 17:22:33 crc kubenswrapper[5184]: I0312 17:22:33.801005 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-755bb95488-d29hz_36cf2a72-16f1-4b2c-9d20-9d1ad0af2ce6/machine-api-operator/0.log" Mar 12 17:22:40 crc kubenswrapper[5184]: I0312 17:22:40.400252 5184 scope.go:117] "RemoveContainer" containerID="003716e1434d36a7e89bad17d4bfd64463f69f9907a5c9319c56e5b94d17d924" Mar 12 17:22:40 crc kubenswrapper[5184]: E0312 17:22:40.400959 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp7pt_openshift-machine-config-operator(7b45c859-3d05-4214-9bd3-2952546f5dea)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" Mar 12 17:22:46 crc kubenswrapper[5184]: I0312 17:22:46.237903 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-7b8b89f89d-wb6fn_c25e7c99-ce20-4a74-84fc-fb24c01c931b/cert-manager-controller/0.log" Mar 12 17:22:46 crc kubenswrapper[5184]: I0312 17:22:46.377921 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f9fdd5dd5-hmzwt_0bcd5938-7f0c-4d2d-8f96-ef9933012381/cert-manager-cainjector/0.log" Mar 12 17:22:46 crc kubenswrapper[5184]: I0312 17:22:46.444399 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-769f6b94cb-n4sng_f433a5f6-b857-462d-9896-fcbf044da648/cert-manager-webhook/0.log" Mar 12 17:22:49 crc kubenswrapper[5184]: E0312 17:22:49.119568 5184 prober.go:256] "Unable to write all bytes from 
execInContainer" err="short write" expectedBytes=1793854 actualBytes=10240 Mar 12 17:22:51 crc kubenswrapper[5184]: I0312 17:22:51.400470 5184 scope.go:117] "RemoveContainer" containerID="003716e1434d36a7e89bad17d4bfd64463f69f9907a5c9319c56e5b94d17d924" Mar 12 17:22:51 crc kubenswrapper[5184]: E0312 17:22:51.400930 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp7pt_openshift-machine-config-operator(7b45c859-3d05-4214-9bd3-2952546f5dea)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" Mar 12 17:22:59 crc kubenswrapper[5184]: I0312 17:22:59.683528 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-74686bb6b4-ljbdg_b8fc9b5b-64a1-410e-aa92-aec4333f8965/nmstate-console-plugin/0.log" Mar 12 17:22:59 crc kubenswrapper[5184]: I0312 17:22:59.866995 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-288w4_89ad2c51-212e-4a3a-882d-f7c2aeb04a94/nmstate-handler/0.log" Mar 12 17:22:59 crc kubenswrapper[5184]: I0312 17:22:59.883081 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f888697b-qtghx_0aa67d09-261d-4bd2-8341-b81d6d2a3caa/kube-rbac-proxy/0.log" Mar 12 17:22:59 crc kubenswrapper[5184]: I0312 17:22:59.943185 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f888697b-qtghx_0aa67d09-261d-4bd2-8341-b81d6d2a3caa/nmstate-metrics/0.log" Mar 12 17:23:00 crc kubenswrapper[5184]: I0312 17:23:00.021785 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-54b58fcbc5-sdfcf_614e8fcc-0a65-404c-a92c-b0a2834e4d92/nmstate-operator/0.log" Mar 12 17:23:00 crc kubenswrapper[5184]: I0312 
17:23:00.146487 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-78fdd78d8b-fxzg8_9e52025d-b566-4965-8664-9d14f8f05dc6/nmstate-webhook/0.log" Mar 12 17:23:02 crc kubenswrapper[5184]: I0312 17:23:02.399909 5184 scope.go:117] "RemoveContainer" containerID="003716e1434d36a7e89bad17d4bfd64463f69f9907a5c9319c56e5b94d17d924" Mar 12 17:23:02 crc kubenswrapper[5184]: E0312 17:23:02.400966 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp7pt_openshift-machine-config-operator(7b45c859-3d05-4214-9bd3-2952546f5dea)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" Mar 12 17:23:14 crc kubenswrapper[5184]: I0312 17:23:14.402664 5184 scope.go:117] "RemoveContainer" containerID="003716e1434d36a7e89bad17d4bfd64463f69f9907a5c9319c56e5b94d17d924" Mar 12 17:23:14 crc kubenswrapper[5184]: E0312 17:23:14.403626 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp7pt_openshift-machine-config-operator(7b45c859-3d05-4214-9bd3-2952546f5dea)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" Mar 12 17:23:27 crc kubenswrapper[5184]: I0312 17:23:27.363979 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-774d88f846-k2qfn_93ee07df-1490-4be4-97d0-f3f5b20ceb90/kube-rbac-proxy/0.log" Mar 12 17:23:27 crc kubenswrapper[5184]: I0312 17:23:27.399506 5184 scope.go:117] "RemoveContainer" containerID="003716e1434d36a7e89bad17d4bfd64463f69f9907a5c9319c56e5b94d17d924" Mar 12 17:23:27 crc kubenswrapper[5184]: E0312 
17:23:27.399848 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp7pt_openshift-machine-config-operator(7b45c859-3d05-4214-9bd3-2952546f5dea)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" Mar 12 17:23:27 crc kubenswrapper[5184]: I0312 17:23:27.600110 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-684cb6d9ff-8wmmx_919cbf6b-6f23-47cb-897e-759c6ad20510/manager/0.log" Mar 12 17:23:27 crc kubenswrapper[5184]: I0312 17:23:27.737029 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-774d88f846-k2qfn_93ee07df-1490-4be4-97d0-f3f5b20ceb90/controller/0.log" Mar 12 17:23:27 crc kubenswrapper[5184]: I0312 17:23:27.792440 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6ccc8dfc97-z8s4v_693b4de7-38e7-4b99-8e08-85943be754aa/webhook-server/0.log" Mar 12 17:23:27 crc kubenswrapper[5184]: I0312 17:23:27.853825 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-5ktvq_adae6bbe-80fb-4692-b73f-402356ce10c4/kube-rbac-proxy/0.log" Mar 12 17:23:28 crc kubenswrapper[5184]: I0312 17:23:28.262297 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-5ktvq_adae6bbe-80fb-4692-b73f-402356ce10c4/speaker/0.log" Mar 12 17:23:41 crc kubenswrapper[5184]: I0312 17:23:41.093146 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5f0d783fea323979717ec4113d968dc42dcfaeaad7ccdc94e6a93c9e91mcl8r_8cabbcda-f15b-4907-a210-ec5722d93f79/util/0.log" Mar 12 17:23:41 crc kubenswrapper[5184]: I0312 17:23:41.293273 5184 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5f0d783fea323979717ec4113d968dc42dcfaeaad7ccdc94e6a93c9e91mcl8r_8cabbcda-f15b-4907-a210-ec5722d93f79/util/0.log" Mar 12 17:23:41 crc kubenswrapper[5184]: I0312 17:23:41.312629 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5f0d783fea323979717ec4113d968dc42dcfaeaad7ccdc94e6a93c9e91mcl8r_8cabbcda-f15b-4907-a210-ec5722d93f79/pull/0.log" Mar 12 17:23:41 crc kubenswrapper[5184]: I0312 17:23:41.386131 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5f0d783fea323979717ec4113d968dc42dcfaeaad7ccdc94e6a93c9e91mcl8r_8cabbcda-f15b-4907-a210-ec5722d93f79/pull/0.log" Mar 12 17:23:41 crc kubenswrapper[5184]: I0312 17:23:41.399789 5184 scope.go:117] "RemoveContainer" containerID="003716e1434d36a7e89bad17d4bfd64463f69f9907a5c9319c56e5b94d17d924" Mar 12 17:23:41 crc kubenswrapper[5184]: E0312 17:23:41.400207 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp7pt_openshift-machine-config-operator(7b45c859-3d05-4214-9bd3-2952546f5dea)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" Mar 12 17:23:41 crc kubenswrapper[5184]: I0312 17:23:41.527019 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5f0d783fea323979717ec4113d968dc42dcfaeaad7ccdc94e6a93c9e91mcl8r_8cabbcda-f15b-4907-a210-ec5722d93f79/util/0.log" Mar 12 17:23:41 crc kubenswrapper[5184]: I0312 17:23:41.528333 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5f0d783fea323979717ec4113d968dc42dcfaeaad7ccdc94e6a93c9e91mcl8r_8cabbcda-f15b-4907-a210-ec5722d93f79/extract/0.log" Mar 12 17:23:41 crc kubenswrapper[5184]: I0312 17:23:41.550616 5184 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5f0d783fea323979717ec4113d968dc42dcfaeaad7ccdc94e6a93c9e91mcl8r_8cabbcda-f15b-4907-a210-ec5722d93f79/pull/0.log" Mar 12 17:23:41 crc kubenswrapper[5184]: I0312 17:23:41.685180 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2mh4c_a2dc3fc5-5eb0-4462-a6d4-8dca698af7fb/extract-utilities/0.log" Mar 12 17:23:41 crc kubenswrapper[5184]: I0312 17:23:41.870119 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2mh4c_a2dc3fc5-5eb0-4462-a6d4-8dca698af7fb/extract-content/0.log" Mar 12 17:23:41 crc kubenswrapper[5184]: I0312 17:23:41.870649 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2mh4c_a2dc3fc5-5eb0-4462-a6d4-8dca698af7fb/extract-utilities/0.log" Mar 12 17:23:41 crc kubenswrapper[5184]: I0312 17:23:41.900165 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2mh4c_a2dc3fc5-5eb0-4462-a6d4-8dca698af7fb/extract-content/0.log" Mar 12 17:23:42 crc kubenswrapper[5184]: I0312 17:23:42.008608 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2mh4c_a2dc3fc5-5eb0-4462-a6d4-8dca698af7fb/extract-utilities/0.log" Mar 12 17:23:42 crc kubenswrapper[5184]: I0312 17:23:42.050951 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2mh4c_a2dc3fc5-5eb0-4462-a6d4-8dca698af7fb/extract-content/0.log" Mar 12 17:23:42 crc kubenswrapper[5184]: I0312 17:23:42.205307 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6zzzp_b7d0b0f3-57cc-443a-85ee-ac686e4f3e52/extract-utilities/0.log" Mar 12 17:23:42 crc kubenswrapper[5184]: I0312 17:23:42.270329 5184 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-2mh4c_a2dc3fc5-5eb0-4462-a6d4-8dca698af7fb/registry-server/0.log" Mar 12 17:23:42 crc kubenswrapper[5184]: I0312 17:23:42.416570 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6zzzp_b7d0b0f3-57cc-443a-85ee-ac686e4f3e52/extract-content/0.log" Mar 12 17:23:42 crc kubenswrapper[5184]: I0312 17:23:42.425206 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6zzzp_b7d0b0f3-57cc-443a-85ee-ac686e4f3e52/extract-content/0.log" Mar 12 17:23:42 crc kubenswrapper[5184]: I0312 17:23:42.450110 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6zzzp_b7d0b0f3-57cc-443a-85ee-ac686e4f3e52/extract-utilities/0.log" Mar 12 17:23:42 crc kubenswrapper[5184]: I0312 17:23:42.569583 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6zzzp_b7d0b0f3-57cc-443a-85ee-ac686e4f3e52/extract-utilities/0.log" Mar 12 17:23:42 crc kubenswrapper[5184]: I0312 17:23:42.573877 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6zzzp_b7d0b0f3-57cc-443a-85ee-ac686e4f3e52/extract-content/0.log" Mar 12 17:23:42 crc kubenswrapper[5184]: I0312 17:23:42.804816 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d1f2cf7c6f4ecb15e69cc0ed07a53b6f169b7a9d46d563b1a9827dff837qpzv_929a96db-d440-4095-bfed-bc35b90447eb/util/0.log" Mar 12 17:23:42 crc kubenswrapper[5184]: I0312 17:23:42.891303 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6zzzp_b7d0b0f3-57cc-443a-85ee-ac686e4f3e52/registry-server/0.log" Mar 12 17:23:42 crc kubenswrapper[5184]: I0312 17:23:42.933812 5184 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_d1f2cf7c6f4ecb15e69cc0ed07a53b6f169b7a9d46d563b1a9827dff837qpzv_929a96db-d440-4095-bfed-bc35b90447eb/util/0.log" Mar 12 17:23:42 crc kubenswrapper[5184]: I0312 17:23:42.944826 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d1f2cf7c6f4ecb15e69cc0ed07a53b6f169b7a9d46d563b1a9827dff837qpzv_929a96db-d440-4095-bfed-bc35b90447eb/pull/0.log" Mar 12 17:23:43 crc kubenswrapper[5184]: I0312 17:23:43.006143 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d1f2cf7c6f4ecb15e69cc0ed07a53b6f169b7a9d46d563b1a9827dff837qpzv_929a96db-d440-4095-bfed-bc35b90447eb/pull/0.log" Mar 12 17:23:43 crc kubenswrapper[5184]: I0312 17:23:43.139801 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d1f2cf7c6f4ecb15e69cc0ed07a53b6f169b7a9d46d563b1a9827dff837qpzv_929a96db-d440-4095-bfed-bc35b90447eb/util/0.log" Mar 12 17:23:43 crc kubenswrapper[5184]: I0312 17:23:43.142097 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d1f2cf7c6f4ecb15e69cc0ed07a53b6f169b7a9d46d563b1a9827dff837qpzv_929a96db-d440-4095-bfed-bc35b90447eb/pull/0.log" Mar 12 17:23:43 crc kubenswrapper[5184]: I0312 17:23:43.171836 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d1f2cf7c6f4ecb15e69cc0ed07a53b6f169b7a9d46d563b1a9827dff837qpzv_929a96db-d440-4095-bfed-bc35b90447eb/extract/0.log" Mar 12 17:23:43 crc kubenswrapper[5184]: I0312 17:23:43.322797 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-gfb2h_9207421a-3f61-4e7a-be60-40549f1f6c99/extract-utilities/0.log" Mar 12 17:23:43 crc kubenswrapper[5184]: I0312 17:23:43.335402 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-547dbd544d-hpf6l_5ecb4f29-01ec-4c15-8455-30a8b8623f6d/marketplace-operator/0.log" Mar 12 17:23:43 crc kubenswrapper[5184]: 
I0312 17:23:43.509707 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-gfb2h_9207421a-3f61-4e7a-be60-40549f1f6c99/extract-utilities/0.log" Mar 12 17:23:43 crc kubenswrapper[5184]: I0312 17:23:43.532077 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-gfb2h_9207421a-3f61-4e7a-be60-40549f1f6c99/extract-content/0.log" Mar 12 17:23:43 crc kubenswrapper[5184]: I0312 17:23:43.554907 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-gfb2h_9207421a-3f61-4e7a-be60-40549f1f6c99/extract-content/0.log" Mar 12 17:23:43 crc kubenswrapper[5184]: I0312 17:23:43.739625 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-gfb2h_9207421a-3f61-4e7a-be60-40549f1f6c99/extract-utilities/0.log" Mar 12 17:23:43 crc kubenswrapper[5184]: I0312 17:23:43.764899 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-gfb2h_9207421a-3f61-4e7a-be60-40549f1f6c99/extract-content/0.log" Mar 12 17:23:43 crc kubenswrapper[5184]: I0312 17:23:43.839448 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-gfb2h_9207421a-3f61-4e7a-be60-40549f1f6c99/registry-server/0.log" Mar 12 17:23:43 crc kubenswrapper[5184]: I0312 17:23:43.941822 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-m6h45_43fdd0fb-f097-41a9-9160-0be2f8defa9e/extract-utilities/0.log" Mar 12 17:23:44 crc kubenswrapper[5184]: I0312 17:23:44.107085 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-m6h45_43fdd0fb-f097-41a9-9160-0be2f8defa9e/extract-content/0.log" Mar 12 17:23:44 crc kubenswrapper[5184]: I0312 17:23:44.126746 5184 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-m6h45_43fdd0fb-f097-41a9-9160-0be2f8defa9e/extract-utilities/0.log" Mar 12 17:23:44 crc kubenswrapper[5184]: I0312 17:23:44.139827 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-m6h45_43fdd0fb-f097-41a9-9160-0be2f8defa9e/extract-content/0.log" Mar 12 17:23:44 crc kubenswrapper[5184]: I0312 17:23:44.268973 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-m6h45_43fdd0fb-f097-41a9-9160-0be2f8defa9e/extract-utilities/0.log" Mar 12 17:23:44 crc kubenswrapper[5184]: I0312 17:23:44.318568 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-m6h45_43fdd0fb-f097-41a9-9160-0be2f8defa9e/extract-content/0.log" Mar 12 17:23:44 crc kubenswrapper[5184]: I0312 17:23:44.485764 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-m6h45_43fdd0fb-f097-41a9-9160-0be2f8defa9e/registry-server/0.log" Mar 12 17:23:49 crc kubenswrapper[5184]: E0312 17:23:49.109192 5184 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1793840 actualBytes=10240 Mar 12 17:23:53 crc kubenswrapper[5184]: I0312 17:23:53.401267 5184 scope.go:117] "RemoveContainer" containerID="003716e1434d36a7e89bad17d4bfd64463f69f9907a5c9319c56e5b94d17d924" Mar 12 17:23:53 crc kubenswrapper[5184]: E0312 17:23:53.402299 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp7pt_openshift-machine-config-operator(7b45c859-3d05-4214-9bd3-2952546f5dea)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" Mar 12 17:24:00 crc kubenswrapper[5184]: I0312 17:24:00.148617 5184 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555604-wgwvj"] Mar 12 17:24:00 crc kubenswrapper[5184]: I0312 17:24:00.150466 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d0a50cab-4eef-4f28-a0b3-4df157d40732" containerName="oc" Mar 12 17:24:00 crc kubenswrapper[5184]: I0312 17:24:00.150492 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a50cab-4eef-4f28-a0b3-4df157d40732" containerName="oc" Mar 12 17:24:00 crc kubenswrapper[5184]: I0312 17:24:00.150811 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="d0a50cab-4eef-4f28-a0b3-4df157d40732" containerName="oc" Mar 12 17:24:00 crc kubenswrapper[5184]: I0312 17:24:00.159186 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555604-wgwvj"] Mar 12 17:24:00 crc kubenswrapper[5184]: I0312 17:24:00.159320 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555604-wgwvj" Mar 12 17:24:00 crc kubenswrapper[5184]: I0312 17:24:00.165258 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\"" Mar 12 17:24:00 crc kubenswrapper[5184]: I0312 17:24:00.165493 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-f4gpz\"" Mar 12 17:24:00 crc kubenswrapper[5184]: I0312 17:24:00.165622 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\"" Mar 12 17:24:00 crc kubenswrapper[5184]: I0312 17:24:00.321533 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv8c7\" (UniqueName: \"kubernetes.io/projected/7fd3f0a0-1205-482a-ba6f-c1e753d612f1-kube-api-access-rv8c7\") pod \"auto-csr-approver-29555604-wgwvj\" (UID: 
\"7fd3f0a0-1205-482a-ba6f-c1e753d612f1\") " pod="openshift-infra/auto-csr-approver-29555604-wgwvj" Mar 12 17:24:00 crc kubenswrapper[5184]: I0312 17:24:00.423953 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rv8c7\" (UniqueName: \"kubernetes.io/projected/7fd3f0a0-1205-482a-ba6f-c1e753d612f1-kube-api-access-rv8c7\") pod \"auto-csr-approver-29555604-wgwvj\" (UID: \"7fd3f0a0-1205-482a-ba6f-c1e753d612f1\") " pod="openshift-infra/auto-csr-approver-29555604-wgwvj" Mar 12 17:24:00 crc kubenswrapper[5184]: I0312 17:24:00.449203 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv8c7\" (UniqueName: \"kubernetes.io/projected/7fd3f0a0-1205-482a-ba6f-c1e753d612f1-kube-api-access-rv8c7\") pod \"auto-csr-approver-29555604-wgwvj\" (UID: \"7fd3f0a0-1205-482a-ba6f-c1e753d612f1\") " pod="openshift-infra/auto-csr-approver-29555604-wgwvj" Mar 12 17:24:00 crc kubenswrapper[5184]: I0312 17:24:00.509049 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555604-wgwvj" Mar 12 17:24:00 crc kubenswrapper[5184]: I0312 17:24:00.980661 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555604-wgwvj"] Mar 12 17:24:00 crc kubenswrapper[5184]: I0312 17:24:00.989181 5184 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 17:24:01 crc kubenswrapper[5184]: I0312 17:24:01.853521 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555604-wgwvj" event={"ID":"7fd3f0a0-1205-482a-ba6f-c1e753d612f1","Type":"ContainerStarted","Data":"ce8986c33ac6377f292ff7aae97ea331eec9d8169ba17fc8b0a019e5220fa9ee"} Mar 12 17:24:03 crc kubenswrapper[5184]: I0312 17:24:03.873318 5184 generic.go:358] "Generic (PLEG): container finished" podID="7fd3f0a0-1205-482a-ba6f-c1e753d612f1" containerID="77c1f01df87e2b058a5255c9c74f1deb514ebd184349aabdbde70de1c811d83b" exitCode=0 Mar 12 17:24:03 crc kubenswrapper[5184]: I0312 17:24:03.873522 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555604-wgwvj" event={"ID":"7fd3f0a0-1205-482a-ba6f-c1e753d612f1","Type":"ContainerDied","Data":"77c1f01df87e2b058a5255c9c74f1deb514ebd184349aabdbde70de1c811d83b"} Mar 12 17:24:05 crc kubenswrapper[5184]: I0312 17:24:05.253976 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555604-wgwvj" Mar 12 17:24:05 crc kubenswrapper[5184]: I0312 17:24:05.307463 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rv8c7\" (UniqueName: \"kubernetes.io/projected/7fd3f0a0-1205-482a-ba6f-c1e753d612f1-kube-api-access-rv8c7\") pod \"7fd3f0a0-1205-482a-ba6f-c1e753d612f1\" (UID: \"7fd3f0a0-1205-482a-ba6f-c1e753d612f1\") " Mar 12 17:24:05 crc kubenswrapper[5184]: I0312 17:24:05.316575 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fd3f0a0-1205-482a-ba6f-c1e753d612f1-kube-api-access-rv8c7" (OuterVolumeSpecName: "kube-api-access-rv8c7") pod "7fd3f0a0-1205-482a-ba6f-c1e753d612f1" (UID: "7fd3f0a0-1205-482a-ba6f-c1e753d612f1"). InnerVolumeSpecName "kube-api-access-rv8c7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:24:05 crc kubenswrapper[5184]: I0312 17:24:05.399394 5184 scope.go:117] "RemoveContainer" containerID="003716e1434d36a7e89bad17d4bfd64463f69f9907a5c9319c56e5b94d17d924" Mar 12 17:24:05 crc kubenswrapper[5184]: E0312 17:24:05.399796 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp7pt_openshift-machine-config-operator(7b45c859-3d05-4214-9bd3-2952546f5dea)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" Mar 12 17:24:05 crc kubenswrapper[5184]: I0312 17:24:05.409578 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rv8c7\" (UniqueName: \"kubernetes.io/projected/7fd3f0a0-1205-482a-ba6f-c1e753d612f1-kube-api-access-rv8c7\") on node \"crc\" DevicePath \"\"" Mar 12 17:24:05 crc kubenswrapper[5184]: I0312 17:24:05.889172 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555604-wgwvj" Mar 12 17:24:05 crc kubenswrapper[5184]: I0312 17:24:05.889207 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555604-wgwvj" event={"ID":"7fd3f0a0-1205-482a-ba6f-c1e753d612f1","Type":"ContainerDied","Data":"ce8986c33ac6377f292ff7aae97ea331eec9d8169ba17fc8b0a019e5220fa9ee"} Mar 12 17:24:05 crc kubenswrapper[5184]: I0312 17:24:05.889266 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce8986c33ac6377f292ff7aae97ea331eec9d8169ba17fc8b0a019e5220fa9ee" Mar 12 17:24:06 crc kubenswrapper[5184]: I0312 17:24:06.325882 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555598-xczd8"] Mar 12 17:24:06 crc kubenswrapper[5184]: I0312 17:24:06.342015 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555598-xczd8"] Mar 12 17:24:06 crc kubenswrapper[5184]: I0312 17:24:06.418840 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1a51aa6-a692-479e-a4d1-e8960e7e4e6f" path="/var/lib/kubelet/pods/c1a51aa6-a692-479e-a4d1-e8960e7e4e6f/volumes" Mar 12 17:24:20 crc kubenswrapper[5184]: I0312 17:24:20.399671 5184 scope.go:117] "RemoveContainer" containerID="003716e1434d36a7e89bad17d4bfd64463f69f9907a5c9319c56e5b94d17d924" Mar 12 17:24:20 crc kubenswrapper[5184]: E0312 17:24:20.400802 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp7pt_openshift-machine-config-operator(7b45c859-3d05-4214-9bd3-2952546f5dea)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" Mar 12 17:24:21 crc kubenswrapper[5184]: I0312 17:24:21.024907 5184 scope.go:117] "RemoveContainer" 
containerID="e8453f3e0c2fb9cd3fbc92886ac68cbe81f26b8b981a10c8204a8dbb8e26308c" Mar 12 17:24:34 crc kubenswrapper[5184]: I0312 17:24:34.400495 5184 scope.go:117] "RemoveContainer" containerID="003716e1434d36a7e89bad17d4bfd64463f69f9907a5c9319c56e5b94d17d924" Mar 12 17:24:34 crc kubenswrapper[5184]: E0312 17:24:34.401890 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp7pt_openshift-machine-config-operator(7b45c859-3d05-4214-9bd3-2952546f5dea)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" Mar 12 17:24:48 crc kubenswrapper[5184]: I0312 17:24:48.412946 5184 scope.go:117] "RemoveContainer" containerID="003716e1434d36a7e89bad17d4bfd64463f69f9907a5c9319c56e5b94d17d924" Mar 12 17:24:48 crc kubenswrapper[5184]: E0312 17:24:48.413787 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp7pt_openshift-machine-config-operator(7b45c859-3d05-4214-9bd3-2952546f5dea)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" Mar 12 17:24:50 crc kubenswrapper[5184]: E0312 17:24:50.774885 5184 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1792493 actualBytes=10240 Mar 12 17:25:00 crc kubenswrapper[5184]: I0312 17:25:00.399734 5184 scope.go:117] "RemoveContainer" containerID="003716e1434d36a7e89bad17d4bfd64463f69f9907a5c9319c56e5b94d17d924" Mar 12 17:25:00 crc kubenswrapper[5184]: E0312 17:25:00.400624 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp7pt_openshift-machine-config-operator(7b45c859-3d05-4214-9bd3-2952546f5dea)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" Mar 12 17:25:12 crc kubenswrapper[5184]: I0312 17:25:12.400446 5184 scope.go:117] "RemoveContainer" containerID="003716e1434d36a7e89bad17d4bfd64463f69f9907a5c9319c56e5b94d17d924" Mar 12 17:25:12 crc kubenswrapper[5184]: E0312 17:25:12.401618 5184 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cp7pt_openshift-machine-config-operator(7b45c859-3d05-4214-9bd3-2952546f5dea)\"" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" Mar 12 17:25:13 crc kubenswrapper[5184]: I0312 17:25:13.747148 5184 generic.go:358] "Generic (PLEG): container finished" podID="2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5" containerID="e1f3f8d06e8ce1cb196e7bb209c42ca8445b6c107295754b1eedddb094ce9288" exitCode=0 Mar 12 17:25:13 crc kubenswrapper[5184]: I0312 17:25:13.747263 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t9x9b/must-gather-brf99" event={"ID":"2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5","Type":"ContainerDied","Data":"e1f3f8d06e8ce1cb196e7bb209c42ca8445b6c107295754b1eedddb094ce9288"} Mar 12 17:25:13 crc kubenswrapper[5184]: I0312 17:25:13.748816 5184 scope.go:117] "RemoveContainer" containerID="e1f3f8d06e8ce1cb196e7bb209c42ca8445b6c107295754b1eedddb094ce9288" Mar 12 17:25:13 crc kubenswrapper[5184]: I0312 17:25:13.918706 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-t9x9b_must-gather-brf99_2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5/gather/0.log" Mar 12 17:25:20 crc kubenswrapper[5184]: I0312 17:25:20.804520 
5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-t9x9b/must-gather-brf99"] Mar 12 17:25:20 crc kubenswrapper[5184]: I0312 17:25:20.805559 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-must-gather-t9x9b/must-gather-brf99" podUID="2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5" containerName="copy" containerID="cri-o://766a76363474788e69eb48b04a9a3c9ad0e62300ec9fbb8de4365c8633c7b31f" gracePeriod=2 Mar 12 17:25:20 crc kubenswrapper[5184]: I0312 17:25:20.807513 5184 status_manager.go:895] "Failed to get status for pod" podUID="2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5" pod="openshift-must-gather-t9x9b/must-gather-brf99" err="pods \"must-gather-brf99\" is forbidden: User \"system:node:crc\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-t9x9b\": no relationship found between node 'crc' and this object" Mar 12 17:25:20 crc kubenswrapper[5184]: I0312 17:25:20.814979 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-t9x9b/must-gather-brf99"] Mar 12 17:25:21 crc kubenswrapper[5184]: I0312 17:25:21.260558 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-t9x9b_must-gather-brf99_2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5/copy/0.log" Mar 12 17:25:21 crc kubenswrapper[5184]: I0312 17:25:21.261188 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t9x9b/must-gather-brf99" Mar 12 17:25:21 crc kubenswrapper[5184]: I0312 17:25:21.262833 5184 status_manager.go:895] "Failed to get status for pod" podUID="2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5" pod="openshift-must-gather-t9x9b/must-gather-brf99" err="pods \"must-gather-brf99\" is forbidden: User \"system:node:crc\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-t9x9b\": no relationship found between node 'crc' and this object" Mar 12 17:25:21 crc kubenswrapper[5184]: I0312 17:25:21.352770 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5-must-gather-output\") pod \"2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5\" (UID: \"2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5\") " Mar 12 17:25:21 crc kubenswrapper[5184]: I0312 17:25:21.353119 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfz4h\" (UniqueName: \"kubernetes.io/projected/2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5-kube-api-access-xfz4h\") pod \"2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5\" (UID: \"2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5\") " Mar 12 17:25:21 crc kubenswrapper[5184]: I0312 17:25:21.364951 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5-kube-api-access-xfz4h" (OuterVolumeSpecName: "kube-api-access-xfz4h") pod "2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5" (UID: "2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5"). InnerVolumeSpecName "kube-api-access-xfz4h". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:25:21 crc kubenswrapper[5184]: I0312 17:25:21.454971 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xfz4h\" (UniqueName: \"kubernetes.io/projected/2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5-kube-api-access-xfz4h\") on node \"crc\" DevicePath \"\"" Mar 12 17:25:21 crc kubenswrapper[5184]: I0312 17:25:21.518959 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5" (UID: "2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:25:21 crc kubenswrapper[5184]: I0312 17:25:21.563774 5184 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 12 17:25:21 crc kubenswrapper[5184]: I0312 17:25:21.837392 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-t9x9b_must-gather-brf99_2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5/copy/0.log" Mar 12 17:25:21 crc kubenswrapper[5184]: I0312 17:25:21.838304 5184 generic.go:358] "Generic (PLEG): container finished" podID="2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5" containerID="766a76363474788e69eb48b04a9a3c9ad0e62300ec9fbb8de4365c8633c7b31f" exitCode=143 Mar 12 17:25:21 crc kubenswrapper[5184]: I0312 17:25:21.838595 5184 scope.go:117] "RemoveContainer" containerID="766a76363474788e69eb48b04a9a3c9ad0e62300ec9fbb8de4365c8633c7b31f" Mar 12 17:25:21 crc kubenswrapper[5184]: I0312 17:25:21.838740 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t9x9b/must-gather-brf99" Mar 12 17:25:21 crc kubenswrapper[5184]: I0312 17:25:21.846195 5184 status_manager.go:895] "Failed to get status for pod" podUID="2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5" pod="openshift-must-gather-t9x9b/must-gather-brf99" err="pods \"must-gather-brf99\" is forbidden: User \"system:node:crc\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-t9x9b\": no relationship found between node 'crc' and this object" Mar 12 17:25:21 crc kubenswrapper[5184]: I0312 17:25:21.866776 5184 status_manager.go:895] "Failed to get status for pod" podUID="2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5" pod="openshift-must-gather-t9x9b/must-gather-brf99" err="pods \"must-gather-brf99\" is forbidden: User \"system:node:crc\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-t9x9b\": no relationship found between node 'crc' and this object" Mar 12 17:25:21 crc kubenswrapper[5184]: I0312 17:25:21.877978 5184 scope.go:117] "RemoveContainer" containerID="e1f3f8d06e8ce1cb196e7bb209c42ca8445b6c107295754b1eedddb094ce9288" Mar 12 17:25:22 crc kubenswrapper[5184]: I0312 17:25:22.003081 5184 scope.go:117] "RemoveContainer" containerID="766a76363474788e69eb48b04a9a3c9ad0e62300ec9fbb8de4365c8633c7b31f" Mar 12 17:25:22 crc kubenswrapper[5184]: E0312 17:25:22.003832 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"766a76363474788e69eb48b04a9a3c9ad0e62300ec9fbb8de4365c8633c7b31f\": container with ID starting with 766a76363474788e69eb48b04a9a3c9ad0e62300ec9fbb8de4365c8633c7b31f not found: ID does not exist" containerID="766a76363474788e69eb48b04a9a3c9ad0e62300ec9fbb8de4365c8633c7b31f" Mar 12 17:25:22 crc kubenswrapper[5184]: I0312 17:25:22.003871 5184 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"766a76363474788e69eb48b04a9a3c9ad0e62300ec9fbb8de4365c8633c7b31f"} err="failed to get container status \"766a76363474788e69eb48b04a9a3c9ad0e62300ec9fbb8de4365c8633c7b31f\": rpc error: code = NotFound desc = could not find container \"766a76363474788e69eb48b04a9a3c9ad0e62300ec9fbb8de4365c8633c7b31f\": container with ID starting with 766a76363474788e69eb48b04a9a3c9ad0e62300ec9fbb8de4365c8633c7b31f not found: ID does not exist" Mar 12 17:25:22 crc kubenswrapper[5184]: I0312 17:25:22.003890 5184 scope.go:117] "RemoveContainer" containerID="e1f3f8d06e8ce1cb196e7bb209c42ca8445b6c107295754b1eedddb094ce9288" Mar 12 17:25:22 crc kubenswrapper[5184]: E0312 17:25:22.004615 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1f3f8d06e8ce1cb196e7bb209c42ca8445b6c107295754b1eedddb094ce9288\": container with ID starting with e1f3f8d06e8ce1cb196e7bb209c42ca8445b6c107295754b1eedddb094ce9288 not found: ID does not exist" containerID="e1f3f8d06e8ce1cb196e7bb209c42ca8445b6c107295754b1eedddb094ce9288" Mar 12 17:25:22 crc kubenswrapper[5184]: I0312 17:25:22.004658 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1f3f8d06e8ce1cb196e7bb209c42ca8445b6c107295754b1eedddb094ce9288"} err="failed to get container status \"e1f3f8d06e8ce1cb196e7bb209c42ca8445b6c107295754b1eedddb094ce9288\": rpc error: code = NotFound desc = could not find container \"e1f3f8d06e8ce1cb196e7bb209c42ca8445b6c107295754b1eedddb094ce9288\": container with ID starting with e1f3f8d06e8ce1cb196e7bb209c42ca8445b6c107295754b1eedddb094ce9288 not found: ID does not exist" Mar 12 17:25:22 crc kubenswrapper[5184]: I0312 17:25:22.411772 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5" path="/var/lib/kubelet/pods/2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5/volumes" Mar 12 17:25:26 crc kubenswrapper[5184]: I0312 
17:25:26.413426 5184 scope.go:117] "RemoveContainer" containerID="003716e1434d36a7e89bad17d4bfd64463f69f9907a5c9319c56e5b94d17d924" Mar 12 17:25:27 crc kubenswrapper[5184]: I0312 17:25:27.056143 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" event={"ID":"7b45c859-3d05-4214-9bd3-2952546f5dea","Type":"ContainerStarted","Data":"cec56539410b4ac15a425e741142d090d0bd99c0ef83dac5cd114a2334674d33"} Mar 12 17:25:49 crc kubenswrapper[5184]: E0312 17:25:49.167523 5184 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1784489 actualBytes=10240 Mar 12 17:25:59 crc kubenswrapper[5184]: I0312 17:25:59.471146 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-99gtj_542903c2-fc88-4085-979a-db3766958392/kube-multus/0.log" Mar 12 17:25:59 crc kubenswrapper[5184]: I0312 17:25:59.471277 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-99gtj_542903c2-fc88-4085-979a-db3766958392/kube-multus/0.log" Mar 12 17:25:59 crc kubenswrapper[5184]: I0312 17:25:59.480014 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log" Mar 12 17:25:59 crc kubenswrapper[5184]: I0312 17:25:59.481292 5184 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log" Mar 12 17:26:00 crc kubenswrapper[5184]: I0312 17:26:00.148159 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555606-zf7rb"] Mar 12 17:26:00 crc kubenswrapper[5184]: I0312 17:26:00.149620 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5" containerName="gather" Mar 12 17:26:00 crc 
kubenswrapper[5184]: I0312 17:26:00.149642 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5" containerName="gather" Mar 12 17:26:00 crc kubenswrapper[5184]: I0312 17:26:00.149690 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7fd3f0a0-1205-482a-ba6f-c1e753d612f1" containerName="oc" Mar 12 17:26:00 crc kubenswrapper[5184]: I0312 17:26:00.149699 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fd3f0a0-1205-482a-ba6f-c1e753d612f1" containerName="oc" Mar 12 17:26:00 crc kubenswrapper[5184]: I0312 17:26:00.149758 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5" containerName="copy" Mar 12 17:26:00 crc kubenswrapper[5184]: I0312 17:26:00.149767 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5" containerName="copy" Mar 12 17:26:00 crc kubenswrapper[5184]: I0312 17:26:00.149985 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="7fd3f0a0-1205-482a-ba6f-c1e753d612f1" containerName="oc" Mar 12 17:26:00 crc kubenswrapper[5184]: I0312 17:26:00.150018 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5" containerName="copy" Mar 12 17:26:00 crc kubenswrapper[5184]: I0312 17:26:00.150031 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="2b8dacc7-9e4a-4fd4-98f7-72a7c5ad78d5" containerName="gather" Mar 12 17:26:00 crc kubenswrapper[5184]: I0312 17:26:00.156986 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555606-zf7rb" Mar 12 17:26:00 crc kubenswrapper[5184]: I0312 17:26:00.160322 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\"" Mar 12 17:26:00 crc kubenswrapper[5184]: I0312 17:26:00.160405 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\"" Mar 12 17:26:00 crc kubenswrapper[5184]: I0312 17:26:00.162224 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-f4gpz\"" Mar 12 17:26:00 crc kubenswrapper[5184]: I0312 17:26:00.163573 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555606-zf7rb"] Mar 12 17:26:00 crc kubenswrapper[5184]: I0312 17:26:00.254979 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ml9m\" (UniqueName: \"kubernetes.io/projected/9405fdb4-d9b8-4e4b-a2e5-1c2360daec9e-kube-api-access-8ml9m\") pod \"auto-csr-approver-29555606-zf7rb\" (UID: \"9405fdb4-d9b8-4e4b-a2e5-1c2360daec9e\") " pod="openshift-infra/auto-csr-approver-29555606-zf7rb" Mar 12 17:26:00 crc kubenswrapper[5184]: I0312 17:26:00.356594 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8ml9m\" (UniqueName: \"kubernetes.io/projected/9405fdb4-d9b8-4e4b-a2e5-1c2360daec9e-kube-api-access-8ml9m\") pod \"auto-csr-approver-29555606-zf7rb\" (UID: \"9405fdb4-d9b8-4e4b-a2e5-1c2360daec9e\") " pod="openshift-infra/auto-csr-approver-29555606-zf7rb" Mar 12 17:26:00 crc kubenswrapper[5184]: I0312 17:26:00.373658 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ml9m\" (UniqueName: \"kubernetes.io/projected/9405fdb4-d9b8-4e4b-a2e5-1c2360daec9e-kube-api-access-8ml9m\") pod \"auto-csr-approver-29555606-zf7rb\" (UID: 
\"9405fdb4-d9b8-4e4b-a2e5-1c2360daec9e\") " pod="openshift-infra/auto-csr-approver-29555606-zf7rb" Mar 12 17:26:00 crc kubenswrapper[5184]: I0312 17:26:00.490873 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555606-zf7rb" Mar 12 17:26:01 crc kubenswrapper[5184]: I0312 17:26:01.062899 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555606-zf7rb"] Mar 12 17:26:01 crc kubenswrapper[5184]: I0312 17:26:01.397859 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555606-zf7rb" event={"ID":"9405fdb4-d9b8-4e4b-a2e5-1c2360daec9e","Type":"ContainerStarted","Data":"912185fc75fbf9e76083a275f7d73ceadaba62ca52a0ce4d7a83f244582977fa"} Mar 12 17:26:03 crc kubenswrapper[5184]: I0312 17:26:03.434106 5184 generic.go:358] "Generic (PLEG): container finished" podID="9405fdb4-d9b8-4e4b-a2e5-1c2360daec9e" containerID="8bbf3f1c0e97deb23642316e101bd4706a1497a9192fe6b405e8bcafbad3a2fc" exitCode=0 Mar 12 17:26:03 crc kubenswrapper[5184]: I0312 17:26:03.434987 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555606-zf7rb" event={"ID":"9405fdb4-d9b8-4e4b-a2e5-1c2360daec9e","Type":"ContainerDied","Data":"8bbf3f1c0e97deb23642316e101bd4706a1497a9192fe6b405e8bcafbad3a2fc"} Mar 12 17:26:04 crc kubenswrapper[5184]: I0312 17:26:04.890946 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555606-zf7rb" Mar 12 17:26:04 crc kubenswrapper[5184]: I0312 17:26:04.971971 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ml9m\" (UniqueName: \"kubernetes.io/projected/9405fdb4-d9b8-4e4b-a2e5-1c2360daec9e-kube-api-access-8ml9m\") pod \"9405fdb4-d9b8-4e4b-a2e5-1c2360daec9e\" (UID: \"9405fdb4-d9b8-4e4b-a2e5-1c2360daec9e\") " Mar 12 17:26:04 crc kubenswrapper[5184]: I0312 17:26:04.982625 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9405fdb4-d9b8-4e4b-a2e5-1c2360daec9e-kube-api-access-8ml9m" (OuterVolumeSpecName: "kube-api-access-8ml9m") pod "9405fdb4-d9b8-4e4b-a2e5-1c2360daec9e" (UID: "9405fdb4-d9b8-4e4b-a2e5-1c2360daec9e"). InnerVolumeSpecName "kube-api-access-8ml9m". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:26:05 crc kubenswrapper[5184]: I0312 17:26:05.073636 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8ml9m\" (UniqueName: \"kubernetes.io/projected/9405fdb4-d9b8-4e4b-a2e5-1c2360daec9e-kube-api-access-8ml9m\") on node \"crc\" DevicePath \"\"" Mar 12 17:26:05 crc kubenswrapper[5184]: I0312 17:26:05.456790 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555606-zf7rb" event={"ID":"9405fdb4-d9b8-4e4b-a2e5-1c2360daec9e","Type":"ContainerDied","Data":"912185fc75fbf9e76083a275f7d73ceadaba62ca52a0ce4d7a83f244582977fa"} Mar 12 17:26:05 crc kubenswrapper[5184]: I0312 17:26:05.456845 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="912185fc75fbf9e76083a275f7d73ceadaba62ca52a0ce4d7a83f244582977fa" Mar 12 17:26:05 crc kubenswrapper[5184]: I0312 17:26:05.456917 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555606-zf7rb" Mar 12 17:26:05 crc kubenswrapper[5184]: I0312 17:26:05.976018 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555600-tct96"] Mar 12 17:26:05 crc kubenswrapper[5184]: I0312 17:26:05.990699 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555600-tct96"] Mar 12 17:26:06 crc kubenswrapper[5184]: I0312 17:26:06.412331 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="811de9fa-562c-4bd2-83bd-70d15937198b" path="/var/lib/kubelet/pods/811de9fa-562c-4bd2-83bd-70d15937198b/volumes" Mar 12 17:26:08 crc kubenswrapper[5184]: I0312 17:26:08.210457 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5w6zj"] Mar 12 17:26:08 crc kubenswrapper[5184]: I0312 17:26:08.212014 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9405fdb4-d9b8-4e4b-a2e5-1c2360daec9e" containerName="oc" Mar 12 17:26:08 crc kubenswrapper[5184]: I0312 17:26:08.212030 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="9405fdb4-d9b8-4e4b-a2e5-1c2360daec9e" containerName="oc" Mar 12 17:26:08 crc kubenswrapper[5184]: I0312 17:26:08.212240 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="9405fdb4-d9b8-4e4b-a2e5-1c2360daec9e" containerName="oc" Mar 12 17:26:08 crc kubenswrapper[5184]: I0312 17:26:08.223073 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5w6zj" Mar 12 17:26:08 crc kubenswrapper[5184]: I0312 17:26:08.227197 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5w6zj"] Mar 12 17:26:08 crc kubenswrapper[5184]: I0312 17:26:08.353164 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba8d63ef-a229-4fb9-b254-3e0e994cfa01-catalog-content\") pod \"redhat-operators-5w6zj\" (UID: \"ba8d63ef-a229-4fb9-b254-3e0e994cfa01\") " pod="openshift-marketplace/redhat-operators-5w6zj" Mar 12 17:26:08 crc kubenswrapper[5184]: I0312 17:26:08.353499 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhvkm\" (UniqueName: \"kubernetes.io/projected/ba8d63ef-a229-4fb9-b254-3e0e994cfa01-kube-api-access-jhvkm\") pod \"redhat-operators-5w6zj\" (UID: \"ba8d63ef-a229-4fb9-b254-3e0e994cfa01\") " pod="openshift-marketplace/redhat-operators-5w6zj" Mar 12 17:26:08 crc kubenswrapper[5184]: I0312 17:26:08.353636 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba8d63ef-a229-4fb9-b254-3e0e994cfa01-utilities\") pod \"redhat-operators-5w6zj\" (UID: \"ba8d63ef-a229-4fb9-b254-3e0e994cfa01\") " pod="openshift-marketplace/redhat-operators-5w6zj" Mar 12 17:26:08 crc kubenswrapper[5184]: I0312 17:26:08.455369 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhvkm\" (UniqueName: \"kubernetes.io/projected/ba8d63ef-a229-4fb9-b254-3e0e994cfa01-kube-api-access-jhvkm\") pod \"redhat-operators-5w6zj\" (UID: \"ba8d63ef-a229-4fb9-b254-3e0e994cfa01\") " pod="openshift-marketplace/redhat-operators-5w6zj" Mar 12 17:26:08 crc kubenswrapper[5184]: I0312 17:26:08.455678 5184 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba8d63ef-a229-4fb9-b254-3e0e994cfa01-utilities\") pod \"redhat-operators-5w6zj\" (UID: \"ba8d63ef-a229-4fb9-b254-3e0e994cfa01\") " pod="openshift-marketplace/redhat-operators-5w6zj" Mar 12 17:26:08 crc kubenswrapper[5184]: I0312 17:26:08.455976 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba8d63ef-a229-4fb9-b254-3e0e994cfa01-catalog-content\") pod \"redhat-operators-5w6zj\" (UID: \"ba8d63ef-a229-4fb9-b254-3e0e994cfa01\") " pod="openshift-marketplace/redhat-operators-5w6zj" Mar 12 17:26:08 crc kubenswrapper[5184]: I0312 17:26:08.456309 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba8d63ef-a229-4fb9-b254-3e0e994cfa01-utilities\") pod \"redhat-operators-5w6zj\" (UID: \"ba8d63ef-a229-4fb9-b254-3e0e994cfa01\") " pod="openshift-marketplace/redhat-operators-5w6zj" Mar 12 17:26:08 crc kubenswrapper[5184]: I0312 17:26:08.456449 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba8d63ef-a229-4fb9-b254-3e0e994cfa01-catalog-content\") pod \"redhat-operators-5w6zj\" (UID: \"ba8d63ef-a229-4fb9-b254-3e0e994cfa01\") " pod="openshift-marketplace/redhat-operators-5w6zj" Mar 12 17:26:08 crc kubenswrapper[5184]: I0312 17:26:08.480402 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhvkm\" (UniqueName: \"kubernetes.io/projected/ba8d63ef-a229-4fb9-b254-3e0e994cfa01-kube-api-access-jhvkm\") pod \"redhat-operators-5w6zj\" (UID: \"ba8d63ef-a229-4fb9-b254-3e0e994cfa01\") " pod="openshift-marketplace/redhat-operators-5w6zj" Mar 12 17:26:08 crc kubenswrapper[5184]: I0312 17:26:08.564576 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5w6zj" Mar 12 17:26:09 crc kubenswrapper[5184]: I0312 17:26:09.009983 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5w6zj"] Mar 12 17:26:09 crc kubenswrapper[5184]: I0312 17:26:09.493936 5184 generic.go:358] "Generic (PLEG): container finished" podID="ba8d63ef-a229-4fb9-b254-3e0e994cfa01" containerID="3c81153509bae5a3f26ef40d3dec56d54632a7b98955a94e822cfd7efcef2805" exitCode=0 Mar 12 17:26:09 crc kubenswrapper[5184]: I0312 17:26:09.494013 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5w6zj" event={"ID":"ba8d63ef-a229-4fb9-b254-3e0e994cfa01","Type":"ContainerDied","Data":"3c81153509bae5a3f26ef40d3dec56d54632a7b98955a94e822cfd7efcef2805"} Mar 12 17:26:09 crc kubenswrapper[5184]: I0312 17:26:09.494350 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5w6zj" event={"ID":"ba8d63ef-a229-4fb9-b254-3e0e994cfa01","Type":"ContainerStarted","Data":"69ba9238733e3ec8870f7eb5745771e3067d13b18062993002601aaf77eb9199"} Mar 12 17:26:12 crc kubenswrapper[5184]: I0312 17:26:12.525111 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5w6zj" event={"ID":"ba8d63ef-a229-4fb9-b254-3e0e994cfa01","Type":"ContainerStarted","Data":"f12c356fd94f4d073e7db1796a8f016e1f54d4909896e1c90f34621d00e9c9cb"} Mar 12 17:26:15 crc kubenswrapper[5184]: I0312 17:26:15.553298 5184 generic.go:358] "Generic (PLEG): container finished" podID="ba8d63ef-a229-4fb9-b254-3e0e994cfa01" containerID="f12c356fd94f4d073e7db1796a8f016e1f54d4909896e1c90f34621d00e9c9cb" exitCode=0 Mar 12 17:26:15 crc kubenswrapper[5184]: I0312 17:26:15.553417 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5w6zj" 
event={"ID":"ba8d63ef-a229-4fb9-b254-3e0e994cfa01","Type":"ContainerDied","Data":"f12c356fd94f4d073e7db1796a8f016e1f54d4909896e1c90f34621d00e9c9cb"} Mar 12 17:26:16 crc kubenswrapper[5184]: I0312 17:26:16.569016 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5w6zj" event={"ID":"ba8d63ef-a229-4fb9-b254-3e0e994cfa01","Type":"ContainerStarted","Data":"dc1f7caa659c332d45e1dde983976f02856630693cddc811dda1ae7755619dd3"} Mar 12 17:26:16 crc kubenswrapper[5184]: I0312 17:26:16.598197 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5w6zj" podStartSLOduration=6.76054232 podStartE2EDuration="8.598165859s" podCreationTimestamp="2026-03-12 17:26:08 +0000 UTC" firstStartedPulling="2026-03-12 17:26:09.494836976 +0000 UTC m=+2112.036148315" lastFinishedPulling="2026-03-12 17:26:11.332460505 +0000 UTC m=+2113.873771854" observedRunningTime="2026-03-12 17:26:16.589391477 +0000 UTC m=+2119.130702826" watchObservedRunningTime="2026-03-12 17:26:16.598165859 +0000 UTC m=+2119.139477218" Mar 12 17:26:18 crc kubenswrapper[5184]: I0312 17:26:18.565078 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5w6zj" Mar 12 17:26:18 crc kubenswrapper[5184]: I0312 17:26:18.565487 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-operators-5w6zj" Mar 12 17:26:19 crc kubenswrapper[5184]: I0312 17:26:19.631328 5184 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5w6zj" podUID="ba8d63ef-a229-4fb9-b254-3e0e994cfa01" containerName="registry-server" probeResult="failure" output=< Mar 12 17:26:19 crc kubenswrapper[5184]: timeout: failed to connect service ":50051" within 1s Mar 12 17:26:19 crc kubenswrapper[5184]: > Mar 12 17:26:21 crc kubenswrapper[5184]: I0312 17:26:21.194671 5184 scope.go:117] "RemoveContainer" 
containerID="d623a21c615df6f0df347c0edea54904aa5387f635eb051e05efe1885f923cf4" Mar 12 17:26:28 crc kubenswrapper[5184]: I0312 17:26:28.619732 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5w6zj" Mar 12 17:26:28 crc kubenswrapper[5184]: I0312 17:26:28.701836 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5w6zj" Mar 12 17:26:28 crc kubenswrapper[5184]: I0312 17:26:28.859239 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5w6zj"] Mar 12 17:26:29 crc kubenswrapper[5184]: I0312 17:26:29.693151 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5w6zj" podUID="ba8d63ef-a229-4fb9-b254-3e0e994cfa01" containerName="registry-server" containerID="cri-o://dc1f7caa659c332d45e1dde983976f02856630693cddc811dda1ae7755619dd3" gracePeriod=2 Mar 12 17:26:30 crc kubenswrapper[5184]: I0312 17:26:30.170834 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5w6zj" Mar 12 17:26:30 crc kubenswrapper[5184]: I0312 17:26:30.216988 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba8d63ef-a229-4fb9-b254-3e0e994cfa01-catalog-content\") pod \"ba8d63ef-a229-4fb9-b254-3e0e994cfa01\" (UID: \"ba8d63ef-a229-4fb9-b254-3e0e994cfa01\") " Mar 12 17:26:30 crc kubenswrapper[5184]: I0312 17:26:30.217143 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba8d63ef-a229-4fb9-b254-3e0e994cfa01-utilities\") pod \"ba8d63ef-a229-4fb9-b254-3e0e994cfa01\" (UID: \"ba8d63ef-a229-4fb9-b254-3e0e994cfa01\") " Mar 12 17:26:30 crc kubenswrapper[5184]: I0312 17:26:30.217350 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhvkm\" (UniqueName: \"kubernetes.io/projected/ba8d63ef-a229-4fb9-b254-3e0e994cfa01-kube-api-access-jhvkm\") pod \"ba8d63ef-a229-4fb9-b254-3e0e994cfa01\" (UID: \"ba8d63ef-a229-4fb9-b254-3e0e994cfa01\") " Mar 12 17:26:30 crc kubenswrapper[5184]: I0312 17:26:30.218752 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba8d63ef-a229-4fb9-b254-3e0e994cfa01-utilities" (OuterVolumeSpecName: "utilities") pod "ba8d63ef-a229-4fb9-b254-3e0e994cfa01" (UID: "ba8d63ef-a229-4fb9-b254-3e0e994cfa01"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:26:30 crc kubenswrapper[5184]: I0312 17:26:30.219687 5184 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba8d63ef-a229-4fb9-b254-3e0e994cfa01-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 17:26:30 crc kubenswrapper[5184]: I0312 17:26:30.230411 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba8d63ef-a229-4fb9-b254-3e0e994cfa01-kube-api-access-jhvkm" (OuterVolumeSpecName: "kube-api-access-jhvkm") pod "ba8d63ef-a229-4fb9-b254-3e0e994cfa01" (UID: "ba8d63ef-a229-4fb9-b254-3e0e994cfa01"). InnerVolumeSpecName "kube-api-access-jhvkm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:26:30 crc kubenswrapper[5184]: I0312 17:26:30.320917 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jhvkm\" (UniqueName: \"kubernetes.io/projected/ba8d63ef-a229-4fb9-b254-3e0e994cfa01-kube-api-access-jhvkm\") on node \"crc\" DevicePath \"\"" Mar 12 17:26:30 crc kubenswrapper[5184]: I0312 17:26:30.364188 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba8d63ef-a229-4fb9-b254-3e0e994cfa01-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ba8d63ef-a229-4fb9-b254-3e0e994cfa01" (UID: "ba8d63ef-a229-4fb9-b254-3e0e994cfa01"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 17:26:30 crc kubenswrapper[5184]: I0312 17:26:30.422259 5184 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba8d63ef-a229-4fb9-b254-3e0e994cfa01-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 17:26:30 crc kubenswrapper[5184]: E0312 17:26:30.441981 5184 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba8d63ef_a229_4fb9_b254_3e0e994cfa01.slice/crio-69ba9238733e3ec8870f7eb5745771e3067d13b18062993002601aaf77eb9199\": RecentStats: unable to find data in memory cache]" Mar 12 17:26:30 crc kubenswrapper[5184]: I0312 17:26:30.708787 5184 generic.go:358] "Generic (PLEG): container finished" podID="ba8d63ef-a229-4fb9-b254-3e0e994cfa01" containerID="dc1f7caa659c332d45e1dde983976f02856630693cddc811dda1ae7755619dd3" exitCode=0 Mar 12 17:26:30 crc kubenswrapper[5184]: I0312 17:26:30.708847 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5w6zj" event={"ID":"ba8d63ef-a229-4fb9-b254-3e0e994cfa01","Type":"ContainerDied","Data":"dc1f7caa659c332d45e1dde983976f02856630693cddc811dda1ae7755619dd3"} Mar 12 17:26:30 crc kubenswrapper[5184]: I0312 17:26:30.708949 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5w6zj" event={"ID":"ba8d63ef-a229-4fb9-b254-3e0e994cfa01","Type":"ContainerDied","Data":"69ba9238733e3ec8870f7eb5745771e3067d13b18062993002601aaf77eb9199"} Mar 12 17:26:30 crc kubenswrapper[5184]: I0312 17:26:30.708970 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5w6zj" Mar 12 17:26:30 crc kubenswrapper[5184]: I0312 17:26:30.708981 5184 scope.go:117] "RemoveContainer" containerID="dc1f7caa659c332d45e1dde983976f02856630693cddc811dda1ae7755619dd3" Mar 12 17:26:30 crc kubenswrapper[5184]: I0312 17:26:30.740406 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5w6zj"] Mar 12 17:26:30 crc kubenswrapper[5184]: I0312 17:26:30.748278 5184 scope.go:117] "RemoveContainer" containerID="f12c356fd94f4d073e7db1796a8f016e1f54d4909896e1c90f34621d00e9c9cb" Mar 12 17:26:30 crc kubenswrapper[5184]: I0312 17:26:30.753547 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5w6zj"] Mar 12 17:26:30 crc kubenswrapper[5184]: I0312 17:26:30.776259 5184 scope.go:117] "RemoveContainer" containerID="3c81153509bae5a3f26ef40d3dec56d54632a7b98955a94e822cfd7efcef2805" Mar 12 17:26:30 crc kubenswrapper[5184]: I0312 17:26:30.836336 5184 scope.go:117] "RemoveContainer" containerID="dc1f7caa659c332d45e1dde983976f02856630693cddc811dda1ae7755619dd3" Mar 12 17:26:30 crc kubenswrapper[5184]: E0312 17:26:30.836834 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc1f7caa659c332d45e1dde983976f02856630693cddc811dda1ae7755619dd3\": container with ID starting with dc1f7caa659c332d45e1dde983976f02856630693cddc811dda1ae7755619dd3 not found: ID does not exist" containerID="dc1f7caa659c332d45e1dde983976f02856630693cddc811dda1ae7755619dd3" Mar 12 17:26:30 crc kubenswrapper[5184]: I0312 17:26:30.836881 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc1f7caa659c332d45e1dde983976f02856630693cddc811dda1ae7755619dd3"} err="failed to get container status \"dc1f7caa659c332d45e1dde983976f02856630693cddc811dda1ae7755619dd3\": rpc error: code = NotFound desc = could not find container 
\"dc1f7caa659c332d45e1dde983976f02856630693cddc811dda1ae7755619dd3\": container with ID starting with dc1f7caa659c332d45e1dde983976f02856630693cddc811dda1ae7755619dd3 not found: ID does not exist" Mar 12 17:26:30 crc kubenswrapper[5184]: I0312 17:26:30.836906 5184 scope.go:117] "RemoveContainer" containerID="f12c356fd94f4d073e7db1796a8f016e1f54d4909896e1c90f34621d00e9c9cb" Mar 12 17:26:30 crc kubenswrapper[5184]: E0312 17:26:30.837357 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f12c356fd94f4d073e7db1796a8f016e1f54d4909896e1c90f34621d00e9c9cb\": container with ID starting with f12c356fd94f4d073e7db1796a8f016e1f54d4909896e1c90f34621d00e9c9cb not found: ID does not exist" containerID="f12c356fd94f4d073e7db1796a8f016e1f54d4909896e1c90f34621d00e9c9cb" Mar 12 17:26:30 crc kubenswrapper[5184]: I0312 17:26:30.837506 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f12c356fd94f4d073e7db1796a8f016e1f54d4909896e1c90f34621d00e9c9cb"} err="failed to get container status \"f12c356fd94f4d073e7db1796a8f016e1f54d4909896e1c90f34621d00e9c9cb\": rpc error: code = NotFound desc = could not find container \"f12c356fd94f4d073e7db1796a8f016e1f54d4909896e1c90f34621d00e9c9cb\": container with ID starting with f12c356fd94f4d073e7db1796a8f016e1f54d4909896e1c90f34621d00e9c9cb not found: ID does not exist" Mar 12 17:26:30 crc kubenswrapper[5184]: I0312 17:26:30.837612 5184 scope.go:117] "RemoveContainer" containerID="3c81153509bae5a3f26ef40d3dec56d54632a7b98955a94e822cfd7efcef2805" Mar 12 17:26:30 crc kubenswrapper[5184]: E0312 17:26:30.838110 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c81153509bae5a3f26ef40d3dec56d54632a7b98955a94e822cfd7efcef2805\": container with ID starting with 3c81153509bae5a3f26ef40d3dec56d54632a7b98955a94e822cfd7efcef2805 not found: ID does not exist" 
containerID="3c81153509bae5a3f26ef40d3dec56d54632a7b98955a94e822cfd7efcef2805" Mar 12 17:26:30 crc kubenswrapper[5184]: I0312 17:26:30.838161 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c81153509bae5a3f26ef40d3dec56d54632a7b98955a94e822cfd7efcef2805"} err="failed to get container status \"3c81153509bae5a3f26ef40d3dec56d54632a7b98955a94e822cfd7efcef2805\": rpc error: code = NotFound desc = could not find container \"3c81153509bae5a3f26ef40d3dec56d54632a7b98955a94e822cfd7efcef2805\": container with ID starting with 3c81153509bae5a3f26ef40d3dec56d54632a7b98955a94e822cfd7efcef2805 not found: ID does not exist" Mar 12 17:26:32 crc kubenswrapper[5184]: I0312 17:26:32.413719 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba8d63ef-a229-4fb9-b254-3e0e994cfa01" path="/var/lib/kubelet/pods/ba8d63ef-a229-4fb9-b254-3e0e994cfa01/volumes" Mar 12 17:26:49 crc kubenswrapper[5184]: E0312 17:26:49.338207 5184 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1784490 actualBytes=10240 Mar 12 17:27:49 crc kubenswrapper[5184]: E0312 17:27:49.298333 5184 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1784493 actualBytes=10240 Mar 12 17:27:50 crc kubenswrapper[5184]: I0312 17:27:50.742841 5184 patch_prober.go:28] interesting pod/machine-config-daemon-cp7pt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 17:27:50 crc kubenswrapper[5184]: I0312 17:27:50.742925 5184 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" Mar 12 17:28:00 crc kubenswrapper[5184]: I0312 17:28:00.158183 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555608-g5s2x"] Mar 12 17:28:00 crc kubenswrapper[5184]: I0312 17:28:00.160244 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba8d63ef-a229-4fb9-b254-3e0e994cfa01" containerName="extract-content" Mar 12 17:28:00 crc kubenswrapper[5184]: I0312 17:28:00.160265 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba8d63ef-a229-4fb9-b254-3e0e994cfa01" containerName="extract-content" Mar 12 17:28:00 crc kubenswrapper[5184]: I0312 17:28:00.160285 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba8d63ef-a229-4fb9-b254-3e0e994cfa01" containerName="registry-server" Mar 12 17:28:00 crc kubenswrapper[5184]: I0312 17:28:00.160293 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba8d63ef-a229-4fb9-b254-3e0e994cfa01" containerName="registry-server" Mar 12 17:28:00 crc kubenswrapper[5184]: I0312 17:28:00.160334 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba8d63ef-a229-4fb9-b254-3e0e994cfa01" containerName="extract-utilities" Mar 12 17:28:00 crc kubenswrapper[5184]: I0312 17:28:00.160345 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba8d63ef-a229-4fb9-b254-3e0e994cfa01" containerName="extract-utilities" Mar 12 17:28:00 crc kubenswrapper[5184]: I0312 17:28:00.160593 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="ba8d63ef-a229-4fb9-b254-3e0e994cfa01" containerName="registry-server" Mar 12 17:28:00 crc kubenswrapper[5184]: I0312 17:28:00.168743 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555608-g5s2x"] Mar 12 17:28:00 crc kubenswrapper[5184]: I0312 17:28:00.168861 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555608-g5s2x" Mar 12 17:28:00 crc kubenswrapper[5184]: I0312 17:28:00.175017 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\"" Mar 12 17:28:00 crc kubenswrapper[5184]: I0312 17:28:00.176179 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-f4gpz\"" Mar 12 17:28:00 crc kubenswrapper[5184]: I0312 17:28:00.176259 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\"" Mar 12 17:28:00 crc kubenswrapper[5184]: I0312 17:28:00.208037 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq6td\" (UniqueName: \"kubernetes.io/projected/fd23cfd3-82c4-43ac-ae06-ba7c72ac9415-kube-api-access-lq6td\") pod \"auto-csr-approver-29555608-g5s2x\" (UID: \"fd23cfd3-82c4-43ac-ae06-ba7c72ac9415\") " pod="openshift-infra/auto-csr-approver-29555608-g5s2x" Mar 12 17:28:00 crc kubenswrapper[5184]: I0312 17:28:00.309514 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lq6td\" (UniqueName: \"kubernetes.io/projected/fd23cfd3-82c4-43ac-ae06-ba7c72ac9415-kube-api-access-lq6td\") pod \"auto-csr-approver-29555608-g5s2x\" (UID: \"fd23cfd3-82c4-43ac-ae06-ba7c72ac9415\") " pod="openshift-infra/auto-csr-approver-29555608-g5s2x" Mar 12 17:28:00 crc kubenswrapper[5184]: I0312 17:28:00.332684 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq6td\" (UniqueName: \"kubernetes.io/projected/fd23cfd3-82c4-43ac-ae06-ba7c72ac9415-kube-api-access-lq6td\") pod \"auto-csr-approver-29555608-g5s2x\" (UID: \"fd23cfd3-82c4-43ac-ae06-ba7c72ac9415\") " pod="openshift-infra/auto-csr-approver-29555608-g5s2x" Mar 12 17:28:00 crc kubenswrapper[5184]: I0312 17:28:00.491973 5184 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555608-g5s2x" Mar 12 17:28:00 crc kubenswrapper[5184]: I0312 17:28:00.982267 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555608-g5s2x"] Mar 12 17:28:02 crc kubenswrapper[5184]: I0312 17:28:02.022540 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555608-g5s2x" event={"ID":"fd23cfd3-82c4-43ac-ae06-ba7c72ac9415","Type":"ContainerStarted","Data":"1280778ee72c1451683e074fc91ed2cd1a377a492b77f24c26898eb1a186bec1"} Mar 12 17:28:03 crc kubenswrapper[5184]: I0312 17:28:03.037352 5184 generic.go:358] "Generic (PLEG): container finished" podID="fd23cfd3-82c4-43ac-ae06-ba7c72ac9415" containerID="edba1bde3b8f24f285cea2441010404c39b061b45bdd9d9e5eb2cb1f2e775a01" exitCode=0 Mar 12 17:28:03 crc kubenswrapper[5184]: I0312 17:28:03.037661 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555608-g5s2x" event={"ID":"fd23cfd3-82c4-43ac-ae06-ba7c72ac9415","Type":"ContainerDied","Data":"edba1bde3b8f24f285cea2441010404c39b061b45bdd9d9e5eb2cb1f2e775a01"} Mar 12 17:28:04 crc kubenswrapper[5184]: I0312 17:28:04.476810 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555608-g5s2x" Mar 12 17:28:04 crc kubenswrapper[5184]: I0312 17:28:04.496796 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq6td\" (UniqueName: \"kubernetes.io/projected/fd23cfd3-82c4-43ac-ae06-ba7c72ac9415-kube-api-access-lq6td\") pod \"fd23cfd3-82c4-43ac-ae06-ba7c72ac9415\" (UID: \"fd23cfd3-82c4-43ac-ae06-ba7c72ac9415\") " Mar 12 17:28:04 crc kubenswrapper[5184]: I0312 17:28:04.507639 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd23cfd3-82c4-43ac-ae06-ba7c72ac9415-kube-api-access-lq6td" (OuterVolumeSpecName: "kube-api-access-lq6td") pod "fd23cfd3-82c4-43ac-ae06-ba7c72ac9415" (UID: "fd23cfd3-82c4-43ac-ae06-ba7c72ac9415"). InnerVolumeSpecName "kube-api-access-lq6td". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:28:04 crc kubenswrapper[5184]: I0312 17:28:04.599495 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lq6td\" (UniqueName: \"kubernetes.io/projected/fd23cfd3-82c4-43ac-ae06-ba7c72ac9415-kube-api-access-lq6td\") on node \"crc\" DevicePath \"\"" Mar 12 17:28:05 crc kubenswrapper[5184]: I0312 17:28:05.058617 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555608-g5s2x" Mar 12 17:28:05 crc kubenswrapper[5184]: I0312 17:28:05.058704 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555608-g5s2x" event={"ID":"fd23cfd3-82c4-43ac-ae06-ba7c72ac9415","Type":"ContainerDied","Data":"1280778ee72c1451683e074fc91ed2cd1a377a492b77f24c26898eb1a186bec1"} Mar 12 17:28:05 crc kubenswrapper[5184]: I0312 17:28:05.058763 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1280778ee72c1451683e074fc91ed2cd1a377a492b77f24c26898eb1a186bec1" Mar 12 17:28:05 crc kubenswrapper[5184]: I0312 17:28:05.563901 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555602-tsg2b"] Mar 12 17:28:05 crc kubenswrapper[5184]: I0312 17:28:05.573882 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555602-tsg2b"] Mar 12 17:28:06 crc kubenswrapper[5184]: I0312 17:28:06.422101 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0a50cab-4eef-4f28-a0b3-4df157d40732" path="/var/lib/kubelet/pods/d0a50cab-4eef-4f28-a0b3-4df157d40732/volumes" Mar 12 17:28:20 crc kubenswrapper[5184]: I0312 17:28:20.741942 5184 patch_prober.go:28] interesting pod/machine-config-daemon-cp7pt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 17:28:20 crc kubenswrapper[5184]: I0312 17:28:20.742663 5184 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 17:28:21 crc 
kubenswrapper[5184]: I0312 17:28:21.384292 5184 scope.go:117] "RemoveContainer" containerID="5e073420f15bb5280368407b650d7ff45e5c46a435a13acc813dbc396ce93ffd" Mar 12 17:28:49 crc kubenswrapper[5184]: E0312 17:28:49.360316 5184 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1784474 actualBytes=10240 Mar 12 17:28:50 crc kubenswrapper[5184]: I0312 17:28:50.742296 5184 patch_prober.go:28] interesting pod/machine-config-daemon-cp7pt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 17:28:50 crc kubenswrapper[5184]: I0312 17:28:50.742498 5184 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 17:28:50 crc kubenswrapper[5184]: I0312 17:28:50.742601 5184 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" Mar 12 17:28:50 crc kubenswrapper[5184]: I0312 17:28:50.743685 5184 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cec56539410b4ac15a425e741142d090d0bd99c0ef83dac5cd114a2334674d33"} pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 17:28:50 crc kubenswrapper[5184]: I0312 17:28:50.743806 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" podUID="7b45c859-3d05-4214-9bd3-2952546f5dea" 
containerName="machine-config-daemon" containerID="cri-o://cec56539410b4ac15a425e741142d090d0bd99c0ef83dac5cd114a2334674d33" gracePeriod=600 Mar 12 17:28:51 crc kubenswrapper[5184]: I0312 17:28:51.028636 5184 generic.go:358] "Generic (PLEG): container finished" podID="7b45c859-3d05-4214-9bd3-2952546f5dea" containerID="cec56539410b4ac15a425e741142d090d0bd99c0ef83dac5cd114a2334674d33" exitCode=0 Mar 12 17:28:51 crc kubenswrapper[5184]: I0312 17:28:51.028700 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" event={"ID":"7b45c859-3d05-4214-9bd3-2952546f5dea","Type":"ContainerDied","Data":"cec56539410b4ac15a425e741142d090d0bd99c0ef83dac5cd114a2334674d33"} Mar 12 17:28:51 crc kubenswrapper[5184]: I0312 17:28:51.029181 5184 scope.go:117] "RemoveContainer" containerID="003716e1434d36a7e89bad17d4bfd64463f69f9907a5c9319c56e5b94d17d924" Mar 12 17:28:52 crc kubenswrapper[5184]: I0312 17:28:52.048321 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cp7pt" event={"ID":"7b45c859-3d05-4214-9bd3-2952546f5dea","Type":"ContainerStarted","Data":"c3dff867f08ca3c7327b13981333f190876162c68a742778c7bfd764532374bd"} Mar 12 17:29:49 crc kubenswrapper[5184]: E0312 17:29:49.120819 5184 prober.go:256] "Unable to write all bytes from execInContainer" err="short write" expectedBytes=1785799 actualBytes=10240 Mar 12 17:29:55 crc kubenswrapper[5184]: I0312 17:29:55.168785 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d9jgg"] Mar 12 17:29:55 crc kubenswrapper[5184]: I0312 17:29:55.175046 5184 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fd23cfd3-82c4-43ac-ae06-ba7c72ac9415" containerName="oc" Mar 12 17:29:55 crc kubenswrapper[5184]: I0312 17:29:55.175078 5184 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd23cfd3-82c4-43ac-ae06-ba7c72ac9415" containerName="oc" Mar 
12 17:29:55 crc kubenswrapper[5184]: I0312 17:29:55.175585 5184 memory_manager.go:356] "RemoveStaleState removing state" podUID="fd23cfd3-82c4-43ac-ae06-ba7c72ac9415" containerName="oc" Mar 12 17:29:55 crc kubenswrapper[5184]: I0312 17:29:55.194576 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d9jgg" Mar 12 17:29:55 crc kubenswrapper[5184]: I0312 17:29:55.216609 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d9jgg"] Mar 12 17:29:55 crc kubenswrapper[5184]: I0312 17:29:55.374747 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnz2d\" (UniqueName: \"kubernetes.io/projected/85e963c2-2a42-4dea-9e41-27ebc9ccc1cf-kube-api-access-mnz2d\") pod \"community-operators-d9jgg\" (UID: \"85e963c2-2a42-4dea-9e41-27ebc9ccc1cf\") " pod="openshift-marketplace/community-operators-d9jgg" Mar 12 17:29:55 crc kubenswrapper[5184]: I0312 17:29:55.374834 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85e963c2-2a42-4dea-9e41-27ebc9ccc1cf-utilities\") pod \"community-operators-d9jgg\" (UID: \"85e963c2-2a42-4dea-9e41-27ebc9ccc1cf\") " pod="openshift-marketplace/community-operators-d9jgg" Mar 12 17:29:55 crc kubenswrapper[5184]: I0312 17:29:55.375056 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85e963c2-2a42-4dea-9e41-27ebc9ccc1cf-catalog-content\") pod \"community-operators-d9jgg\" (UID: \"85e963c2-2a42-4dea-9e41-27ebc9ccc1cf\") " pod="openshift-marketplace/community-operators-d9jgg" Mar 12 17:29:55 crc kubenswrapper[5184]: I0312 17:29:55.476658 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mnz2d\" (UniqueName: 
\"kubernetes.io/projected/85e963c2-2a42-4dea-9e41-27ebc9ccc1cf-kube-api-access-mnz2d\") pod \"community-operators-d9jgg\" (UID: \"85e963c2-2a42-4dea-9e41-27ebc9ccc1cf\") " pod="openshift-marketplace/community-operators-d9jgg" Mar 12 17:29:55 crc kubenswrapper[5184]: I0312 17:29:55.476800 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85e963c2-2a42-4dea-9e41-27ebc9ccc1cf-utilities\") pod \"community-operators-d9jgg\" (UID: \"85e963c2-2a42-4dea-9e41-27ebc9ccc1cf\") " pod="openshift-marketplace/community-operators-d9jgg" Mar 12 17:29:55 crc kubenswrapper[5184]: I0312 17:29:55.477050 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85e963c2-2a42-4dea-9e41-27ebc9ccc1cf-catalog-content\") pod \"community-operators-d9jgg\" (UID: \"85e963c2-2a42-4dea-9e41-27ebc9ccc1cf\") " pod="openshift-marketplace/community-operators-d9jgg" Mar 12 17:29:55 crc kubenswrapper[5184]: I0312 17:29:55.477493 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85e963c2-2a42-4dea-9e41-27ebc9ccc1cf-utilities\") pod \"community-operators-d9jgg\" (UID: \"85e963c2-2a42-4dea-9e41-27ebc9ccc1cf\") " pod="openshift-marketplace/community-operators-d9jgg" Mar 12 17:29:55 crc kubenswrapper[5184]: I0312 17:29:55.477909 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85e963c2-2a42-4dea-9e41-27ebc9ccc1cf-catalog-content\") pod \"community-operators-d9jgg\" (UID: \"85e963c2-2a42-4dea-9e41-27ebc9ccc1cf\") " pod="openshift-marketplace/community-operators-d9jgg" Mar 12 17:29:55 crc kubenswrapper[5184]: I0312 17:29:55.498279 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnz2d\" (UniqueName: 
\"kubernetes.io/projected/85e963c2-2a42-4dea-9e41-27ebc9ccc1cf-kube-api-access-mnz2d\") pod \"community-operators-d9jgg\" (UID: \"85e963c2-2a42-4dea-9e41-27ebc9ccc1cf\") " pod="openshift-marketplace/community-operators-d9jgg" Mar 12 17:29:55 crc kubenswrapper[5184]: I0312 17:29:55.525364 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d9jgg" Mar 12 17:29:56 crc kubenswrapper[5184]: W0312 17:29:56.071018 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85e963c2_2a42_4dea_9e41_27ebc9ccc1cf.slice/crio-bd27ee36acb811e38cf71f44ac598f16b66d50d29f37bdd2414e0f62868a2f04 WatchSource:0}: Error finding container bd27ee36acb811e38cf71f44ac598f16b66d50d29f37bdd2414e0f62868a2f04: Status 404 returned error can't find the container with id bd27ee36acb811e38cf71f44ac598f16b66d50d29f37bdd2414e0f62868a2f04 Mar 12 17:29:56 crc kubenswrapper[5184]: I0312 17:29:56.072100 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d9jgg"] Mar 12 17:29:56 crc kubenswrapper[5184]: I0312 17:29:56.074658 5184 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 17:29:56 crc kubenswrapper[5184]: I0312 17:29:56.823479 5184 generic.go:358] "Generic (PLEG): container finished" podID="85e963c2-2a42-4dea-9e41-27ebc9ccc1cf" containerID="3be2bc82586d2c679b1947d688c8d90d4c8cca9792e3dff1606f2f7829f43565" exitCode=0 Mar 12 17:29:56 crc kubenswrapper[5184]: I0312 17:29:56.823570 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d9jgg" event={"ID":"85e963c2-2a42-4dea-9e41-27ebc9ccc1cf","Type":"ContainerDied","Data":"3be2bc82586d2c679b1947d688c8d90d4c8cca9792e3dff1606f2f7829f43565"} Mar 12 17:29:56 crc kubenswrapper[5184]: I0312 17:29:56.823895 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-d9jgg" event={"ID":"85e963c2-2a42-4dea-9e41-27ebc9ccc1cf","Type":"ContainerStarted","Data":"bd27ee36acb811e38cf71f44ac598f16b66d50d29f37bdd2414e0f62868a2f04"} Mar 12 17:29:58 crc kubenswrapper[5184]: I0312 17:29:58.843065 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d9jgg" event={"ID":"85e963c2-2a42-4dea-9e41-27ebc9ccc1cf","Type":"ContainerStarted","Data":"b288f719090d4d088ee04e0f943cf66a53addddc258bc853e23d2d57cbd066cd"} Mar 12 17:29:59 crc kubenswrapper[5184]: I0312 17:29:59.857773 5184 generic.go:358] "Generic (PLEG): container finished" podID="85e963c2-2a42-4dea-9e41-27ebc9ccc1cf" containerID="b288f719090d4d088ee04e0f943cf66a53addddc258bc853e23d2d57cbd066cd" exitCode=0 Mar 12 17:29:59 crc kubenswrapper[5184]: I0312 17:29:59.857859 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d9jgg" event={"ID":"85e963c2-2a42-4dea-9e41-27ebc9ccc1cf","Type":"ContainerDied","Data":"b288f719090d4d088ee04e0f943cf66a53addddc258bc853e23d2d57cbd066cd"} Mar 12 17:30:00 crc kubenswrapper[5184]: I0312 17:30:00.171947 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555610-4dpbv"] Mar 12 17:30:00 crc kubenswrapper[5184]: I0312 17:30:00.182276 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555610-4dpbv" Mar 12 17:30:00 crc kubenswrapper[5184]: I0312 17:30:00.189121 5184 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555610-p9hz4"] Mar 12 17:30:00 crc kubenswrapper[5184]: I0312 17:30:00.193169 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-config\"" Mar 12 17:30:00 crc kubenswrapper[5184]: I0312 17:30:00.193284 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-dockercfg-vfqp6\"" Mar 12 17:30:00 crc kubenswrapper[5184]: I0312 17:30:00.201828 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555610-p9hz4" Mar 12 17:30:00 crc kubenswrapper[5184]: I0312 17:30:00.204747 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\"" Mar 12 17:30:00 crc kubenswrapper[5184]: I0312 17:30:00.205688 5184 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-f4gpz\"" Mar 12 17:30:00 crc kubenswrapper[5184]: I0312 17:30:00.207354 5184 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\"" Mar 12 17:30:00 crc kubenswrapper[5184]: I0312 17:30:00.209668 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555610-4dpbv"] Mar 12 17:30:00 crc kubenswrapper[5184]: I0312 17:30:00.220298 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555610-p9hz4"] Mar 12 17:30:00 crc kubenswrapper[5184]: I0312 17:30:00.286779 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-qfvp2\" (UniqueName: \"kubernetes.io/projected/8f5cd288-09e4-4528-bb97-9ffee9c21ce0-kube-api-access-qfvp2\") pod \"auto-csr-approver-29555610-p9hz4\" (UID: \"8f5cd288-09e4-4528-bb97-9ffee9c21ce0\") " pod="openshift-infra/auto-csr-approver-29555610-p9hz4" Mar 12 17:30:00 crc kubenswrapper[5184]: I0312 17:30:00.286828 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cf7c4d56-a5e5-462c-9a87-b058d628ae73-secret-volume\") pod \"collect-profiles-29555610-4dpbv\" (UID: \"cf7c4d56-a5e5-462c-9a87-b058d628ae73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555610-4dpbv" Mar 12 17:30:00 crc kubenswrapper[5184]: I0312 17:30:00.286884 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf7c4d56-a5e5-462c-9a87-b058d628ae73-config-volume\") pod \"collect-profiles-29555610-4dpbv\" (UID: \"cf7c4d56-a5e5-462c-9a87-b058d628ae73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555610-4dpbv" Mar 12 17:30:00 crc kubenswrapper[5184]: I0312 17:30:00.287034 5184 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8sxc\" (UniqueName: \"kubernetes.io/projected/cf7c4d56-a5e5-462c-9a87-b058d628ae73-kube-api-access-n8sxc\") pod \"collect-profiles-29555610-4dpbv\" (UID: \"cf7c4d56-a5e5-462c-9a87-b058d628ae73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555610-4dpbv" Mar 12 17:30:00 crc kubenswrapper[5184]: I0312 17:30:00.393451 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cf7c4d56-a5e5-462c-9a87-b058d628ae73-secret-volume\") pod \"collect-profiles-29555610-4dpbv\" (UID: \"cf7c4d56-a5e5-462c-9a87-b058d628ae73\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29555610-4dpbv" Mar 12 17:30:00 crc kubenswrapper[5184]: I0312 17:30:00.411896 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf7c4d56-a5e5-462c-9a87-b058d628ae73-config-volume\") pod \"collect-profiles-29555610-4dpbv\" (UID: \"cf7c4d56-a5e5-462c-9a87-b058d628ae73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555610-4dpbv" Mar 12 17:30:00 crc kubenswrapper[5184]: I0312 17:30:00.412559 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n8sxc\" (UniqueName: \"kubernetes.io/projected/cf7c4d56-a5e5-462c-9a87-b058d628ae73-kube-api-access-n8sxc\") pod \"collect-profiles-29555610-4dpbv\" (UID: \"cf7c4d56-a5e5-462c-9a87-b058d628ae73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555610-4dpbv" Mar 12 17:30:00 crc kubenswrapper[5184]: I0312 17:30:00.412675 5184 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qfvp2\" (UniqueName: \"kubernetes.io/projected/8f5cd288-09e4-4528-bb97-9ffee9c21ce0-kube-api-access-qfvp2\") pod \"auto-csr-approver-29555610-p9hz4\" (UID: \"8f5cd288-09e4-4528-bb97-9ffee9c21ce0\") " pod="openshift-infra/auto-csr-approver-29555610-p9hz4" Mar 12 17:30:00 crc kubenswrapper[5184]: I0312 17:30:00.431030 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf7c4d56-a5e5-462c-9a87-b058d628ae73-config-volume\") pod \"collect-profiles-29555610-4dpbv\" (UID: \"cf7c4d56-a5e5-462c-9a87-b058d628ae73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555610-4dpbv" Mar 12 17:30:00 crc kubenswrapper[5184]: I0312 17:30:00.436540 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/cf7c4d56-a5e5-462c-9a87-b058d628ae73-secret-volume\") pod \"collect-profiles-29555610-4dpbv\" (UID: \"cf7c4d56-a5e5-462c-9a87-b058d628ae73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555610-4dpbv" Mar 12 17:30:00 crc kubenswrapper[5184]: I0312 17:30:00.436767 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfvp2\" (UniqueName: \"kubernetes.io/projected/8f5cd288-09e4-4528-bb97-9ffee9c21ce0-kube-api-access-qfvp2\") pod \"auto-csr-approver-29555610-p9hz4\" (UID: \"8f5cd288-09e4-4528-bb97-9ffee9c21ce0\") " pod="openshift-infra/auto-csr-approver-29555610-p9hz4" Mar 12 17:30:00 crc kubenswrapper[5184]: I0312 17:30:00.444180 5184 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8sxc\" (UniqueName: \"kubernetes.io/projected/cf7c4d56-a5e5-462c-9a87-b058d628ae73-kube-api-access-n8sxc\") pod \"collect-profiles-29555610-4dpbv\" (UID: \"cf7c4d56-a5e5-462c-9a87-b058d628ae73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555610-4dpbv" Mar 12 17:30:00 crc kubenswrapper[5184]: I0312 17:30:00.572648 5184 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555610-4dpbv" Mar 12 17:30:00 crc kubenswrapper[5184]: I0312 17:30:00.583630 5184 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555610-p9hz4" Mar 12 17:30:00 crc kubenswrapper[5184]: I0312 17:30:00.872830 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d9jgg" event={"ID":"85e963c2-2a42-4dea-9e41-27ebc9ccc1cf","Type":"ContainerStarted","Data":"21d772aba89f1b6b0159527c1d33f561e436a2937d623b9e01b95386a4916d2f"} Mar 12 17:30:01 crc kubenswrapper[5184]: I0312 17:30:01.040509 5184 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-d9jgg" podStartSLOduration=5.019425508 podStartE2EDuration="6.040489807s" podCreationTimestamp="2026-03-12 17:29:55 +0000 UTC" firstStartedPulling="2026-03-12 17:29:56.824738775 +0000 UTC m=+2339.366050114" lastFinishedPulling="2026-03-12 17:29:57.845803064 +0000 UTC m=+2340.387114413" observedRunningTime="2026-03-12 17:30:00.893931011 +0000 UTC m=+2343.435242350" watchObservedRunningTime="2026-03-12 17:30:01.040489807 +0000 UTC m=+2343.581801146" Mar 12 17:30:01 crc kubenswrapper[5184]: I0312 17:30:01.041042 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555610-p9hz4"] Mar 12 17:30:01 crc kubenswrapper[5184]: I0312 17:30:01.134685 5184 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555610-4dpbv"] Mar 12 17:30:01 crc kubenswrapper[5184]: W0312 17:30:01.140982 5184 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf7c4d56_a5e5_462c_9a87_b058d628ae73.slice/crio-b3988329122a375c08e00f542a862d0d93731126d99b118637ba8f85fafe5d63 WatchSource:0}: Error finding container b3988329122a375c08e00f542a862d0d93731126d99b118637ba8f85fafe5d63: Status 404 returned error can't find the container with id b3988329122a375c08e00f542a862d0d93731126d99b118637ba8f85fafe5d63 Mar 12 17:30:01 crc kubenswrapper[5184]: I0312 
17:30:01.892368 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555610-p9hz4" event={"ID":"8f5cd288-09e4-4528-bb97-9ffee9c21ce0","Type":"ContainerStarted","Data":"bb6c5e399b62c5ea7fbbc13f108c113b883f0fce523fcbacc3191c9b07e9fbc2"} Mar 12 17:30:01 crc kubenswrapper[5184]: I0312 17:30:01.901222 5184 generic.go:358] "Generic (PLEG): container finished" podID="cf7c4d56-a5e5-462c-9a87-b058d628ae73" containerID="f40e87967ce2c6b59c5353590fca0c0d4de42f72a495925d6e42fb93d7ccb30b" exitCode=0 Mar 12 17:30:01 crc kubenswrapper[5184]: I0312 17:30:01.902495 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555610-4dpbv" event={"ID":"cf7c4d56-a5e5-462c-9a87-b058d628ae73","Type":"ContainerDied","Data":"f40e87967ce2c6b59c5353590fca0c0d4de42f72a495925d6e42fb93d7ccb30b"} Mar 12 17:30:01 crc kubenswrapper[5184]: I0312 17:30:01.902576 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555610-4dpbv" event={"ID":"cf7c4d56-a5e5-462c-9a87-b058d628ae73","Type":"ContainerStarted","Data":"b3988329122a375c08e00f542a862d0d93731126d99b118637ba8f85fafe5d63"} Mar 12 17:30:03 crc kubenswrapper[5184]: I0312 17:30:03.293318 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555610-4dpbv" Mar 12 17:30:03 crc kubenswrapper[5184]: I0312 17:30:03.371993 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf7c4d56-a5e5-462c-9a87-b058d628ae73-config-volume\") pod \"cf7c4d56-a5e5-462c-9a87-b058d628ae73\" (UID: \"cf7c4d56-a5e5-462c-9a87-b058d628ae73\") " Mar 12 17:30:03 crc kubenswrapper[5184]: I0312 17:30:03.372071 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8sxc\" (UniqueName: \"kubernetes.io/projected/cf7c4d56-a5e5-462c-9a87-b058d628ae73-kube-api-access-n8sxc\") pod \"cf7c4d56-a5e5-462c-9a87-b058d628ae73\" (UID: \"cf7c4d56-a5e5-462c-9a87-b058d628ae73\") " Mar 12 17:30:03 crc kubenswrapper[5184]: I0312 17:30:03.372119 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cf7c4d56-a5e5-462c-9a87-b058d628ae73-secret-volume\") pod \"cf7c4d56-a5e5-462c-9a87-b058d628ae73\" (UID: \"cf7c4d56-a5e5-462c-9a87-b058d628ae73\") " Mar 12 17:30:03 crc kubenswrapper[5184]: I0312 17:30:03.372915 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf7c4d56-a5e5-462c-9a87-b058d628ae73-config-volume" (OuterVolumeSpecName: "config-volume") pod "cf7c4d56-a5e5-462c-9a87-b058d628ae73" (UID: "cf7c4d56-a5e5-462c-9a87-b058d628ae73"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 17:30:03 crc kubenswrapper[5184]: I0312 17:30:03.381609 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf7c4d56-a5e5-462c-9a87-b058d628ae73-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "cf7c4d56-a5e5-462c-9a87-b058d628ae73" (UID: "cf7c4d56-a5e5-462c-9a87-b058d628ae73"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 17:30:03 crc kubenswrapper[5184]: I0312 17:30:03.381670 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf7c4d56-a5e5-462c-9a87-b058d628ae73-kube-api-access-n8sxc" (OuterVolumeSpecName: "kube-api-access-n8sxc") pod "cf7c4d56-a5e5-462c-9a87-b058d628ae73" (UID: "cf7c4d56-a5e5-462c-9a87-b058d628ae73"). InnerVolumeSpecName "kube-api-access-n8sxc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:30:03 crc kubenswrapper[5184]: I0312 17:30:03.474417 5184 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf7c4d56-a5e5-462c-9a87-b058d628ae73-config-volume\") on node \"crc\" DevicePath \"\"" Mar 12 17:30:03 crc kubenswrapper[5184]: I0312 17:30:03.474454 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n8sxc\" (UniqueName: \"kubernetes.io/projected/cf7c4d56-a5e5-462c-9a87-b058d628ae73-kube-api-access-n8sxc\") on node \"crc\" DevicePath \"\"" Mar 12 17:30:03 crc kubenswrapper[5184]: I0312 17:30:03.474469 5184 reconciler_common.go:299] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cf7c4d56-a5e5-462c-9a87-b058d628ae73-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 12 17:30:03 crc kubenswrapper[5184]: I0312 17:30:03.918859 5184 generic.go:358] "Generic (PLEG): container finished" podID="8f5cd288-09e4-4528-bb97-9ffee9c21ce0" containerID="560d93d9c2e33ad62d1652cbdd2f06ca2021f408a1f6851960e1effa748e5669" exitCode=0 Mar 12 17:30:03 crc kubenswrapper[5184]: I0312 17:30:03.919132 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555610-p9hz4" event={"ID":"8f5cd288-09e4-4528-bb97-9ffee9c21ce0","Type":"ContainerDied","Data":"560d93d9c2e33ad62d1652cbdd2f06ca2021f408a1f6851960e1effa748e5669"} Mar 12 17:30:03 crc kubenswrapper[5184]: I0312 17:30:03.920991 5184 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555610-4dpbv" Mar 12 17:30:03 crc kubenswrapper[5184]: I0312 17:30:03.921134 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555610-4dpbv" event={"ID":"cf7c4d56-a5e5-462c-9a87-b058d628ae73","Type":"ContainerDied","Data":"b3988329122a375c08e00f542a862d0d93731126d99b118637ba8f85fafe5d63"} Mar 12 17:30:03 crc kubenswrapper[5184]: I0312 17:30:03.921164 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3988329122a375c08e00f542a862d0d93731126d99b118637ba8f85fafe5d63" Mar 12 17:30:04 crc kubenswrapper[5184]: I0312 17:30:04.379885 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555565-ms5vz"] Mar 12 17:30:04 crc kubenswrapper[5184]: I0312 17:30:04.389899 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555565-ms5vz"] Mar 12 17:30:04 crc kubenswrapper[5184]: I0312 17:30:04.410747 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ea329fb-a095-4645-b64f-a5769efa6364" path="/var/lib/kubelet/pods/8ea329fb-a095-4645-b64f-a5769efa6364/volumes" Mar 12 17:30:05 crc kubenswrapper[5184]: I0312 17:30:05.350445 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555610-p9hz4" Mar 12 17:30:05 crc kubenswrapper[5184]: I0312 17:30:05.525793 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfvp2\" (UniqueName: \"kubernetes.io/projected/8f5cd288-09e4-4528-bb97-9ffee9c21ce0-kube-api-access-qfvp2\") pod \"8f5cd288-09e4-4528-bb97-9ffee9c21ce0\" (UID: \"8f5cd288-09e4-4528-bb97-9ffee9c21ce0\") " Mar 12 17:30:05 crc kubenswrapper[5184]: I0312 17:30:05.528068 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-d9jgg" Mar 12 17:30:05 crc kubenswrapper[5184]: I0312 17:30:05.528124 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/community-operators-d9jgg" Mar 12 17:30:05 crc kubenswrapper[5184]: I0312 17:30:05.539287 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f5cd288-09e4-4528-bb97-9ffee9c21ce0-kube-api-access-qfvp2" (OuterVolumeSpecName: "kube-api-access-qfvp2") pod "8f5cd288-09e4-4528-bb97-9ffee9c21ce0" (UID: "8f5cd288-09e4-4528-bb97-9ffee9c21ce0"). InnerVolumeSpecName "kube-api-access-qfvp2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 17:30:05 crc kubenswrapper[5184]: I0312 17:30:05.588004 5184 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d9jgg" Mar 12 17:30:05 crc kubenswrapper[5184]: I0312 17:30:05.633119 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qfvp2\" (UniqueName: \"kubernetes.io/projected/8f5cd288-09e4-4528-bb97-9ffee9c21ce0-kube-api-access-qfvp2\") on node \"crc\" DevicePath \"\"" Mar 12 17:30:05 crc kubenswrapper[5184]: I0312 17:30:05.949809 5184 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555610-p9hz4"
Mar 12 17:30:05 crc kubenswrapper[5184]: I0312 17:30:05.955305 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555610-p9hz4" event={"ID":"8f5cd288-09e4-4528-bb97-9ffee9c21ce0","Type":"ContainerDied","Data":"bb6c5e399b62c5ea7fbbc13f108c113b883f0fce523fcbacc3191c9b07e9fbc2"}
Mar 12 17:30:05 crc kubenswrapper[5184]: I0312 17:30:05.955347 5184 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb6c5e399b62c5ea7fbbc13f108c113b883f0fce523fcbacc3191c9b07e9fbc2"
Mar 12 17:30:06 crc kubenswrapper[5184]: I0312 17:30:06.032935 5184 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d9jgg"
Mar 12 17:30:06 crc kubenswrapper[5184]: I0312 17:30:06.101178 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d9jgg"]
Mar 12 17:30:06 crc kubenswrapper[5184]: I0312 17:30:06.413715 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555604-wgwvj"]
Mar 12 17:30:06 crc kubenswrapper[5184]: I0312 17:30:06.422613 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555604-wgwvj"]
Mar 12 17:30:07 crc kubenswrapper[5184]: I0312 17:30:07.973612 5184 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/community-operators-d9jgg" podUID="85e963c2-2a42-4dea-9e41-27ebc9ccc1cf" containerName="registry-server" containerID="cri-o://21d772aba89f1b6b0159527c1d33f561e436a2937d623b9e01b95386a4916d2f" gracePeriod=2
Mar 12 17:30:08 crc kubenswrapper[5184]: I0312 17:30:08.412648 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fd3f0a0-1205-482a-ba6f-c1e753d612f1" path="/var/lib/kubelet/pods/7fd3f0a0-1205-482a-ba6f-c1e753d612f1/volumes"
Mar 12 17:30:08 crc kubenswrapper[5184]: I0312 17:30:08.468304 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d9jgg"
Mar 12 17:30:08 crc kubenswrapper[5184]: I0312 17:30:08.492804 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnz2d\" (UniqueName: \"kubernetes.io/projected/85e963c2-2a42-4dea-9e41-27ebc9ccc1cf-kube-api-access-mnz2d\") pod \"85e963c2-2a42-4dea-9e41-27ebc9ccc1cf\" (UID: \"85e963c2-2a42-4dea-9e41-27ebc9ccc1cf\") "
Mar 12 17:30:08 crc kubenswrapper[5184]: I0312 17:30:08.493766 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85e963c2-2a42-4dea-9e41-27ebc9ccc1cf-catalog-content\") pod \"85e963c2-2a42-4dea-9e41-27ebc9ccc1cf\" (UID: \"85e963c2-2a42-4dea-9e41-27ebc9ccc1cf\") "
Mar 12 17:30:08 crc kubenswrapper[5184]: I0312 17:30:08.496581 5184 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85e963c2-2a42-4dea-9e41-27ebc9ccc1cf-utilities\") pod \"85e963c2-2a42-4dea-9e41-27ebc9ccc1cf\" (UID: \"85e963c2-2a42-4dea-9e41-27ebc9ccc1cf\") "
Mar 12 17:30:08 crc kubenswrapper[5184]: I0312 17:30:08.498055 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85e963c2-2a42-4dea-9e41-27ebc9ccc1cf-utilities" (OuterVolumeSpecName: "utilities") pod "85e963c2-2a42-4dea-9e41-27ebc9ccc1cf" (UID: "85e963c2-2a42-4dea-9e41-27ebc9ccc1cf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 12 17:30:08 crc kubenswrapper[5184]: I0312 17:30:08.503129 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85e963c2-2a42-4dea-9e41-27ebc9ccc1cf-kube-api-access-mnz2d" (OuterVolumeSpecName: "kube-api-access-mnz2d") pod "85e963c2-2a42-4dea-9e41-27ebc9ccc1cf" (UID: "85e963c2-2a42-4dea-9e41-27ebc9ccc1cf"). InnerVolumeSpecName "kube-api-access-mnz2d". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 17:30:08 crc kubenswrapper[5184]: I0312 17:30:08.552520 5184 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85e963c2-2a42-4dea-9e41-27ebc9ccc1cf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "85e963c2-2a42-4dea-9e41-27ebc9ccc1cf" (UID: "85e963c2-2a42-4dea-9e41-27ebc9ccc1cf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 12 17:30:08 crc kubenswrapper[5184]: I0312 17:30:08.599230 5184 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85e963c2-2a42-4dea-9e41-27ebc9ccc1cf-utilities\") on node \"crc\" DevicePath \"\""
Mar 12 17:30:08 crc kubenswrapper[5184]: I0312 17:30:08.599303 5184 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mnz2d\" (UniqueName: \"kubernetes.io/projected/85e963c2-2a42-4dea-9e41-27ebc9ccc1cf-kube-api-access-mnz2d\") on node \"crc\" DevicePath \"\""
Mar 12 17:30:08 crc kubenswrapper[5184]: I0312 17:30:08.599316 5184 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85e963c2-2a42-4dea-9e41-27ebc9ccc1cf-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 12 17:30:08 crc kubenswrapper[5184]: I0312 17:30:08.987449 5184 generic.go:358] "Generic (PLEG): container finished" podID="85e963c2-2a42-4dea-9e41-27ebc9ccc1cf" containerID="21d772aba89f1b6b0159527c1d33f561e436a2937d623b9e01b95386a4916d2f" exitCode=0
Mar 12 17:30:08 crc kubenswrapper[5184]: I0312 17:30:08.987581 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d9jgg" event={"ID":"85e963c2-2a42-4dea-9e41-27ebc9ccc1cf","Type":"ContainerDied","Data":"21d772aba89f1b6b0159527c1d33f561e436a2937d623b9e01b95386a4916d2f"}
Mar 12 17:30:08 crc kubenswrapper[5184]: I0312 17:30:08.987636 5184 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d9jgg"
Mar 12 17:30:08 crc kubenswrapper[5184]: I0312 17:30:08.987666 5184 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d9jgg" event={"ID":"85e963c2-2a42-4dea-9e41-27ebc9ccc1cf","Type":"ContainerDied","Data":"bd27ee36acb811e38cf71f44ac598f16b66d50d29f37bdd2414e0f62868a2f04"}
Mar 12 17:30:08 crc kubenswrapper[5184]: I0312 17:30:08.987698 5184 scope.go:117] "RemoveContainer" containerID="21d772aba89f1b6b0159527c1d33f561e436a2937d623b9e01b95386a4916d2f"
Mar 12 17:30:09 crc kubenswrapper[5184]: I0312 17:30:09.012428 5184 scope.go:117] "RemoveContainer" containerID="b288f719090d4d088ee04e0f943cf66a53addddc258bc853e23d2d57cbd066cd"
Mar 12 17:30:09 crc kubenswrapper[5184]: I0312 17:30:09.031810 5184 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d9jgg"]
Mar 12 17:30:09 crc kubenswrapper[5184]: I0312 17:30:09.042431 5184 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-d9jgg"]
Mar 12 17:30:09 crc kubenswrapper[5184]: I0312 17:30:09.053983 5184 scope.go:117] "RemoveContainer" containerID="3be2bc82586d2c679b1947d688c8d90d4c8cca9792e3dff1606f2f7829f43565"
Mar 12 17:30:09 crc kubenswrapper[5184]: I0312 17:30:09.090950 5184 scope.go:117] "RemoveContainer" containerID="21d772aba89f1b6b0159527c1d33f561e436a2937d623b9e01b95386a4916d2f"
Mar 12 17:30:09 crc kubenswrapper[5184]: E0312 17:30:09.091460 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21d772aba89f1b6b0159527c1d33f561e436a2937d623b9e01b95386a4916d2f\": container with ID starting with 21d772aba89f1b6b0159527c1d33f561e436a2937d623b9e01b95386a4916d2f not found: ID does not exist" containerID="21d772aba89f1b6b0159527c1d33f561e436a2937d623b9e01b95386a4916d2f"
Mar 12 17:30:09 crc kubenswrapper[5184]: I0312 17:30:09.091503 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21d772aba89f1b6b0159527c1d33f561e436a2937d623b9e01b95386a4916d2f"} err="failed to get container status \"21d772aba89f1b6b0159527c1d33f561e436a2937d623b9e01b95386a4916d2f\": rpc error: code = NotFound desc = could not find container \"21d772aba89f1b6b0159527c1d33f561e436a2937d623b9e01b95386a4916d2f\": container with ID starting with 21d772aba89f1b6b0159527c1d33f561e436a2937d623b9e01b95386a4916d2f not found: ID does not exist"
Mar 12 17:30:09 crc kubenswrapper[5184]: I0312 17:30:09.091525 5184 scope.go:117] "RemoveContainer" containerID="b288f719090d4d088ee04e0f943cf66a53addddc258bc853e23d2d57cbd066cd"
Mar 12 17:30:09 crc kubenswrapper[5184]: E0312 17:30:09.091930 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b288f719090d4d088ee04e0f943cf66a53addddc258bc853e23d2d57cbd066cd\": container with ID starting with b288f719090d4d088ee04e0f943cf66a53addddc258bc853e23d2d57cbd066cd not found: ID does not exist" containerID="b288f719090d4d088ee04e0f943cf66a53addddc258bc853e23d2d57cbd066cd"
Mar 12 17:30:09 crc kubenswrapper[5184]: I0312 17:30:09.091955 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b288f719090d4d088ee04e0f943cf66a53addddc258bc853e23d2d57cbd066cd"} err="failed to get container status \"b288f719090d4d088ee04e0f943cf66a53addddc258bc853e23d2d57cbd066cd\": rpc error: code = NotFound desc = could not find container \"b288f719090d4d088ee04e0f943cf66a53addddc258bc853e23d2d57cbd066cd\": container with ID starting with b288f719090d4d088ee04e0f943cf66a53addddc258bc853e23d2d57cbd066cd not found: ID does not exist"
Mar 12 17:30:09 crc kubenswrapper[5184]: I0312 17:30:09.091973 5184 scope.go:117] "RemoveContainer" containerID="3be2bc82586d2c679b1947d688c8d90d4c8cca9792e3dff1606f2f7829f43565"
Mar 12 17:30:09 crc kubenswrapper[5184]: E0312 17:30:09.092172 5184 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3be2bc82586d2c679b1947d688c8d90d4c8cca9792e3dff1606f2f7829f43565\": container with ID starting with 3be2bc82586d2c679b1947d688c8d90d4c8cca9792e3dff1606f2f7829f43565 not found: ID does not exist" containerID="3be2bc82586d2c679b1947d688c8d90d4c8cca9792e3dff1606f2f7829f43565"
Mar 12 17:30:09 crc kubenswrapper[5184]: I0312 17:30:09.092200 5184 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3be2bc82586d2c679b1947d688c8d90d4c8cca9792e3dff1606f2f7829f43565"} err="failed to get container status \"3be2bc82586d2c679b1947d688c8d90d4c8cca9792e3dff1606f2f7829f43565\": rpc error: code = NotFound desc = could not find container \"3be2bc82586d2c679b1947d688c8d90d4c8cca9792e3dff1606f2f7829f43565\": container with ID starting with 3be2bc82586d2c679b1947d688c8d90d4c8cca9792e3dff1606f2f7829f43565 not found: ID does not exist"
Mar 12 17:30:10 crc kubenswrapper[5184]: I0312 17:30:10.413486 5184 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85e963c2-2a42-4dea-9e41-27ebc9ccc1cf" path="/var/lib/kubelet/pods/85e963c2-2a42-4dea-9e41-27ebc9ccc1cf/volumes"